<img alt="" src="https://secure.perk0mean.com/184386.png" style="display:none;">

Can the Cloud be a GxP-compliant Solution for the Pharma Industry?

Expert Interview

Introduction

Past issues of PM QM have dealt with the topic of embedding a cloud-based solution in the pharmaceutical environment. Many questions have arisen, especially in this highly regulated industry, as to the appropriateness and compliance of a cloud-based approach. Common cloud models have been presented briefly and clearly, and special use cases were also highlighted. The following expert interview provides brief, compact insights into the views and thinking of an XaaS service provider (X as a Service, where X can stand for one of the various cloud models), a pharmaceutical company as a customer, a cloud provider as a service provider, a representative of a German government agency, and, finally, an application provider.

 

Below are the three core roles, the key terms, and the definition of each.

      • Cloud provider: This is a provider of cloud-based solutions, such as Amazon, Microsoft, Google, etc., that provides the cloud infrastructure (hardware or software).

      • Application provider: This is the organization such as a company or a contract research organization (CRO) that provides the end-use customer with a cloud-based solution for an application, or operates it on behalf of the end-use customer. The application provider uses the services of a cloud provider for this purpose. For example, a traveler who is planning a trip uses the online travel portal Expedia. In this case, Expedia is the application provider that offers services to the traveler. Expedia itself uses the cloud provider Amazon Web Services for its customers.

      • Customer: Customers ultimately use the cloud-based solution for themselves or their company.

The expert interview that follows provides answers to general questions that are equally important to all stakeholders. These questions are not exhaustive, and the answers reflect the stakeholders' personal views, professional experience, and practice; they should therefore not be considered a generally applicable standard. This article provides a basis around which all stakeholders involved in a cloud implementation process can build their own systems. The questions selected represent a consensus among the five authors and, in their opinion, are the most important to ask.

 

The General Idea

This article examines the storage of data on the servers of external suppliers; two fundamental aspects of the issue must be distinguished.

On the one hand, storing your own data on the servers of various providers is already standard practice. However, this article does not go into detail about what kind of data is specifically involved; given the complexity of the matter, it remains rather general. Using the term “GMP data” would complicate matters, and such an article would exceed the scope of the journal. Very few companies are likely to store their batch records externally. With documents such as SOPs or guidance documents, however, we are already a bit further along today.

The second aspect, and probably the more interesting one, is the storage of data by third parties. A concrete example is perhaps the most useful and transparent way to explore the issue, so the topic is explained using one: data logger leasing – what happens to this data?

Based on this example, the questions are easier to understand and answer; the answers, however, should not be applied indiscriminately to all other areas.

 

Introduction

We are all familiar with the “cloud.” It has become an indispensable part of everyday life in most places and makes our personal lives easier. The days of worrying about how to get data from point A to point B, so that someone else can access it too, are in the past. As already explained in the previous article, the cloud is being used increasingly in industrial applications. This is a good thing, because in times of climate change, lean management, and globalization, a “rethink” is needed. Faster, more efficient, and climate-friendly are just a few of the important keywords. Still, the adoption and application of cloud-based solutions are potentially difficult in many areas. There are many reasons for this, but above all, there are concerns about compliance with guidelines, laws, and regulations, maintaining data integrity, data security, traceability, failsafe measures, etc. For this reason, all stakeholders in the supply chain, all the way up to the government agency, have come together to provide answers to the most important questions as a basic building block for this decision: Cloud, yes or no? If anything, the term cloud should be “avoided” here – technically, we are ultimately talking about a server and the relevant customer interface.

 

Who owns the data in the cloud? As a customer, which regulations do I have to pay particular attention to in contracts with the application provider?

AT: I would like to use the term RU (“regulated user,” which corresponds to the license holder – HE, IE, ...) in this context. According to Section 10(2) of the German Ordinance for the Manufacture of Medicinal Products and Active Pharmaceutical Ingredients (Arzneimittel- und Wirkstoffherstellungsverordnung, AMWHV), the RU is responsible for ensuring that, if records are made using electronic, photographic, or other data processing systems, the system is adequately validated. At a minimum, it must be ensured that the data are available for the duration of the retention period and can be made readable within a reasonable timeframe. The stored data must be protected against loss and damage. In addition, the RU must comply with the European General Data Protection Regulation (GDPR). Data that are confidential and/or contain personal data are particularly critical. This is relevant, for example, for clinical platforms or blood bank software deployed as SaaS using a cloud-based model. Territorial boundaries are, of course, irrelevant in cloud computing from a technical point of view. However, the place of processing is relevant for the applicable law.

When drafting service level agreements (SLAs) with the cloud service provider (CSP), the following aspects should be taken into account:

    • Scope of service, response times, availability

    • Server/data locations

    • Subcontractors, service providers of CSP

    • Certification, proof of compliance with security standards

    • Communication and contact partners

    • Ability and tools to monitor the service

    • Handling of security events

    • Regulations on the procedure for changes

    • Escape strategies, business continuity

    • Contractual penalties for non-compliance with the agreements

 

PO: Data ownership, i.e., the question as to who owns the data, must be clarified contractually in any case. When it comes to monitoring data, I think it is obvious that they belong to the customer. However, additional questions may arise in the future. For example, the customer, as data owner, might choose to grant an anonymized right of use to the application provider, so that the latter can aggregate a heat map of the problem points in the global transport network based on the data of many customers. This would allow customers to benefit through better planning of their transport routes. As already mentioned, having a clear contractual arrangement is important.

 

What is the significance of the geographical location, i.e., the location of the servers?

AT: The choice of the geographical storage location is relevant if you want to ensure that third parties cannot access the RU’s data under existing national laws (e.g., in the USA), and that the RU can thus avoid violating the GDPR. Another motivation is that the RU protects their intellectual property by specifying the location. The SLA should stipulate that the specified physical location is the only place where the data are stored – and not, for example, at subcontractors in other geographic regions as backups or IaaS – and this should be verified as part of the CSP qualification.

PO: The geographical storage location is critical for privacy and other legal reasons to prevent access by unwanted “authorities.” Meanwhile, in view of the increasing use of database clusters, the question of the specific server or hard disk should no longer matter, even for on-premise solutions.

 

From a regulatory point of view, does it make a difference whether I use a private instance of the application running in the cloud or a shared instance?


AT: Based on the results of the data assessment, the criticality assessment of the application, and the business continuity assessment, a decision should be made as to whether outsourcing and, in particular, the use of a CSP is possible without compromising patients and/or the quality of the drug product. If outsourcing is agreed to, the deployment model should be chosen based on criticality. Private and community cloud models are preferable to the public cloud where confidential data are concerned. The way different tenants are demarcated and separated from each other depends on the deployment model, and is stronger in a private cloud than in a public cloud.

PO: Of course, such decisions must always be made on a risk basis. From a technical point of view, it could be argued that for GxP applications, secure control of access must be ensured at the user level, i.e., at a more granular level than between different organizations sharing an instance of a cloud application. Looked at in this way, from a data security and integrity perspective, the question of public or private cloud should not matter. But there will generally be differences in other important aspects, e.g., in determining when updates are to be applied. Here, private instances certainly offer advantages.

 

What is the minimum documentation required from the cloud provider? Is a current SOC 2 report sufficient?

AT: According to Annex 11, competence and reliability of the supplier are key factors in the selection of a product or service provider. The need for an audit should be assessed in terms of risk. In other words, the higher the requirements to be met by the service and the deployment model, the more important it is to qualify and continuously monitor a CSP. Poor configuration of the infrastructure can lead to service failure, or data may be lost or compromised. Based on a risk assessment, a decision must be made as to whether an on-site audit is necessary. Certification can attest to compliance with security standards. Preference should be given to international standards that address the specific subject matter of the service: ISO/IEC 27001 (Information technology – Security techniques – Information security management systems – Requirements), ISO/IEC 27017 (security controls for cloud services), and ISO/IEC 27036-4 (guidelines for security of cloud services). In any case, the scope of the certificate should be assessed.


How do I go about auditing a cloud provider if they do not allow audits?

AT: If the outcome of the assessment shows that an on-site audit is required, then a CSP that does not allow it is not appropriate. At this point, reference should be made to the possibility of joint audits or shared audits.

BN: You also always have the option of requesting a joint audit from a cloud provider. This means that several companies/customers join forces and jointly perform an audit that is valid and applicable to all. This reduces the workload on all sides while at the same time providing good evidence of compliance with legal and customer requirements. This is currently being planned at Amazon Cloud Services in Munich, for example.

 

How can a cloud be validated? What are the approaches that are absolutely necessary?

AT: The application must be validated, and the infrastructure qualified. The validation of the application is essentially the same as for an on-premise application. Annex 11 and GAMP®5 provide relevant guidance here. The challenge is the triad of CSP, RU, and software provider. Good communication and project management are required here.

The qualification of a dynamic infrastructure, which is also subject to very dynamic ongoing evolution, is the real challenge. Here, CSPs often show weaknesses in providing the “documented evidence” that enables the RU to assume a qualified infrastructure.

PO: When it comes to validating GxP-relevant applications, such as those provided by ELPRO, there is no difference between cloud-based and on-premise solutions.

 

How can data ownership rights be protected?

PO: Since ownership rights to data are analogous to those of tangible objects, we believe that the same protection mechanisms can be applied by analogy:

    • Access to customer data by the cloud provider must be defined very clearly and set out contractually.

    • Strict limitation of data access from the application provider to customer data (only very few, selected employees of the application provider have access to the root server).

    • Access to customer data by the application provider via the root server is logged and recorded in an audit trail (see the sketch after this list).

    • Access by third parties is prevented as far as possible by clearly defined security mechanisms. The infrastructure is monitored continuously for hacker attacks. Successful hacking attacks are analyzed and documented. Hacked access data (passwords, etc.) must be replaced immediately by the cloud provider or application provider. The customer is informed about the data attack, and the new credentials are transmitted. The security gap is remedied and closed as quickly as possible by taking appropriate measures. If possible, the hacker attack is traced.

    • At the request of the customer, the data must be securely and irrevocably destroyed by the application provider, e.g., once the contract ends.

    • All of the above points, and certainly many more, should be identified and defined clearly in the SLA. The application provider should include appropriate clauses in the SLA with the cloud provider, and the customer should have an appropriate SLA agreement with the application provider.
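To make the logging point above concrete, here is a minimal sketch of how privileged (root-server) access to customer data could be recorded in an append-only audit log. All names (audit_log.jsonl, record_access, the customer ID) are illustrative assumptions, not taken from any specific provider's system:

```python
import json
import getpass
from datetime import datetime, timezone

AUDIT_LOG = "audit_log.jsonl"  # hypothetical append-only log file

def record_access(customer_id: str, action: str, reason: str) -> None:
    """Append one audit-trail entry for privileged access to customer
    data: who accessed what, when, and why."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "operator": getpass.getuser(),   # the application-provider employee
        "customer_id": customer_id,
        "action": action,                # e.g. "read", "export", "delete"
        "reason": reason,                # e.g. a support-ticket reference
    }
    # Append-only: entries are never rewritten, only added.
    with open(AUDIT_LOG, "a", encoding="utf-8") as f:
        f.write(json.dumps(entry) + "\n")

# Example: a support engineer reads monitoring data (illustrative values).
record_access("customer-042", "read", "support ticket #1234")
```

In practice, such a log would itself need to be protected against modification; the integrity sketch in the next section shows one way to make tampering detectable.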

 

How can I ensure data integrity in the cloud if I am not the cloud owner myself?

AT: According to Annex 11, physical and electronic measures should be taken to protect data from damage. The availability, readability, and accuracy of stored data should be checked. Access to data should be guaranteed throughout the retention period. In the view of the German Federal Office for Information Security (BSI), this can only be ensured by cryptographic procedures. I share that view.

PO: Encryption is the logical first starting point. To effectively rule out manipulation, blockchain-based solutions are also an option. However, the additional security must be weighed against the high computing effort and, ultimately, the distributed data storage inherent in blockchains. Still, we will see new solutions in this area in the future.
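As an illustration of the “cryptographic procedures” mentioned above, here is a minimal sketch, far simpler than a production system, of a hash chain: each record's SHA-256 digest covers the previous record's digest, so any later modification of stored data becomes detectable without a full blockchain. The function names and the sample logger data are assumptions for the example:

```python
import hashlib
import json

def chain_records(records):
    """Link records with SHA-256 so later tampering is detectable.
    Each entry's hash covers its content plus the previous hash."""
    chained, prev_hash = [], "0" * 64  # fixed genesis value
    for record in records:
        payload = json.dumps(record, sort_keys=True)
        digest = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
        chained.append({"record": record, "hash": digest})
        prev_hash = digest
    return chained

def verify_chain(chained):
    """Recompute every hash; returns False if any record was altered."""
    prev_hash = "0" * 64
    for entry in chained:
        payload = json.dumps(entry["record"], sort_keys=True)
        if hashlib.sha256((prev_hash + payload).encode()).hexdigest() != entry["hash"]:
            return False
        prev_hash = entry["hash"]
    return True

# Illustrative data-logger readings:
log = chain_records([{"t": "2020-11-01T08:00Z", "temp_c": 4.9},
                     {"t": "2020-11-01T09:00Z", "temp_c": 5.1}])
assert verify_chain(log)
log[0]["record"]["temp_c"] = 2.0   # simulated manipulation...
assert not verify_chain(log)       # ...is detected
```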

 

Do I always know the locations where my data are stored? Do I have any control over it?

AT: If the storage location is critical, it will have to be defined as part of the SLA, and verified as part of the qualification process. This is not only critical to GxP, but also to business.

 

How are cloud servers protected from external access? Who is responsible and ultimately liable for this safeguard? Is sensitive data (patient data, study results, etc.) treated differently than non-sensitive data? Can cloud data be deleted?

AT: It must be ensured for the RU that their data on all storage media and locations, as well as in all versions (e.g., various backup versions), are deleted when the business relationship is terminated, or if the RU requests this. This should be part of the SLA and qualification.

 

Is there such a thing as an audit trail in the cloud (ability to find out who made which changes in the cloud and when)?

AT: Yes, at the very least, there should be a log file.
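The source prescribes no specific format, but as an illustration, an audit-trail entry for a change typically captures at least who altered which value, when, and from what to what. A minimal sketch with hypothetical field names:

```python
from datetime import datetime, timezone

def audit_entry(user: str, field: str, old, new, reason: str) -> dict:
    """One change record: who altered which field, when, and how.
    Structure and field names are illustrative, not a prescribed format."""
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "field": field,
        "old_value": old,
        "new_value": new,
        "reason": reason,
    }

# e.g. a corrected calibration offset (illustrative values)
print(audit_entry("j.doe", "calibration_offset", 0.2, 0.3, "recalibration 2020-11"))
```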

 

During an audit of a cloud provider, how can I ensure that these are the very servers hosting my data? How knowledgeable are cloud providers about regulatory requirements, such as GAMP®5 and 21 CFR Part 11?

AT: CSPs are not required to operate under GAMP®5 or 21 CFR Part 11, but they must have an appropriate framework that complies with equivalent principles. If this framework is aligned with the principles of the EU GMP Guide and/or GAMP®5, it will be useful for the pharmaceutical industry.

 

Is it necessary and technically possible to verify the storage of data and its attributes by means of a regular report?

AT: According to Annex 11, the IT infrastructure (IaaS, PaaS) should be qualified, and the application (SaaS) validated. Specifically related to data storage, this means that data must be protected from damage by physical and electronic measures and that the availability, readability, and accuracy of the stored data must be checked (see above). Access to data should be guaranteed throughout the retention period. The following are requirements for the quality of the CSP and the data integrity (for data in motion and at rest), which are not explicitly found in the EU GMP Guide, but are considered useful from the point of view of EFG 11:

    • Transmission of data only in encrypted form, and in a manner that ensures that the data have been transmitted completely and unchanged (a minimal sketch follows this list).

    • The type of storage of critical data must be determined on a risk basis (e.g. use of appropriate cryptographic procedures).
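A minimal sketch of the transmission point above, under the assumption that the actual channel is TLS-encrypted (the transport itself is omitted here): the sender attaches a SHA-256 digest, and the receiver recomputes it to confirm the payload arrived completely and unchanged. All names and the sample payload are illustrative:

```python
import hashlib

def package_for_transfer(payload: bytes) -> dict:
    """Sender side: attach a SHA-256 digest to the payload. In practice
    the whole package would travel over an encrypted channel (e.g. TLS)."""
    return {"payload": payload, "sha256": hashlib.sha256(payload).hexdigest()}

def verify_after_transfer(package: dict) -> bytes:
    """Receiver side: recompute the digest; reject incomplete or altered data."""
    if hashlib.sha256(package["payload"]).hexdigest() != package["sha256"]:
        raise ValueError("Data were not transmitted completely and unchanged")
    return package["payload"]

sent = package_for_transfer(b'{"logger": "L-17", "temp_c": 5.0}')
assert verify_after_transfer(sent) == sent["payload"]  # intact round trip
```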

 

What is the minimum documentation required from the provider? What is the scope of the qualification of the system? What degree of validation is required for the interfaces and functions?

AT: Basically, the same requirements apply to a CSP as to a regulated user. The following deficits are frequently observed:

    • The qualification of the infrastructure has not been documented.

    • Changes to the hardware and middleware, as well as to the security-relevant components, are made without the permission of the license holder. Notification of changes is given at short notice or not at all.

    • The QM system of the cloud service provider does not comply with EU GMP standards.

    • Subcontractors/service providers are used by the CSP to build out the infrastructure (computing power, storage) without the prior consent of the license holder. Data protection regulations remain unaffected by this.

 

Conclusion

There are many questions and concerns, probably an almost infinite number. Nevertheless, we have summarized what we consider the most important questions, with the appropriate answers from different perspectives. Ultimately, however, the responsibility falls to the customer, and thus to the user of the cloud. The customer alone decides how and which path to take in order to face this issue well prepared. In the end, patient safety is, and remains, the most valuable asset to protect in the pharmaceutical sector. Software must work. Data must be available, tamper-proof, traceable, and must have their integrity protected. Signatures and authentication are equally important. Data are what make up our digital daily lives today: our virtual elixir of life. We each have to decide for ourselves who we can entrust it to. It is our hope that the answers provided here can help in decision-making. However, they are not, nor can they ever be, conclusive.

 

DOWNLOAD THE ORIGINAL ARTICLE HERE

*only available in German | PM QM issue 11/2020

 

Authors:

Dr. Arno Terhechte, Administrative District of Münster, Germany. After studying pharmacy and obtaining his doctorate, he worked for five years in the pharmaceutical industry in the business areas of national approval, international approval, and quality control, most recently in the position of Deputy Head of Control. In 1998, he joined the Düsseldorf District Government, where he was responsible for the supervision of pharmaceutical manufacturers. Since 2003, he has been working for the Münster District Government in the Pharmacy Department. Here, in addition to monitoring drug manufacturers, he conducts inspections of medical device manufacturers and operators. He is the head of Expert Group 11 "Computerized Systems" and a member of the Expert Group "Information Technology" of the Arbeitsgemeinschaft für Pharmazeutische Verfahrenstechnik e.V. (APV, Working Group for Pharmaceutical Process Engineering).

 

Patrick Pichler, Director, Head of Global Distribution, Artwork & Product Security Quality, Healthcare Quality, Global Healthcare Operations, Merck, studied biotechnology and worked for the Austrian Agency for Health and Food Safety (AGES) as an inspector for eight years. Before that, he worked for eleven years as Head of Quality Control Laboratory in several companies, among other positions. He joined Merck in 2019 and, since 2014, has held the role of Head of Distribution Quality, responsible for the area of Good Distribution Practice (GDP).

Dr. Philipp Osl, CEO ELPRO-BUCHS AG, is CEO of the ELPRO Group. Besides his studies in Business Informatics and a doctorate in Business Innovation, he looks back on experience as a project manager, product manager, and founder of a technology start-up with branches in Switzerland and Poland.

 

Björn Niggemann, President GQMA, has been Chief Quality Officer at ELPRO-BUCHS AG since April 2016. In 2004, he was initially tasked with setting up and implementing GMP in parallel with the existing DIN ISO 17025 certification. In 2007, as Compliance Manager, he established a GMP system in addition to an existing Good Laboratory Practice (GLP) system. Between 2009 and 2010, he worked for a pharmaceutical service provider as GLP/cGMP Compliance Manager. From 2010 to 2016, he worked in a Swiss biotech company in the role of Head of Operations and Quality. He is also President of the GQMA - Germany Quality Management Association e.V.
