Safeguarding Data Integrity with People, Process, and Technology
Data integrity refers to maintaining the accuracy and consistency of data over its entire life cycle, from creation and preservation through to deletion and everything in between. It is the bedrock of GLP, GCP and GMP. Executed successfully, it has the potential to empower healthcare professionals who use medicinal products and devices to improve the lives of their patients. Put simply, protecting data integrity is the foundation upon which the health, safety and wellbeing of patients and consumers rests.
The pharmaceutical manufacturing landscape continues to change, with new products and better science becoming available and new generations of companies becoming involved in this increasingly complex, connected and data-driven ecosystem. As this ecosystem is scrutinised against increased regulation, the technology that has acted as a catalyst for elements of this change becomes even more important in the practice of safeguarding the integrity of data generated by pharmaceutical organisations.
Technology is simplifying the process of collecting, monitoring and analysing data and making it quicker and easier for quality professionals to demonstrate compliance. Yet if the culture of an organisation is one that fails to fully recognise the role of data integrity in promoting safe and profitable practice, regardless of the technology used, the door remains wide open for risk to enter. Technology, however brilliant, will never be a solution in isolation. In order to effectively manage the integrity of data, it is essential that all factors that can have an adverse effect on a specific situation are explored and addressed. It is also imperative to fully incorporate the three pillars of effective quality assurance and compliance management—people, process and technology—and the role each can play in maintaining safe and successful operations.
Understanding the Context
In recent years we have seen a number of high-profile cases in the United States where quality control analytical testing results and data have been deliberately manipulated in order to falsify the test results associated with the release of medicinal products. This is admittedly an extreme example of data integrity failure, but one that has prompted regulators, and the wider quality community, to prioritise restricting the use of quality-related processes and systems that provide users with the option to manually manipulate data.
Most instances of poor data integrity result from human error or negligence, rather than malice, yet the consequences of the action, be it deliberate or accidental, remain the same. By corrupting the continuity and completeness of an audit trail, for example by removing pages of a physical record and replacing them with alternatives, or by deleting electronic records without being able to reconcile who deleted them, when, why and under what authority, integrity is irreparably compromised.
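The continuity requirement described above can be illustrated with a short sketch. This is a minimal, hypothetical example, not any specific regulated system: each audit-trail entry is chained to the hash of the entry before it, so quietly deleting or altering a record breaks the chain and the gap becomes detectable on verification.

```python
import hashlib
import json

def chain_hash(prev_hash, entry):
    """Hash the previous link together with the new entry's content."""
    payload = prev_hash + json.dumps(entry, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def append(trail, entry):
    """Append an entry, linking it to the hash of the trail so far."""
    prev = trail[-1]["hash"] if trail else "genesis"
    trail.append({"entry": entry, "hash": chain_hash(prev, entry)})

def verify(trail):
    """Recompute every link; a removed or altered record breaks the chain."""
    prev = "genesis"
    for record in trail:
        if record["hash"] != chain_hash(prev, record["entry"]):
            return False
        prev = record["hash"]
    return True

trail = []
append(trail, {"user": "analyst1", "action": "create batch record", "when": "2024-01-05T09:00"})
append(trail, {"user": "qa_lead", "action": "approve batch record", "when": "2024-01-05T11:30"})
assert verify(trail)       # an intact trail passes verification

del trail[0]               # simulate quietly removing a record
assert not verify(trail)   # the missing link is now detectable
```

Real electronic systems achieve the same effect with audit-trail features built into the platform; the point of the sketch is simply that completeness must be demonstrable, not assumed.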
To effectively promote quality in an organisation, it is paramount that data is trustworthy and accessible, be it to auditors or as part of a schedule of internal reviews against key performance indicators.
Data Integrity 2.0
Data integrity can be broken down into two core elements: physical and logical. The former refers to elements such as the physical location, personnel access controls, environmental controls and processes for retrieving or moving the data sources. The latter refers to the controls and records relating to any access and use of the electronic data after it has been created, and to the ability to demonstrate the accuracy and completeness of that data: how can we demonstrate it is still the same?
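One common way to answer the question "is it still the same?" is to record a cryptographic digest of a record at creation and recompute it at audit time. The sketch below is illustrative only, assuming nothing about any particular EQMS; the record content and batch number are invented for the example.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Return a SHA-256 digest of the record's content."""
    return hashlib.sha256(data).hexdigest()

# At creation time, store the digest alongside the record.
original = b"Assay result: 99.7% purity, batch 1042"
stored_digest = fingerprint(original)

# At any later audit, recompute and compare.
assert fingerprint(original) == stored_digest   # unchanged: integrity demonstrated

tampered = b"Assay result: 99.9% purity, batch 1042"
assert fingerprint(tampered) != stored_digest   # any edit changes the digest
```

The digest proves the bytes are unchanged; it says nothing about who accessed them or why, which is why logical integrity also requires the access controls and records described above.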
Modern technology, like Electronic Quality Management Systems (EQMS), does not just provide a quicker, more consistent and more easily accessible means to store data—to search for and to retrieve documents at the click of a mouse—it also makes data meaningful. Knowing you have reams of technical documentation, standard operating procedures perhaps, filed neatly in alphabetical order, in a secure cabinet in a locked room—or hidden within a computerised document storage system—is one thing. Being able to explore the data contained within those documents and scrutinise it in order to make informed business decisions within a few minutes is quite another. And this is where technology’s role in adding real business value as well as safeguarding effective quality assurance and compliance management is really starting to become pivotal.
The pharmaceutical sector is evolving. It is recognising the need to operate more effectively and to harness technology to its advantage. Organisations with track records of continuous improvement in these areas almost always have teams of quality experts who look beyond their daily roles to seek ‘a better way’. Consequently, true innovation in software advancement usually starts with an agile business that identifies a potential improvement, and a vendor innovative enough to develop technology to empower the change. A good example of this is the increasing requirement for demonstrable quality metrics placed on pharma and biopharma organisations, which need to be an integrated part of the QMS. A holistic approach that considers all areas of a business’ electronic document creation, storage, maintenance and retrieval needs is therefore essential to clear the foreseeable and unforeseeable hurdles that will inevitably lie between idea and implementation.
A Programme for Cultural Change
Technology aside, for the true value of data integrity to be recognised it is essential that ownership rests with each and every individual who comes into contact with that data. There is an onus on everyone within an organisation who is involved in the preparation, recording, checking, transferring, storage and use of GMP data to make sure they understand and adhere to the internal processes and regulations associated with maintaining that data’s integrity. This means there is also an obligation on such individuals to challenge anything that is considered a potential risk to the integrity of the data managed by the organisation, from the use of inappropriate methods of organising and archiving documents to blowing the whistle on instances of perceived tampering or foul play.
Yet to encourage individuals to take ownership, there must first be an appreciation of why it matters. What are the consequences of poor data integrity? And what are the benefits of effective data management? Championing data integrity or improving practice does not necessarily require financial investment or increased auditing. Often small, cultural changes, such as instigating reward and recognition programmes, appointing data integrity champions for each relevant area of the organisation and holding regular refresher courses on what best practice looks like, what is expected at individual level and what the bigger commercial picture looks like, can instil a sense of context and responsibility in all staff. If people are not clear on why data is monitored and measured, or do not feel empowered to challenge processes, how can they be expected to care about it? Equally, if methods of managing quality assurance and safeguarding data integrity are outdated or ineffective, how can organisations expect to remain compliant?