Is ELN up to the challenge?
An area of accelerating market growth for electronic laboratory notebook (ELN) technology is analytical sciences, particularly late-stage biopharmaceutical development and quality. The large number of ELN users in the unregulated world of research has inspired organizations to pursue similar efficiency gains downstream on the R&D continuum. Quality by design (QbD) initiatives, combined with pressure to accelerate time-to-market, are motivating clients to evaluate solutions that expedite technology and method transfer.
While, for years, other industries have taken advantage of product lifecycle management (PLM) software and the ISA-S88 standard process model to streamline product commercialization, the biopharmaceutical industry has been stuck in paper-based workflows, driven both by fear of regulatory non-compliance and by a general reluctance to change. Forward-looking companies are turning to ELN as one of the tools to help build a bridge over the traditional development/quality divide. Not coincidentally, this is the hottest area of supplier investment, both for developing new capabilities and for augmenting portfolios through acquisitions.
It has been well-documented that the quest for an ever-expanding user community has resulted in LIMS and ELN products evolving beyond historic areas of comfort1 (a.k.a. “convergence”), as shown in Figure 1. R&D ELNs have added structured data management and some level of method automation, while the ELN laboratory execution systems (LES) found in quality are adding method development and LIMS-like sample tracking functionality. Not to be outdone, LIMS suppliers are adding LES-type components. In most cases, these new capabilities have been watered-down additions, less functional than purpose-built systems. Currently, the method execution modules added to research ELNs are a far cry from the depth of features found in LES products. Even within the same platform, converting a method from development into a fully functional executable requires about as much work as transferring it to another supplier’s platform.
The path chosen by customers depends on a number of factors. For some, a common platform with less functionality is sufficient, as it is advantageous from an IT management, training and single-interface perspective. It also can be less costly, since a single relationship gives more leverage over the supplier. A company may have customized its LIMS to supply many of the bench-supporting capabilities found in LES platforms. For others, meeting the detailed requirements of the laboratories is paramount, or their LIMS infrastructure is insufficient for their workflow requirements and they want a new tool to reduce dependence on an inadequate LIMS. This may lead to deploying and supporting multiple vendor solutions, which was common when the distinction between product categories was less grey. The result is some very interesting, and in some cases heated, debates inside prospective user organizations.
For method transfer, the current state is, therefore, a “brick wall” between development and quality, necessitating re-building the method in the lab execution system and undergoing several validation cycles (Figure 2). This is not only time-consuming, but expensive; the cost of method conversion has been the greatest hindrance to companies adopting an LES, especially when there are hundreds of existing validated protocols to migrate.
Wouldn’t it be great to have the best of both worlds, allowing for a seamless transition of analytical methods regardless of the platform chosen? A vision where users can be guided through development and validation of a method and transform it into an easy-to-use executable procedure independent of the platform would be ideal. Methods would need to be validated only once. In a perfect world, clients could mix and match different vendor technologies to select the best tool for the task at hand.
Unfortunately, this vision involves some level of standardization across suppliers, and we have little reason to be optimistic this will ever happen. So many initiatives of this type have failed already; suppliers seem to take the parochial view that it is not in their interest to play nicely with others. Just harmonizing terminology between internal groups at an end-user company, particularly those under regulatory control, is difficult enough. However, within a single supplier’s product suite, simpler method transfer offers a light at the end of the tunnel, at least if you purchase all the components from the same company.
Current state workflow and ELN
Consider the workflow of analytical methods for drug substances, drug products and raw materials. The work for developing a new method is free-flowing and lends itself well to many of the research ELN products on the market. In reality, for most chromatographic methods, much of the early development is performed in the chromatography data system (CDS) to evaluate the impact of changes in flow rate, gradient, injection volume, detector wavelength, etcetera. A basic template in the ELN is used for documentation of objectives, materials, buffers, standards, instrument run conditions and entry of a select set of results like a representative chromatogram. If the assay is similar to one used in the past, the ELN library can be searched to retrieve the method and use it as a starting point.
Method assessment follows for optimization and checking for analyst-to-analyst variability. This process is more rigid than early development, with a greater emphasis on documentation and data analysis. Information on standards, buffers, materials and equipment is entered into pre-formatted tables in the ELN. A typical scenario is for several analysts to run multiple experiments, uploading their results. ELN calculation templates consolidate the data and analyze it for accuracy, sensitivity, specificity and reproducibility. Depending on the outcome, the process may cycle back for further tweaking and optimization, requiring the ELN to track the various versions of the method and experiments. Once it passes specification, a manager, peers and/or quality assurance (QA) use the electronic signature workflow for review, permitting the method to progress to a full validation cycle.
During assessment, the analyst may take advantage of several modules that are commonly found in ELN products, e.g., guiding them through the creation of solutions, managing chemical inventory, performing tasks such as daily balance checks, and direct importation of instrument data through integration. These modules streamline data entry and help to comply with the documentation requirements of good manufacturing practices (GMP) in later phases.
Around the same time as assessment, a standard operating procedure (SOP) document is written that incorporates some of the ELN content on materials, equipment and instrument conditions. The SOP may describe the specific steps an analyst follows to prepare buffers, stock solutions, standards and samples, and to set up an instrument and execute its procedure. Oftentimes, the SOP, usually as a draft, is routed through yet another review workflow via a document management system.
Naturally, the process that follows for validating the method under GMP is far more stringent. After the creation of a validation plan, the ELN method tuned during the assessment phase is used as the starting point for running similar tests in a controlled manner, since the documentation can be subject to regulatory inspection. ELN templates carefully control the entry of data through lists of values, format checks and range specifications that alert the user to any exceptions. During validation, the analyst also may wish to retain the raw instrument data using a scientific data management system (SDMS)-like tool to link raw files to the record in the ELN. At the end of the process, a report describing the results of the validation is created, covering the acceptance criteria and any events and non-conformities. If approved, the method in the ELN is signed and released to a validated library that secures it against modification.
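The kind of controlled data entry described above can be sketched as a small set of template rules. This is only an illustration of the concept; the field names, limits and formats below are invented, not drawn from any vendor's product.

```python
# Hypothetical sketch of ELN template-controlled data entry during
# validation: lists of values, format checks and range specifications
# that flag exceptions. All field names and rules are illustrative.

import re

TEMPLATE_RULES = {
    "detector_wavelength_nm": {"type": float, "min": 190.0, "max": 800.0},
    "injection_volume_ul":    {"type": float, "min": 0.1,   "max": 100.0},
    "column_id":              {"type": str,   "pattern": r"^COL-\d{4}$"},
    "mobile_phase":           {"type": str,   "allowed": {"A", "B", "A/B gradient"}},
}

def check_entry(field, value):
    """Return a list of exception messages for one data entry (empty if OK)."""
    rule = TEMPLATE_RULES[field]
    if not isinstance(value, rule["type"]):
        return [f"{field}: expected {rule['type'].__name__}"]
    errors = []
    if "min" in rule and value < rule["min"]:
        errors.append(f"{field}: {value} below lower limit {rule['min']}")
    if "max" in rule and value > rule["max"]:
        errors.append(f"{field}: {value} above upper limit {rule['max']}")
    if "pattern" in rule and not re.match(rule["pattern"], value):
        errors.append(f"{field}: '{value}' does not match required format")
    if "allowed" in rule and value not in rule["allowed"]:
        errors.append(f"{field}: '{value}' is not an allowed value")
    return errors
```

In a real LES, such rules would be defined per template by an implementer, and a failed check would raise an on-screen exception rather than simply return a list.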
The method is then transferred to the manufacturing plant(s) and the quality department. If the site has an LES, or an LES-like module of a laboratory information management system (LIMS) or research and development (R&D) ELN, it is natural for them to want analysts to execute the method via that solution. This is where the pain begins. In the current state, the method and its SOP must be re-formatted and entered (or, with some solutions, actually programmed) via a proprietary method builder. The resulting SOP/form-like interface guides technicians through the procedure, documenting every step to reduce the risk of regulatory non-conformity.
In quality, converting methods from paper to electronic form requires analysis to prevent a cluttered database and a fractured data model. This also is true when development provides a “paper-on-glass” (e.g., PDF) version of a method that requires conversion. Since many of the components in the executable form might be common to other methods, the implementer must examine it to determine whether previously developed modules can be re-used. If the method is a modification of an existing one, this is relatively easy. If it is a completely new workflow, the process can be rather daunting: not just the time it takes to reformat the SOP and ELN method into the LES, but the effort that must be undertaken to validate the executable form.
A future state based on S88
In the December 2012 issue of Scientific Computing, I discussed the ISA S88 standard and how it could be applied to formulations.2 In that article I wrote, “several pharmaceutical companies are turning to the ANSI/ISA-88 and ANSI/ISA-S95 standards (commonly referred to as ‘S88’ and ‘S95’ respectively; see www.isa.org for more). S88 is a set of standards describing a process model for batch control through the description of an ordered set of procedures, materials, stages, operations and equipment” and “the key concept is a recipe, which is really no different from a recipe in a kitchen. Simply, it is a list of materials, combined in a particular order using equipment to make a product.”
Some organizations are turning to S88 and adapting the recipe concept to method formats and transfer. Bill Buote, CTO of the Analytical, Development, Quality and Manufacturing (ADQM) group at Accelrys, said that while at VelQuest (VelQuest was acquired by Accelrys in early 2012), they “quickly realized the format of methods within the system was analogous to an S88 recipe with four levels: a method has sections, sections have unit operations, unit operations contain basic operations.” In S88, a general recipe procedure has stages, stages have process operations, which have one or more process actions. A master recipe also can describe equipment and materials, all very similar to a method SOP. “With one of our customers, we took advantage of their S88 initiative to write a translation utility to move methods as recipes from their Symyx ELN (now Accelrys ELN) to the SmartLab (now Accelrys LES) platform.”
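The four-level correspondence Buote describes can be sketched as a simple data structure. The class names and the outline format below are hypothetical, intended only to show how a method hierarchy maps onto S88 recipe levels.

```python
# Sketch of the method-to-S88 mapping: method -> sections ->
# unit operations -> basic operations, paralleling S88's
# general recipe -> stages -> process operations -> process actions.
# All class and field names are illustrative, not a vendor schema.

from dataclasses import dataclass, field

@dataclass
class BasicOperation:          # S88 analogue: process action
    instruction: str

@dataclass
class UnitOperation:           # S88 analogue: process operation
    name: str
    steps: list = field(default_factory=list)      # of BasicOperation

@dataclass
class Section:                 # S88 analogue: stage
    name: str
    unit_ops: list = field(default_factory=list)   # of UnitOperation

@dataclass
class Method:                  # S88 analogue: general recipe procedure
    name: str
    sections: list = field(default_factory=list)   # of Section

def to_s88_outline(method):
    """Flatten a method into an indented S88-style recipe outline."""
    lines = [f"General recipe: {method.name}"]
    for sec in method.sections:
        lines.append(f"  Stage: {sec.name}")
        for uo in sec.unit_ops:
            lines.append(f"    Process operation: {uo.name}")
            for step in uo.steps:
                lines.append(f"      Process action: {step.instruction}")
    return "\n".join(lines)
```

A translation utility of the kind Buote mentions would, in essence, walk one such tree and emit the target platform's representation level by level.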
One of the benefits of S88 is its flexibility. This is also one of the framework’s weaknesses. The standard leaves a high degree of interpretation in how to deploy it within an organization, and it does not specifically demand that entities be described with standard attributes. Even within a single company, leveraging S88 can be difficult, as it requires a commitment to standard data representation from development through quality and manufacturing. This means a structured approach to method representation in the development process across all analysts, an extra level of change management not all companies would wish to tackle.
Buote says Accelrys is actively working on a newer approach that does not require method conversion, instead actively sharing components and entities between their two platforms. “The two systems are distinct to meet the needs of the domains they serve. The Accelrys ELN allows freedom to design new methods, while LES more rigidly controls workflows for GMP compliance. However, there needs to be a common infrastructure between the two to share properties, core modules like inventory and to manage entities.” He says their vision is to allow customers to create general method entities in development, store them in a common management platform, and load them into the LES. The site can then apply equipment and controls specific to that site and modify only the parameters necessary, streamlining transfer and eliminating much of the necessary validation. In S88 parlance, this, in effect, turns a general recipe into a site recipe and then, finally, into a master recipe. However, Buote was unwilling to provide a release date for this new functionality.
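That general-to-site-to-master progression can be sketched as successive specialization of a recipe record: the site binds equipment, then fixes the remaining parameters, and only those changes would need re-verification. The recipe fields, parameter names and equipment identifiers below are invented for illustration.

```python
# Minimal sketch of S88-style recipe specialization:
# general recipe (from development) -> site recipe (equipment bound)
# -> master recipe (remaining parameters fixed). Hypothetical names.

def make_site_recipe(general_recipe, site_equipment):
    """Bind site-specific equipment to a general recipe."""
    recipe = dict(general_recipe)          # copy; leave the general intact
    recipe["equipment"] = site_equipment
    recipe["level"] = "site"
    return recipe

def make_master_recipe(site_recipe, parameter_overrides):
    """Fix the remaining site-specific parameters."""
    recipe = dict(site_recipe)
    recipe["parameters"] = {**recipe["parameters"], **parameter_overrides}
    recipe["level"] = "master"
    return recipe

general = {
    "name": "Assay of drug substance X",
    "level": "general",
    "parameters": {"flow_rate_ml_min": 1.0, "column_temp_c": None},
    "equipment": None,   # deliberately unbound at the general level
}

site = make_site_recipe(general, site_equipment=["HPLC-07", "Balance-03"])
master = make_master_recipe(site, {"column_temp_c": 30.0})
```

The point of the structure is that the general recipe never changes: each site derives its own master, so validation effort concentrates on the site-specific deltas rather than on a full rebuild.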
ELN has improved efficiency and reduced compliance risks in both analytical development and quality. Breaking through the brick wall between the two has not been straightforward, nor will it be overcome overnight. It is important for customer organizations to determine what capabilities they need in both the short and long term and consider their existing infrastructure and business direction. Choices will have to be made and, in Atrium’s experience, those choices are not easy.
1. Elliott, Michael H., “Informatics Convergence Presents Opportunities and Challenges,” Scientific Computing, October 2011
2. Elliott, Michael H., “Structuring Data Management for ELN in Formulations,” Scientific Computing, December 2012
Michael Elliott is CEO of Atrium Research & Consulting. He may be reached at editor@ScientificComputing.com.