PharmApps: An Integrated Compound Management System for Tracking Compound Replication and Usage in High Throughput Screens

Aim: To improve quality assurance, auditability and performance in compound management systems. Background: Managing hundreds of thousands of compounds in a drug screening facility requires comprehensive data management software that can track compound usage and movement, such as replicating multiple copies of mother-daughter plates and generating intermediate and assay plates. Commercial software for Compound Management (CM) is usually expensive and requires an annual subscription fee for continued support and updates, placing it beyond the means of academic or publicly funded drug screening facilities. Objective: We report here the development of an in-house, web-based CM System (CMS), PharmApps (short for Pharmaceutical Applications), which tracks and logs compounds dispensed for primary High Throughput Screens (HTS) and compounds hit-picked for retest and dose response, and provides plate management and administrative functions. Method: A system design and development methodology that identifies and addresses the gaps in both workflow and information processes. Results: The program has security measures that allow only assigned bioassayists to access plate information for their assigned screening campaigns, while inventory functions are restricted to CM personnel. Unique features include visual plate and well dispensation status, plate formatting records, barcode information, and inventory update and query functions. The lessons drawn are applicable to both Commercial Off-The-Shelf (COTS) and in-house CMS in areas such as quality control and quality assurance, auditability/traceability and system performance. Conclusion: We have successfully created and deployed a functional CMS in our Centre. The conceptual approaches suggested apply to both commercial and customized CMS, and are highly relevant to any organization aiming to achieve a positive return on investment in such systems.


Introduction
One of the major challenges in supporting High-Throughput Screening (HTS) is the need for a high-performance CMS that can perform accurate tracking, close monitoring and error reporting while dispensing hundreds of thousands of compounds over a short period. A CMS must therefore meet high standards of Quality Control (QC), Quality Assurance (QA), auditability, traceability and system performance to ensure operational efficiency and productivity. Satisfying all these requirements entails identifying and addressing weaknesses and gaps in the entire CMS. In this report, we detail how gaps in our CM process were uncovered and surmounted.
We categorized gaps as either knowledge gaps or information gaps. A knowledge gap is information that is needed but for which the process of acquiring it is unclear. In contrast, an information gap is missing information that can be gathered through known processes.
Identifying these gaps poses the first hurdle. For knowledge gaps, there is the additional challenge of finding or creating techniques to overcome them. There were two knowledge gaps. The first was how to verify that a dispensing from a specific source well to a destination well had occurred successfully. The second was how to ascertain that a destination well contains the target compound.
Along the way, we addressed information gaps within our CM process. We also describe the challenges faced with the COTS software previously used to support CM operations and how an in-house CMS was developed to surmount them. Although the solution appears unique, we believe the approaches taken are generalizable and can be applied to either CMS or Laboratory Information Management System (LIMS) implementations.
We reviewed the literature relating to QA, QC, system costs and performance. A study of existing CMS/LIMS [1,2] painted a diverse picture. In systems such as MScreen and SAVANAH [3], there were no explicit QC measures in the software to verify that samples had been dispensed properly. Hints of dispensing problems would surface only when test results were inconsistent. For example, initial screening results might be positive, but retest results turn out negative because the wrong samples were dispensed or the target samples were not dispensed at all. Such incidents point to issues upstream but do little for problem monitoring, root cause identification or troubleshooting. In Charles et al. [4], improvements in QA and QC were gained by capturing information from Liquid Chromatography-Mass Spectrometry (LCMS) equipment into a centralized system. The accumulated records made it possible to troubleshoot low-quality samples or anomalies in biological testing.
COTS software offers great cost and time savings when a lab's workflow requirements can be met either directly or through one-off software configuration [5]. Where extensive customizations are needed, or where customizations affect system performance, its value proposition is questionable. Factors such as the cost of customization, the time and effort required, and the degree of performance degradation need to be taken into account. One way to tackle the cost of customizing COTS software is to use configurable, open-source software to support lab workflows [6].
However, capturing information from lab instruments used is necessary to validate and verify that processes are working properly in the overall CMS/LIMS. Data generated by one instrument when processing samples should be compared with information generated by the next instrument, when feasible, to ensure that there are no discrepancies. Underlying this validation approach is the observation that the output of one instrument often forms the input for the next.
Although this approach appears onerous, successful examples [7] of existing LIMS that have used rigorous software validation and verification to achieve national accreditation highlight its importance for successful implementations. Lastly, HTS operations generate huge amounts of data for analysis. We therefore expected reviews of CMS system performance to be common in the literature. Unfortunately, most of the literature on CMS does not detail performance capabilities or how performance improvements were achieved, perhaps due to commercial considerations. In the next section, we describe in detail the process of developing an in-house CMS known as PharmApps. As we did not patent our system, and because of its heavy customization and uniqueness to our environment, we have not made the source code freely available. However, we are willing to work with collaborators who are interested in implementing a similar system within their organizations.

Requirements
PharmApps was developed as a replacement for a COTS CMS.
There were no requirements for adding analytical capabilities, as those features were available in other software. Instead, sample information was transferred from PharmApps to the analytic software as necessary. We therefore focus on listing the CMS requirements in Table 1.

Modular System Architecture
PharmApps is a modular system built around a core sub-system. The JBoss application server is used as the platform for publishing in-house developed Java Enterprise Edition (JEE) modules. These modules control and coordinate the CM workflow processes and ensure that the business logic is adhered to. The application server also provides other services such as rendering web pages, controlling user authentication and information access, and simplifying the complexities of data manipulation, storage and extraction.

Modern Web Browsers and Javaserver Faces (JSF) Framework
Web browsers have been enhanced with the introduction of Asynchronous JavaScript and XML (AJAX) technology and the Document Object Model (DOM), allowing a more immersive and interactive experience for users. JSF is a Java framework that makes it easier for developers to build web applications.
Leveraging these two elements, a number of JSF frameworks have emerged to simplify the task of developing interactive and responsive web applications. By building on RichFaces, PharmApps is able to deliver a more user-friendly and interactive experience.

Analytic Software: IDBS Activity Base
Activity Base is the analytic software used by bioassayists to analyze their experimental results. Before analysis, assay plate data such as well sample descriptions, well volumes and sample concentrations must be transferred from PharmApps to Activity Base via Activity Base's PL/SQL packages. Once transferred, bioassayists can upload experimental results into Activity Base for analysis.
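As a sketch of how such a transfer might look, the snippet below builds a JDBC call to a PL/SQL stored procedure and binds one well record. The package and procedure name (`AB_TRANSFER.LOAD_PLATE`) and its parameter list are hypothetical placeholders for illustration, not Activity Base's actual API.

```java
import java.sql.CallableStatement;
import java.sql.Connection;

/** Sketch of pushing assay-plate data into the analytic software via a
 *  PL/SQL package. The procedure name and parameters are hypothetical;
 *  the real names come from the vendor's own PL/SQL API. */
public class PlateTransfer {

    /** Builds the JDBC escape-syntax call string for a procedure with n parameters. */
    public static String buildCall(String procedure, int paramCount) {
        StringBuilder sb = new StringBuilder("{call ").append(procedure).append("(");
        for (int i = 0; i < paramCount; i++) {
            sb.append(i == 0 ? "?" : ", ?");
        }
        return sb.append(")}").toString();
    }

    /** Transfers one well record; the connection and procedure are illustrative only. */
    public static void transferWell(Connection conn, String plateBarcode,
                                    String well, String sampleId,
                                    double volumeUl, double concUm) throws Exception {
        // Hypothetical procedure: barcode, well, sample, volume, concentration.
        String call = buildCall("AB_TRANSFER.LOAD_PLATE", 5);
        try (CallableStatement cs = conn.prepareCall(call)) {
            cs.setString(1, plateBarcode);
            cs.setString(2, well);
            cs.setString(3, sampleId);
            cs.setDouble(4, volumeUl);
            cs.setDouble(5, concUm);
            cs.execute();
        }
    }
}
```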

Emailing and Lightweight Directory Access Protocol (LDAP) System: Microsoft Exchange Server and Active Directory
To meet the workflow requirements, we identified three prerequisites.
a) The ability to send task alerts through email. This is provided by the emailing capabilities of Exchange.
b) Access to user information such as email addresses, departments and names. Acting as an LDAP server, the AD is the source for this.
c) Details of user roles and their project associations. These are assigned and stored within PharmApps.
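To illustrate the directory lookup, the sketch below queries an AD-backed LDAP server for a user's email address via JNDI. The server URL, base DN and bind settings are illustrative assumptions (a real AD would normally require authenticated binds); only the filter-building helper runs without a live directory.

```java
import java.util.Hashtable;
import javax.naming.Context;
import javax.naming.NamingEnumeration;
import javax.naming.directory.DirContext;
import javax.naming.directory.InitialDirContext;
import javax.naming.directory.SearchControls;
import javax.naming.directory.SearchResult;

/** Sketch of looking up a user's email address via LDAP (Active Directory).
 *  Server address, base DN and attribute names are illustrative only. */
public class UserDirectory {

    /** Builds an AD search filter for a given account name. */
    public static String emailFilter(String account) {
        return "(&(objectClass=user)(sAMAccountName=" + account + "))";
    }

    /** Queries AD for the user's mail attribute; requires a live server. */
    public static String lookupEmail(String account) throws Exception {
        Hashtable<String, String> env = new Hashtable<>();
        env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
        env.put(Context.PROVIDER_URL, "ldap://ad.example.org:389"); // placeholder
        DirContext ctx = new InitialDirContext(env);
        try {
            SearchControls sc = new SearchControls();
            sc.setSearchScope(SearchControls.SUBTREE_SCOPE);
            sc.setReturningAttributes(new String[] { "mail" });
            NamingEnumeration<SearchResult> results =
                ctx.search("dc=example,dc=org", emailFilter(account), sc);
            return results.hasMore()
                ? (String) results.next().getAttributes().get("mail").get()
                : null;
        } finally {
            ctx.close();
        }
    }
}
```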

System Framework
The PharmApps system is written using JEE 6 and runs on the JBoss application server.

System Pre-Requisites
All assay plates and tubes processed through PharmApps must carry barcodes in one- or two-dimensional format, and all barcodes must be unique. One-dimensional barcodes are generated using a running number, while two-dimensional barcodes are provided by a vendor (with the assurance that there are no replicates). Plates are organized into categories, and each category can contain further sub-groups in a format emulating folders, sub-folders and so on, with plates resembling files. Users can also identify assay plates/wells that are closely related; the association arises because the LCMS plates and the assay plates are both prepared from the same source or ancestor.
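A minimal sketch of how barcode issuance and uniqueness might be enforced is shown below. The `PA` prefix and seven-digit running number are invented for illustration; the same registration check can guard vendor-supplied two-dimensional barcodes.

```java
import java.util.HashSet;
import java.util.Set;

/** Sketch of 1-D barcode issuance with a running number, plus a uniqueness
 *  check that also guards vendor-supplied 2-D barcodes on registration.
 *  The prefix and number format are illustrative assumptions. */
public class BarcodeRegistry {
    private final Set<String> issued = new HashSet<>();
    private final String prefix;
    private long counter = 0;

    public BarcodeRegistry(String prefix) { this.prefix = prefix; }

    /** Issues the next 1-D barcode, e.g. PA-0000001. */
    public String nextOneD() {
        String code = String.format("%s-%07d", prefix, ++counter);
        register(code);
        return code;
    }

    /** Registers any barcode (including vendor 2-D codes); rejects duplicates. */
    public void register(String code) {
        if (!issued.add(code)) {
            throw new IllegalStateException("Duplicate barcode: " + code);
        }
    }
}
```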

Results and Discussion
Work on the new system began in October 2013. PharmApps was running in parallel with its predecessor system (a customized COTS software) in December 2014 and officially took over one month later. In total, the project took one year and three months. During this period, one system analyst was assigned solely to the project.

Challenges Revisited
The predecessor of PharmApps recorded dispensing information through a series of user interactions. Initial source plate information was uploaded from manually created files.
Subsequent information on compound transfer was generated via user instructions. For example, a user would instruct the system to create a destination plate and transfer a certain volume from a specific source plate to it. The whole process was done in silico and not verified against other sources. Thus, actual events could differ from recorded events, such as when the wrong source plate was used for dispensing. Such errors were hard to trace, and the only clue was large deviations uncovered in experimental results. It was hard to reconstruct what happened from the users' recollections, as it was difficult to recall specific details against a background of voluminous activities and fading memories. All these factors placed severe strains on the QA and QC process. We realized the need to improve QA and QC through traceability and auditability across the whole workflow process. Gaps in the CM process had created blind spots that weakened the entire QC chain.

System Design and Development
A review of the previous CMS was conducted, followed by a series of interviews with end-users (mainly CM administrators and bioassayists) and a study of the HTS operations. Both explicit and implicit concerns were documented. In the process, an information gap (Figure 5, dotted red circle) was identified, arising from the explicit need to match dispensing with requests. As a result, sample requests from bioassayists were captured in a new module. Implicit concerns were given special attention, as they indicate the existence of knowledge gaps in the current environment. Two knowledge gaps were deduced: a) whether a source-to-destination well transfer was successful, and b) whether a library compound is present in a well.
Figure 5: An analysis of a typical HTS operation revealed both information (dotted red circle) and knowledge (normal red circle) gaps. We overcame the information gap by recording the information generated by the users. The knowledge gaps required the development of new techniques for capturing inputs from dispensing instruments. By overcoming these gaps, we created an integrated workflow that enhances overall operational quality.
Addressing the first knowledge gap required developing new tracking capabilities in the dispensing process (Figure 5, normal red circle). In PharmApps, dispensing tasks are tracked by recording their interactions with lab equipment. Through the event logs, there is a real-time, accurate and reliable way of knowing how and when the dispensing was done, as well as the equipment used (see Application module 3 in the Materials and Methods section). The later addition of LCMS equipment and its corresponding module addressed the second knowledge gap. PharmApps was designed for massive data processing, from plate generation to queries. Generating two hundred 384-well plates took less than fifteen minutes. These outcomes were achieved by combining large numbers of transactions into one or more batch processes.
Compared with processing one well at a time, both throughput and performance were vastly improved. The use of asynchronous batch jobs also improved the user experience.
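The batching idea can be sketched as follows: well records are buffered and written in one round-trip per batch rather than one per well. The batch size and flush mechanism here are illustrative assumptions; a real implementation would typically use JDBC batch execution.

```java
import java.util.ArrayList;
import java.util.List;

/** Sketch of batch-oriented plate generation: well records are buffered and
 *  flushed in one round-trip per batch instead of one per well.
 *  Batch size and the flush mechanism are illustrative only. */
public class BatchWriter {
    private final int batchSize;
    private final List<String> buffer = new ArrayList<>();
    private int flushes = 0;

    public BatchWriter(int batchSize) { this.batchSize = batchSize; }

    /** Buffers one well record; flushes automatically when the batch fills. */
    public void addWell(String record) {
        buffer.add(record);
        if (buffer.size() >= batchSize) flush();
    }

    /** In the real system this would be a JDBC executeBatch() call. */
    public void flush() {
        if (buffer.isEmpty()) return;
        flushes++;           // one round-trip covers the whole batch
        buffer.clear();
    }

    public int flushCount() { return flushes; }
}
```

For two hundred 384-well plates (76,800 well records) and a batch size of 1,000, this performs 77 round-trips instead of 76,800 single-well writes.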

Lessons Learned
Our experience with a COTS system and a subsequent in-house CMS highlights the importance of considering the type of research, as well as the mix of instruments in a lab, during implementation.
It is also important to alert key decision makers whenever discrepancies are detected and allow reviews at crucial stages.
Doing so will improve productivity and operational effectiveness. Note that steps 2, 3 and 4 may not fully address the knowledge gaps uncovered in step 1. For example, PharmApps was unable to determine the presence of a library compound in a well until the LCMS module was added.
A three-step approach was taken to integrate the workflows: a) Identify the information that will impact the users and their operational effectiveness, b) Extract, gather and disseminate this information to the affected users, and c) Develop the workflow process that will automate this information exchange.
The degree of integration with lab equipment varies from loose coupling to tight coupling. An example of loose coupling is copying and sanitizing the equipment log files manually before feeding them into the CMS/LIMS. With tight coupling, the system uses the API of the lab equipment software to actively interrogate and influence the equipment's operations; for example, checking that the correct source and destination plates were used and halting the operation if discrepancies are detected. The extent of coupling depends on user requirements and technical feasibility.
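A tight-coupling check of this kind might look like the sketch below, which compares the barcodes scanned by the instrument against the plates the dispensing task expects and halts on a mismatch. All names and the halt mechanism are illustrative.

```java
/** Sketch of a tight-coupling check: before a dispense step runs, the
 *  barcodes scanned by the instrument are compared with the plates the
 *  task expects; a mismatch halts the run. Names are illustrative only. */
public class DispenseGuard {

    /** True only when both scanned plates match the expected plates. */
    public static boolean verify(String expectedSource, String expectedDest,
                                 String scannedSource, String scannedDest) {
        return expectedSource.equals(scannedSource)
            && expectedDest.equals(scannedDest);
    }

    /** Throws to halt the operation when the plates do not match. */
    public static void checkOrHalt(String expSrc, String expDst,
                                   String scanSrc, String scanDst) {
        if (!verify(expSrc, expDst, scanSrc, scanDst)) {
            throw new IllegalStateException(
                "Plate mismatch: expected " + expSrc + "->" + expDst
                + " but instrument scanned " + scanSrc + "->" + scanDst);
        }
    }
}
```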
Although integrating workflow processes and lab equipment within a CMS offers many benefits, there are dangers involved when the constraints of integration are not well understood. These constraints are described below. Both the old system and the new PharmApps are currently in use, as modules are migrated one at a time to avoid a "Big Bang" cut-over. This is possible because user sessions are transparently created in both systems upon login and destroyed when the user logs out. When users click on a migrated module in the old system's menu, they are simply transferred to the new PharmApps, and vice versa.

Constraints of Integration
Although more effort is needed to ensure that both systems' menus and appearance are similar, this reduces any adverse effect on the overall user experience. We created PharmApps to meet specific requirements and to operate within our unique environment. It would not be appropriate to publish its source code without detailed guidance on how to customize it for another lab environment, which would necessitate a process of close collaboration and knowledge transfer. As such, we are willing to offer our knowledge and expertise to any interested collaborators seeking to implement a similar system.