A Total Error Approach for the Validation of Quantitative Analytical Methods

by Anagha Kakade

The total error approach is a statistical technique for assessing the performance of analytical methods. It can be applied to laboratory data analysis in oncology and other clinical trials, as well as to the testing of new drugs. Because the inter-laboratory transfer of an analytical method involves variable parameters such as analysts, instruments, days/sessions and geographic location, both assignable and non-assignable causes of variation can arise. Here, statistical analysis of the measurement process helps to identify and quantify the sources of variation. The total error approach evaluates the variability in these parameters and supports the scientist in decision making.
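To make the idea concrete, here is a minimal sketch of one common formulation, in which total error combines systematic error (bias) and random error (imprecision). The function, replicate data and acceptance limit below are hypothetical, and the article’s own analysis may use a more elaborate model (for example, β-expectation tolerance intervals).

```python
import numpy as np

def total_error(measurements, reference_value, k=2.0):
    """Estimate total error as |bias| + k * SD (a simple Westgard-style model).

    This is an illustrative sketch, not the article's exact procedure.
    """
    x = np.asarray(measurements, dtype=float)
    bias = x.mean() - reference_value  # systematic error
    sd = x.std(ddof=1)                 # random error (sample SD)
    return abs(bias) + k * sd

# Hypothetical replicate results for a sample with a known value of 100.0
replicates = [98.2, 101.5, 99.8, 100.9, 97.6, 102.1]
te = total_error(replicates, reference_value=100.0)
allowable_te = 10.0  # hypothetical acceptance limit
print(f"Total error: {te:.2f} (limit {allowable_te}) ->",
      "acceptable" if te <= allowable_te else "not acceptable")
```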


Data Visualization: The Epilepsy Story

by Surabhi Dutta

How do we extract the right data to tell the most important stories, and then present them to others in a way that can make a difference for clinicians and patients? This question is truly at the heart of data visualization. In this paper, Data Visualization: The Epilepsy Story, we aim to do just that: we tell the story of epilepsy through exploratory analysis and clear visualization of the data.
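As a flavor of what such exploratory visualization might look like, here is a minimal sketch comparing seizure frequency before and during treatment. The cohort, variables and counts are entirely invented for illustration; the paper’s actual data and plots are not reproduced here.

```python
import matplotlib.pyplot as plt
import numpy as np

# Hypothetical monthly seizure counts for a small patient cohort
rng = np.random.default_rng(seed=1)
baseline = rng.poisson(lam=8, size=50)      # counts before treatment
on_treatment = rng.poisson(lam=4, size=50)  # counts on treatment

fig, ax = plt.subplots()
ax.boxplot([baseline, on_treatment], labels=["Baseline", "On treatment"])
ax.set_ylabel("Monthly seizure count")
ax.set_title("Exploratory view of seizure frequency (hypothetical data)")
plt.show()
```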


Critical KPIs for a Successful FSP Outsourcing Model

by Michael Cox

Pointillism is a technique of painting using tiny dots of various pure colors, which blend in the viewer’s eye when seen at a distance. It was developed by Georges Seurat with the aim of producing a greater degree of luminosity and brilliance of color. Alone, each colored point on the canvas is just that and only that. It is only when these colors are properly grouped together that the picture makes sense.


Dissolution Analyses: Comparison of Profiles Using f2 Analysis Calculation

by Anagha Kakade

A dosage form is the form in which a drug is produced and dispensed; for example, a tablet, capsule or suspension. Dissolution is the rate and extent to which the drug substance dissolves over a period of time, and dissolution testing is the primary pharmaceutical test designed to probe the performance of dosage forms. The dissolution profile of the developed product is compared with that of the innovator’s reference product to evaluate the release pattern and establish comparability of drug release. The purpose of this article is to provide some insight into the comparison of dissolution profiles using f2 analysis.
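As background, the f2 similarity factor is a logarithmic transformation of the mean squared difference between reference and test profiles at matching time points, and profiles with f2 of 50 or above are conventionally considered similar. Below is a minimal sketch with invented percent-dissolved values; it does not enforce the usual prerequisites for applying f2 (such as limits on variability and on the number of points above 85% dissolution).

```python
import numpy as np

def f2_similarity(reference, test):
    """Compute the f2 similarity factor between two dissolution profiles.

    f2 = 50 * log10(100 / sqrt(1 + mean squared difference)).
    """
    r = np.asarray(reference, dtype=float)
    t = np.asarray(test, dtype=float)
    msd = np.mean((r - t) ** 2)
    return 50.0 * np.log10(100.0 / np.sqrt(1.0 + msd))

# Hypothetical percent-dissolved values at matching time points
reference_profile = [25, 45, 65, 80, 90, 95]
test_profile      = [22, 42, 62, 78, 89, 94]
print(f"f2 = {f2_similarity(reference_profile, test_profile):.1f}")  # ~79.7
```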


The Risks of In-Sourcing All of Your Programming & Statistical Analysis

by Antony Goncalves

It’s certainly a challenge that most – if not all – companies face these days: which aspects of our business do we keep in-house and which pieces do we outsource? And nowhere is that more of a challenge than within the regulated environment of the pharmaceutical industry.


An Introduction to the Study Data Tabulation Model (SDTM)

by Anilkumar Jangili

For programmers who analyze clinical trial data in the pharmaceutical and biotechnology industries, the Study Data Tabulation Model (SDTM) is incredibly important and valuable. Prepared by the Submission Data Standards (SDS) team of the Clinical Data Interchange Standards Consortium (CDISC), SDTM defines how to organize, structure and format standard clinical trial tabulation datasets submitted to regulatory authorities such as the US Food and Drug Administration (FDA).
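To give a flavor of the standard, here is a minimal sketch of a Demographics (DM) domain built with pandas. The variable names (STUDYID, DOMAIN, USUBJID, AGE, AGEU, SEX, ARM, COUNTRY) are standard SDTM DM variables, but the records are invented for illustration; real submission datasets carry many more variables and follow controlled terminology.

```python
import pandas as pd

# Illustrative SDTM-style Demographics (DM) domain: one record per subject
dm = pd.DataFrame({
    "STUDYID": ["STUDY01", "STUDY01"],
    "DOMAIN":  ["DM", "DM"],
    "USUBJID": ["STUDY01-001", "STUDY01-002"],
    "AGE":     [54, 61],
    "AGEU":    ["YEARS", "YEARS"],
    "SEX":     ["F", "M"],
    "ARM":     ["Drug A", "Placebo"],
    "COUNTRY": ["USA", "USA"],
})
print(dm)
```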


Are you attending PhUSE in Deerfield, IL?

by Eliassen Group

In case you’re not familiar with PhUSE, it’s an independent organization that serves as a global platform for discussing topics surrounding programming in the regulated environments of the pharmaceutical and biotechnology industries. And on July 21st, 2016, PhUSE is hosting a single-day event entitled “Utilizing Risk-Based Monitoring to More Effectively Identify and Mitigate Risk While Ensuring Patient Safety, Data Quality and Integrity” in Deerfield, IL.
