Mechanistic toxicogenomics also showed advantages over conventional toxicity measures
in a study of cardiotoxicity [197]. During a two-week study at a high dose of a cardiotoxicant, rats showed myocardial degeneration and necrosis. At the low and middle doses, however, there was no evidence of cardiotoxicity using the traditional toxicological endpoints [198]. A mechanistic toxicogenomic investigation revealed only minor gene expression pattern alterations in the hearts of the low-dose group. However, rats given the high dose for one and five days showed striking and similar gene expression alterations, even though there were no clinical signs or symptoms in rats treated with the high dose on day one. Thus, the alteration in gene expression provided earlier detection than the traditional toxicological endpoints. Exploring these results further, the researchers determined that a number of the differentially regulated genes were related to mitochondrial impairment. The authors emphasized that the gene expression assay can be used to generate a hypothesis: that the mechanism of toxicity was inhibition of mitochondrial function. Further studies are needed to confirm this hypothesis.
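As an illustration of how "differentially regulated" genes are commonly flagged in such an analysis, the following is a minimal sketch using simulated data; it is not the pipeline from [197], and the sample sizes and thresholds are assumptions chosen only for illustration. Genes are called differentially expressed when they pass both a fold-change cutoff and a Welch t-test on log2 expression values.

```python
# Minimal, hypothetical sketch of flagging differentially regulated genes.
# Simulated log2 expression values; thresholds are illustrative only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_genes = 5000

# Four control and four treated (high-dose) arrays per gene.
control = rng.normal(8.0, 1.0, size=(n_genes, 4))
treated = rng.normal(8.0, 1.0, size=(n_genes, 4))
treated[:50] += 2.0  # pretend 50 genes respond to the cardiotoxicant

log2_fc = treated.mean(axis=1) - control.mean(axis=1)   # per-gene log2 fold change
t_stat, p_val = stats.ttest_ind(treated, control, axis=1, equal_var=False)

# A common first-pass rule: |log2 FC| >= 1 (i.e., two-fold) and p < 0.01.
hits = np.where((np.abs(log2_fc) >= 1.0) & (p_val < 0.01))[0]
print(f"{hits.size} candidate differentially regulated genes")
```

In a real study, such candidates would then be mapped to biological processes (here, mitochondrial function) and, as the authors note, treated as hypothesis-generating rather than confirmatory.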
6.5 EFFORTS TO ADDRESS TECHNICAL CHALLENGES IN TOXICOGENOMICS
There are many technical issues that need to be addressed when using high-throughput
technologies such as transcriptomics, proteomics, and metabolomics for toxicogenomics. For
example, microarray experiments are fraught with potential sources of variation, including the many procedural choices at each step that can alter target quality [199,200]. In particular, the comparability and reliability of microarray gene expression data across laboratories have previously been questioned [201-203]. The overall quality of a specific array design also depends on the consistency of manufacturing and the limits of the platform's dynamic range [204-207]. To
address these concerns, the microarray community and regulatory agencies have developed
a consortium to establish a set of quality assurance and quality control criteria to assess and
assure data quality, to identify critical factors affecting data quality, and to optimize and standardize microarray procedures so that biological interpretation and regulatory decision making
are not based on unreliable data. These fundamental issues were addressed by the MicroArray
Quality Control (MAQC) project consortium (http://edkb.fda.gov/MAQC/, accessed on Nov.
18, 2012). The MAQC project originally aimed to establish quality control metrics and thresholds for the objective assessment of the performance achievable by different microarray platforms [208]. Subsequently, the project addressed the parallel issues related to genome-wide association studies (GWAS), biomarker development and outcome prediction, and, more
recently, NGS. In addition to addressing data quality metrics, the MAQC project evaluated
the merits and limitations of various data analysis methodologies [209]. It confirmed that, with
careful experimental design and appropriate data transformation and analysis, microarray
data are reproducible and comparable across different platforms from different laboratories.
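As a rough illustration of what "appropriate data transformation and analysis" can mean in practice, the sketch below uses simulated data and simplified steps; it is not the MAQC consortium's actual analysis. It quantile-normalizes log2-scale intensities within each platform, derives per-gene treated-versus-control log ratios, and summarizes cross-platform comparability as the correlation of those ratios between two simulated platforms.

```python
# Illustrative sketch only: simulated data, not the MAQC analysis itself.
import numpy as np

def quantile_normalize(mat):
    """Give every column (array) the same empirical intensity distribution."""
    ranks = np.argsort(np.argsort(mat, axis=0), axis=0)
    ref = np.sort(mat, axis=0).mean(axis=1)   # mean distribution across arrays
    return ref[ranks]

rng = np.random.default_rng(1)
n_genes = 2000
effect = rng.normal(0.0, 1.0, n_genes)        # shared treatment effect (log2 scale)

def platform_log_ratios(bias):
    """Simulate one platform (3 control + 3 treated arrays); return log ratios."""
    control = rng.normal(8.0, 1.0, (n_genes, 3)) + bias
    treated = control + effect[:, None] + rng.normal(0.0, 0.3, (n_genes, 3))
    norm = quantile_normalize(np.hstack([control, treated]))
    return norm[:, 3:].mean(axis=1) - norm[:, :3].mean(axis=1)

ratio_a = platform_log_ratios(bias=0.0)
ratio_b = platform_log_ratios(bias=0.5)       # platform-specific intensity offset
r = np.corrcoef(ratio_a, ratio_b)[0, 1]
print(f"cross-platform correlation of log ratios: {r:.3f}")
```

With a shared underlying effect and only platform-specific noise and offsets, the log ratios correlate strongly across the two simulated platforms, which is the kind of concordance the MAQC findings describe; real comparisons additionally involve probe mapping, batch effects, and more sophisticated normalization.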
It is anticipated that the MAQC projects will help to improve microarray and other emerging biomarker technologies and foster their appropriate use in the discovery, development, and review of FDA-regulated products. The results of these efforts are published in a compendium of papers [210-212]. At the time of this writing, the MAQC consortium had completed the third phase of MAQC, also known as Sequencing Quality Control (SEQC).