6.3. Generalized Logarithm Transformation

The following two-component measurement error model is proposed to model the measured expression levels:

(1) $y = \alpha + \mu e^{\eta} + \epsilon$

where $y$ is the measured raw expression level, $\alpha$ is the mean background noise, $\mu$ is the true expression level, and $\eta$ and $\epsilon$ are the multiplicative and additive measurement errors, which are assumed to be normally distributed with mean 0 and variances $\sigma_\eta^2$ and $\sigma_\epsilon^2$, respectively [17,18,19]. Under this model, the variance of $y$ is $\mathrm{Var}(y) = \mu^2 S_\eta^2 + \sigma_\epsilon^2$, where $S_\eta^2 = e^{\sigma_\eta^2}\left(e^{\sigma_\eta^2} - 1\right)$. To ease the analysis of gene-expression microarrays with standard statistical techniques, the following generalized logarithm (GLOG) transformation, which stabilizes this variance, has been proposed:

(2) $f_c(z) = \ln\dfrac{z + \sqrt{z^2 + c^2}}{2}$

where $c = \sigma_\epsilon / S_\eta$. The performance of the GLOG has been studied further, and simulation results show that it is a better choice than the "started logarithm" and "log-linear hybrid" transformations [20,60,61,62,63,64,65,66].
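As a minimal illustrative sketch (not taken from the cited studies), the simulation below generates intensities from the two-component model in Equation (1) and applies the GLOG of Equation (2) to the background-corrected values. The parameter values (alpha, sigma_eta, sigma_eps) and the grid of true expression levels are hypothetical choices for illustration only; the printed standard deviations show the raw spread growing with $\mu$ while the GLOG-transformed spread stays roughly constant.

```python
# Sketch of the two-component error model (Eq. 1) and the GLOG transform (Eq. 2).
# All parameter values below are illustrative assumptions, not values from the paper.
import numpy as np

rng = np.random.default_rng(0)

alpha = 100.0      # mean background noise (assumed)
sigma_eta = 0.2    # SD of multiplicative error eta (assumed)
sigma_eps = 50.0   # SD of additive error epsilon (assumed)

def simulate_raw_expression(mu, n, rng):
    """Draw measured intensities y = alpha + mu * exp(eta) + eps (Eq. 1)."""
    eta = rng.normal(0.0, sigma_eta, size=n)
    eps = rng.normal(0.0, sigma_eps, size=n)
    return alpha + mu * np.exp(eta) + eps

def glog(z, c):
    """Generalized logarithm f_c(z) = ln((z + sqrt(z^2 + c^2)) / 2) (Eq. 2)."""
    return np.log((z + np.sqrt(z**2 + c**2)) / 2.0)

# c = sigma_eps / S_eta, with S_eta^2 = exp(sigma_eta^2) * (exp(sigma_eta^2) - 1).
S_eta = np.sqrt(np.exp(sigma_eta**2) * (np.exp(sigma_eta**2) - 1.0))
c = sigma_eps / S_eta

# Compare the spread of raw vs. GLOG-transformed, background-corrected intensities.
for mu in (10.0, 100.0, 1000.0, 10000.0):
    y = simulate_raw_expression(mu, 10_000, rng)
    z = y - alpha  # subtract the mean background before transforming
    print(f"mu={mu:>8.0f}  sd(raw)={np.std(y):10.2f}  sd(glog)={np.std(glog(z, c)):.3f}")
```

Note that the GLOG remains well defined for background-corrected values that happen to be negative, since $z + \sqrt{z^2 + c^2} > 0$ for any real $z$, which is one practical advantage over the plain logarithm.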