Anomaly detection is the task of identifying samples that deviate from the majority of the data. It is typically treated as a one-class classification problem in which only normal data is available for training. To characterize normal data, the high-dimensional features extracted by a CNN can be used to model normality. The last layer of a CNN, which carries the most semantic information, is generally chosen for this purpose. In contrast, this work proposes learning from multiple levels of high-dimensional features rather than relying on high-level features alone. Under the assumption that the training data contains only normal samples, we present an anomaly detection algorithm consisting of a deep feature extraction stage based on ResNet18, followed by dimensionality reduction via PCA. The anomaly classification stage comprises two class-conditional transformation models implemented as Gaussian Mixture Models. Our approach uses the feature-reconstruction error between two high-dimensional feature vectors as the anomaly score. In this study, we analyze and compare the effect of using different blocks of a pre-trained ResNet18 on a well-known industrial anomaly detection dataset. Results suggest that selecting the most suitable CNN output features can significantly improve the model's ability to detect anomalous samples.
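The scoring pipeline described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: synthetic vectors stand in for the ResNet18 block features, the component counts are arbitrary assumptions, and the combined score (PCA reconstruction error plus GMM negative log-likelihood) is one plausible reading of the abstract.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Stand-in for CNN features: in the paper these would come from an
# intermediate block of a pre-trained ResNet18 (synthetic data here).
normal_feats = rng.normal(0.0, 1.0, size=(500, 512))
test_feats = np.vstack([
    rng.normal(0.0, 1.0, size=(10, 512)),  # normal-like samples
    rng.normal(4.0, 1.0, size=(10, 512)),  # shifted, anomalous samples
])

# Dimensionality reduction fit on normal features only (one-class setting).
pca = PCA(n_components=32).fit(normal_feats)

# Density model of the reduced normal features (component count is a guess).
gmm = GaussianMixture(n_components=2, random_state=0)
gmm.fit(pca.transform(normal_feats))

def anomaly_score(x):
    """Combine PCA feature-reconstruction error with GMM log-likelihood."""
    z = pca.transform(x)
    recon = pca.inverse_transform(z)
    recon_err = np.linalg.norm(x - recon, axis=1)  # reconstruction error
    nll = -gmm.score_samples(z)                    # negative log-likelihood
    return recon_err + nll

scores = anomaly_score(test_feats)
# Anomalous samples should receive markedly higher scores than normal ones.
print(scores[:10].mean(), scores[10:].mean())
```

In practice the features for `normal_feats` would be pooled activations from a chosen ResNet18 block, and a threshold on the score would separate normal from anomalous samples.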