Matrices can be diagonalized by singular vectors or, when they are symmetric, by eigenvectors. Pairs of square matrices often admit simultaneous diagonalization, and always admit blockwise simultaneous diagonalization. Generalizing these possibilities to more than two (non-square) matrices leads to methods of simplifying three-way arrays by nonsingular transformations. Such transformations have direct applications in Tucker PCA for three-way arrays, where the core array may be transformed to simplicity without loss of fit. Simplifying arrays also facilitates the study of array rank. The typical rank of a three-way array is the smallest number of rank-one arrays that sum to the array, when the array is generated by random sampling from a continuous distribution. In some applications, the core array of Tucker PCA is constrained to have a vast majority of zero elements. Both simplicity and typical rank results can be applied to distinguish constrained Tucker PCA models from tautologies. An update of typical rank results over the real number field is given in the form of two tables.

Key words: tensor decomposition, tensor rank, typical rank, sparse arrays, Candecomp, Parafac, Tucker component analysis.

Three-way data arise when a complete set of measures is obtained for a set of individuals or objects on a set of variables, in a set of circumstances or conditions. For instance, when I subjects are evaluated on J tests on K different occasions, the data consist of an I × J × K three-way array. The array can be cut up into slices: there are K frontal slices of order I × J, J lateral slices of order I × K, and I horizontal slices of order J × K. Tucker (1966), Carroll and Chang (1970), and Harshman (1970) have proposed generalizations of principal component analysis (PCA) for the analysis of three-way arrays. For a long time, these methods were well known in psychometrics, but did not receive much attention elsewhere.
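The slicing of an I × J × K array, and the notion of expressing an array as a sum of rank-one arrays (outer products of three vectors), can be made concrete with a short sketch. The code below is purely illustrative, using NumPy with hypothetical dimensions I = 2, J = 3, K = 4; the variable names are not from the paper.

```python
import numpy as np

# Illustrative I x J x K three-way array (I=2, J=3, K=4; values arbitrary).
I, J, K = 2, 3, 4
X = np.arange(I * J * K, dtype=float).reshape(I, J, K)

# The three kinds of slices described in the text:
frontal = [X[:, :, k] for k in range(K)]     # K frontal slices, each of order I x J
lateral = [X[:, j, :] for j in range(J)]     # J lateral slices, each of order I x K
horizontal = [X[i, :, :] for i in range(I)]  # I horizontal slices, each of order J x K

# A rank-one three-way array is the outer product of three vectors a, b, c,
# with elements a_i * b_j * c_k.
a = np.array([1.0, 2.0])
b = np.array([1.0, 0.0, -1.0])
c = np.array([1.0, 1.0, 2.0, 3.0])
rank_one = np.einsum('i,j,k->ijk', a, b, c)

# The rank of a three-way array is the smallest number R of such rank-one
# arrays whose sum equals the array; every slice of a rank-one array is
# itself a (matrix) rank-one matrix.
print(np.linalg.matrix_rank(rank_one[:, :, 0]))
```

Every frontal slice of `rank_one` equals the matrix outer product of `a` and `b` scaled by an element of `c`, which is why its matrix rank is at most one.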
A few decades ago, however, chemometricians discovered that these methods are particularly useful for their own purposes of data analysis. Nowadays, three-way component analysis is at the heart of what is called tensor decomposition analysis, which has applications in areas such as signal processing, numerical linear algebra, computer vision, numerical analysis, data mining, graph analysis, neuroscience, and others (Kolda & Bader, 2009). Although tensor decomposition analysis covers multiway arrays beyond the three-way level, only three-way component analysis will be considered in the present paper.

Three-way component analysis poses various mathematical problems that have been studied extensively. Among these are the uniqueness properties of three-way decompositions and the transformational freedom that exists when they are absent, the possibilities of transforming arrays to simple forms (that is, with as many vanishing elements as possible), and the rank of three-way arrays. The present paper is focused on the latter two topics. The purpose is to explain simplifying transformations...