Some measures of multivariate skewness and kurtosis were proposed by Mardia (1970). For further theoretical investigation and application, alternative forms of these measures are obtained. One of these forms is convenient for computer programming and, incidentally, yields a simpler proof of the invariance property of the measures. For mixtures of multivariate normal distributions, the measures are studied in relation to a measure of non-normality due to Day (1969). Our investigation, together with that of Hopkins and Clay (1963), indicates that the size of normal-theory tests of covariance matrices is extremely sensitive to kurtosis. A method for deriving the exact moments of these measures in samples from a multivariate normal population is developed; our approach is closer to that of Geary (1933) for the univariate case than to that of Fisher (1930), who first solved the problem. Some suitable small-sample approximations to the null distributions of the measures are then derived. Monte Carlo studies and these approximations are used to calculate critical values for a test of multivariate normality.
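As a minimal sketch of the computational form referred to above, the sample versions of Mardia's skewness $b_{1,p}$ and kurtosis $b_{2,p}$ can be computed from the matrix of scaled quadratic forms $g_{ij} = (\mathbf{x}_i - \bar{\mathbf{x}})' \mathbf{S}^{-1} (\mathbf{x}_j - \bar{\mathbf{x}})$, with $b_{1,p} = n^{-2} \sum_{i,j} g_{ij}^3$ and $b_{2,p} = n^{-1} \sum_i g_{ii}^2$. The function below (the name `mardia_measures` is ours, not the paper's) is an illustration of this, not a definitive implementation:

```python
import numpy as np

def mardia_measures(x):
    """Sample multivariate skewness b_{1,p} and kurtosis b_{2,p}
    of Mardia (1970) for an (n, p) data matrix x."""
    x = np.asarray(x, dtype=float)
    n, p = x.shape
    centered = x - x.mean(axis=0)
    # Biased (divide-by-n) sample covariance matrix S.
    s = centered.T @ centered / n
    s_inv = np.linalg.inv(s)
    # g[i, j] = (x_i - xbar)' S^{-1} (x_j - xbar)
    g = centered @ s_inv @ centered.T
    b1p = (g ** 3).sum() / n ** 2       # skewness
    b2p = (np.diag(g) ** 2).sum() / n   # kurtosis
    return b1p, b2p
```

Because $\mathbf{S}^{-1}$ standardizes the data internally, both statistics are unchanged under any nonsingular affine transformation of the sample, which is the invariance property mentioned above.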