Abstract
In this study, a Bayesian method for identifying outlying observations in multivariate linear regression is presented. The method was originally proposed by Chaloner and Brant [3] and later extended to multivariate linear regression by Varbanov [7]. According to Chaloner and Brant, an observation is accepted as an outlier if the posterior probability that its realized error (Arnold Zellner [8]) exceeds a critical value $k$, measured in units of $\sigma$, is higher than the corresponding prior probability under the assumed error distribution. That is, if $\Pr(|\epsilon_i|/\sigma > k \mid y) > \Pr(|\epsilon_i|/\sigma > k)$, then the $i^{th}$ observation is accepted as an outlier. In the second section,
the method proposed by Varbanov [7] is considered. In the application section, the presence or absence of outlying observations in multivariate linear regression data is discussed through the posterior distribution of the quadratic form of the realized error.
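To make the criterion concrete, the following is a minimal Monte Carlo sketch for the simpler univariate normal linear regression case under the standard noninformative prior $p(\beta, \sigma^2) \propto 1/\sigma^2$, not the multivariate setting treated by Varbanov [7]. The simulated data, the cutoff $k = 2.5$, and the NumPy/SciPy sampling scheme are illustrative assumptions, not taken from the paper.

```python
# Monte Carlo sketch of the Chaloner-Brant outlier check for univariate
# linear regression under the noninformative prior p(beta, sigma^2) ~ 1/sigma^2.
# The data, cutoff k, and sampling scheme are illustrative assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated data with one deliberately perturbed observation (index 10).
n, p = 30, 2
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([1.0, 2.0])
y = X @ beta_true + rng.normal(size=n)
y[10] += 5.0                                  # shift one response upward

# Posterior: sigma^2 | y is scaled inverse chi-square,
#            beta | sigma^2, y is multivariate normal.
XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ X.T @ y
resid = y - X @ beta_hat
s2 = resid @ resid / (n - p)

n_draws = 20_000
sigma2 = (n - p) * s2 / rng.chisquare(n - p, size=n_draws)
L = np.linalg.cholesky(XtX_inv)
beta_draws = beta_hat + np.sqrt(sigma2)[:, None] * (rng.normal(size=(n_draws, p)) @ L.T)

# Scaled realized errors eps_i / sigma for each posterior draw.
eps_scaled = (y[None, :] - beta_draws @ X.T) / np.sqrt(sigma2)[:, None]

k = 2.5
post_prob = (np.abs(eps_scaled) > k).mean(axis=0)   # Pr(|eps_i|/sigma > k | y)
prior_prob = 2 * stats.norm.sf(k)                   # Pr(|eps_i|/sigma > k) a priori

flagged = np.where(post_prob > prior_prob)[0]       # posterior exceeds prior
print(f"prior probability        : {prior_prob:.4f}")
print(f"posterior prob. at obs 10: {post_prob[10]:.4f}")
print("flagged observations     :", flagged)
```

In this sketch the perturbed observation has a posterior probability near one of a scaled realized error larger than $k$, far above the prior probability $2\Phi(-k)$, and is therefore flagged by the criterion stated in the abstract.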