Abstract
Within the class of unbiased estimators of a parametric function, the variance of an estimator is one of the basic criteria for comparing and evaluating the accuracy of estimators. In many cases the variance has a complicated form and cannot be computed explicitly, so it is approximated by means of lower bounds. Many studies have been devoted to lower bounds for the variance of an unbiased estimator of a parameter.
Another common and popular method, used in many statistical problems including variance estimation, is the bootstrap. This method has advantages and disadvantages that must be kept in mind when it is applied.
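For context, the following is a minimal sketch (not taken from the paper) of how the nonparametric bootstrap approximates the variance of an estimator; the sample size, the number of resamples, and the choice of the sample mean as the unbiased estimator are illustrative assumptions.

```python
import numpy as np

def bootstrap_variance(sample, estimator, n_boot=2000, seed=None):
    """Approximate Var(estimator) by resampling the data with replacement."""
    rng = np.random.default_rng(seed)
    n = len(sample)
    # Recompute the estimator on each bootstrap resample.
    replicates = np.array([
        estimator(rng.choice(sample, size=n, replace=True))
        for _ in range(n_boot)
    ])
    # The sample variance of the bootstrap replicates serves as the
    # bootstrap approximation to the estimator's variance.
    return replicates.var(ddof=1)

# Illustrative use: the sample mean is an unbiased estimator of the
# population mean; its true variance here is sigma^2 / n = 1 / 50.
data = np.random.default_rng(0).normal(loc=0.0, scale=1.0, size=50)
print(bootstrap_variance(data, np.mean, seed=1))  # compare with 0.02
```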
In this paper, we first briefly introduce two well-known lower bounds, the "Kshirsagar" bound (one-parameter case) and the "Bhattacharyya" bound (one- and multi-parameter cases), and then extend the Kshirsagar bound to the multi-parameter case. Through examples involving several distributions, we compare the one- and multi-parameter Bhattacharyya and Kshirsagar lower bounds with the bootstrap method for approximating the variance of unbiased estimators, and show that these bounds perform better than the bootstrap method.
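For reference, the one-parameter Bhattacharyya bound of order k for an unbiased estimator T of g(θ) is commonly stated as follows; the notation f(x; θ) for the density is assumed here, and the paper's exact formulation may differ.

\[
\operatorname{Var}_\theta(T) \;\ge\; J(\theta)^{\top} B(\theta)^{-1} J(\theta),
\qquad
J_i(\theta) = \frac{d^i g(\theta)}{d\theta^i},
\qquad
B_{ij}(\theta) = E_\theta\!\left[\frac{1}{f(X;\theta)^2}\,
\frac{\partial^i f(X;\theta)}{\partial\theta^i}\,
\frac{\partial^j f(X;\theta)}{\partial\theta^j}\right],
\qquad i,j = 1,\dots,k,
\]

which for k = 1 reduces to the Cramér-Rao lower bound.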