In Bayesian hypothesis testing, one important assumption is that the prior distribution is known. In some cases, however, the prior distribution is unknown. One way to test hypotheses in such cases is to use an empirical Bayes method. This thesis deals with empirical Bayes testing for the mean of a normal distribution. First, we study the case in which the variance is known. In this situation, we obtain the empirical Bayes test, show that it is asymptotically optimal, and establish the rate at which its associated regret Bayes risk converges to zero. Afterwards, we extend the problem to the case in which the variance is unknown. In this case, we first construct an unbiased estimator of the variance and then build the empirical Bayes test. Again, we establish that the regret Bayes risk converges to zero at the same rate as in the known-variance case. It is shown that this rate is faster than the optimal rate of convergence obtained so far by others.