A novel unsupervised learning rule, called the Boundary Adaptation Rule (BAR), is introduced for scalar quantization. It is shown that the rule maximizes information-theoretic entropy and thus yields equiprobable quantizations of univariate probability density functions. Simulations show that BAR outperforms other unsupervised competitive learning rules in generating equiprobable quantizations. Depending on the input distribution, the rule can perform better or worse than the Lloyd I algorithm in minimizing average mean squared error. Finally, an application to adaptive non-uniform analog-to-digital (A/D) conversion is considered.
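To make the target concrete: an equiprobable quantization partitions the input range so that each quantization cell carries equal probability mass. The batch, quantile-based sketch below illustrates that goal only; it is not the BAR learning rule itself (which adapts boundaries online), and the function names are illustrative assumptions.

```python
import numpy as np

def equiprobable_boundaries(samples, n_levels):
    """Estimate decision boundaries so each of the n_levels cells
    holds roughly equal probability mass (an equiprobable quantizer).

    Quantile-based batch sketch for illustration -- not the BAR rule.
    """
    probs = np.arange(1, n_levels) / n_levels
    return np.quantile(samples, probs)

def quantize(x, boundaries):
    # searchsorted maps each input value to the index of its cell
    return np.searchsorted(boundaries, x)

rng = np.random.default_rng(0)
data = rng.normal(size=100_000)           # univariate input distribution
b = equiprobable_boundaries(data, 8)      # 8-level scalar quantizer
cells = quantize(data, b)
counts = np.bincount(cells, minlength=8)
# Each cell should contain close to 1/8 of the samples
```

Note that minimizing average mean squared error (as Lloyd I does) generally produces a different, non-equiprobable partition, which is why the two criteria can disagree depending on the input distribution.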