Akimichi Takemura (Center for Data Science Education and Research, Shiga University), Use of Asymptotics for Holonomic Gradient Method

Abstract: In the usual implementation of the holonomic gradient method (HGM), a vector of lower-order derivatives of a function is numerically integrated along a path parameterized by a scalar. The initial values of the derivatives are typically evaluated at a point corresponding to a small positive value of the scalar. As the scalar diverges to infinity, some of the derivatives may also diverge, which causes numerical instability. In this case it helps to rescale the vector by the asymptotic behavior of the function and its derivatives. We discuss the use of such asymptotics in HGM with some examples.
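To make the rescaling idea concrete, here is a small illustrative sketch (not taken from the talk): the modified Bessel function I_0(t) satisfies t f'' + f' - t f = 0, so the vector F = (f, f') obeys a Pfaffian system dF/dt = A(t) F that can be integrated numerically from exact initial values at a small t. Since I_0(t) grows like e^t / sqrt(2 pi t), F overflows for large t; integrating the rescaled vector G = e^{-t} F, which satisfies dG/dt = (A(t) - I) G and stays bounded, avoids the instability.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.special import i0, i1, i0e

# Pfaffian system for f = I_0(t):  F = (f, f'),  dF/dt = A(t) F
# with A(t) = [[0, 1], [1, -1/t]].  We integrate the rescaled
# vector G = exp(-t) F, which obeys dG/dt = (A(t) - I) G.
def rhs(t, G):
    A = np.array([[0.0, 1.0], [1.0, -1.0 / t]])
    return (A - np.eye(2)) @ G

t0, t1 = 1.0, 700.0  # unrescaled I_0(t) would overflow near t ~ 710
# Exact initial values at the small parameter value t0 (I_0' = I_1).
G0 = np.exp(-t0) * np.array([i0(t0), i1(t0)])
sol = solve_ivp(rhs, (t0, t1), G0, rtol=1e-10, atol=1e-14)

g_end = sol.y[0, -1]
# Compare with SciPy's exponentially scaled Bessel function i0e(t) = exp(-t) I_0(t).
print(g_end, i0e(t1))
```

The same pattern applies to normalizing constants in statistics: divide the vector of derivatives by the known leading asymptotic term before integrating toward large parameter values.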

Session: Holonomic Gradient Method in Statistics

The holonomic gradient method is a new method for evaluating normalizing constants and their derivatives in statistics. HGM algorithms combine algorithms from computer algebra, numerical analysis, and geometry; the interplay among them yields a powerful new method in statistics.