BHHH is a second-order algorithm that (conceptually) uses the outer product of the per-observation scores to approximate the Hessian. This is justified by the information matrix equality in statistics, which states that, at the true parameter, the expected outer product of the score equals the negative expected Hessian of the log-likelihood: E[s(θ)s(θ)'] = −E[H(θ)]. The sample average of the score outer products is therefore a consistent estimator of the (negative) Hessian, and since it only needs first derivatives it can usually be computed much more cheaply than the full Hessian. This method is widely used in statistics.
Is there an implementation of BHHH in Optim.jl, or are there any plans to add it?
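For concreteness, here is a minimal sketch of the BHHH update described above, not Optim.jl API: it maximizes a log-likelihood given a user-supplied function returning the matrix of per-observation scores, and uses the self-outer-product `S'S` in place of the negative Hessian. The names `bhhh` and `scores` are hypothetical.

```julia
using LinearAlgebra

# Minimal BHHH sketch (illustration only, not Optim.jl API).
# `scores(θ)` must return an n×p matrix whose i-th row is the gradient
# of the i-th log-likelihood contribution at θ.
function bhhh(scores, θ0; maxiter=100, tol=1e-8, step=1.0)
    θ = copy(θ0)
    for _ in 1:maxiter
        S = scores(θ)              # n×p matrix of per-observation scores
        g = vec(sum(S; dims=1))    # total gradient of the log-likelihood
        norm(g) < tol && break
        B = Symmetric(S' * S)      # OPG approximation to the negative Hessian
        θ += step * (B \ g)        # ascent step: θ ← θ + B⁻¹g
    end
    return θ
end

# Toy example: MLE of the mean of N(μ, 1) data. The score of
# observation xᵢ with respect to μ is (xᵢ − μ).
x = randn(500) .+ 2.0
scores(θ) = reshape(x .- θ[1], :, 1)
θ̂ = bhhh(scores, [0.0])   # converges to the sample mean
```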
I'm well aware of BHHH, but I'm not sure what you would want beyond the Newton method. Is it that you want Optim to automatically form the outer product of the score using AD?
Okay, then I suppose you'd have to either a) have a vector objective type that can be interpreted according to some aggregation (a sum here, I suppose, since you'd have likelihood contributions in your use case), or b) simply a constructor that builds a normal objective type from the contributions? A sketch of option a) follows below.
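Here is one way option a) could look, assuming ForwardDiff for the AD part. The type name `VectorObjective` and the `scores` helper are hypothetical, not existing Optim.jl API: the point is just that a vector of contributions gives you both the scalar objective (by summing) and the per-observation scores (the Jacobian) that BHHH needs.

```julia
using ForwardDiff

# Hypothetical vector objective holding per-observation log-likelihood
# contributions; none of these names exist in Optim.jl.
struct VectorObjective{F}
    contributions::F   # θ -> vector of per-observation log-likelihoods
end

# Scalar objective via sum aggregation (the usual log-likelihood).
(vo::VectorObjective)(θ) = sum(vo.contributions(θ))

# Per-observation scores via AD: the n×p Jacobian of the contributions,
# whose self-outer-product S'S is the BHHH Hessian approximation.
scores(vo::VectorObjective, θ) = ForwardDiff.jacobian(vo.contributions, θ)

# Usage: Normal(μ, 1) data; contribution of xᵢ is −(xᵢ − μ)²/2 up to a constant.
x = randn(100) .+ 2.0
vo = VectorObjective(θ -> -0.5 .* (x .- θ[1]).^2)
S = scores(vo, [0.0])   # 100×1 matrix of per-observation scores
```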