
The Hessian matrix is covered in just about any standard calculus book.

The Hessian is a generalization of the second derivative from elementary calculus. Recall that the second derivative of a function f(x) lets you distinguish convex (f'' > 0) and concave (f'' < 0) parts of the function. At a local extremum (f' = 0), it lets you distinguish maxima (f'' < 0) from minima (f'' > 0).

The Hessian does the same thing for functions of several variables: it tells you in which spatial directions your function is convex or concave. At a local maximum it is concave in all directions, and at a local minimum it is convex in all directions. At a saddle point, there are directions in which it is concave and directions in which it is convex. The eigendecomposition of the Hessian finds these directions: the eigenvectors are the directions, and the signs of the corresponding eigenvalues tell you whether the function is convex (positive) or concave (negative) along each one.
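As a quick sketch of the eigendecomposition idea, here is the classic saddle f(x, y) = x^2 - y^2 at the origin (a critical point), using numpy (the function choice is just an illustration, not from the comment above):

```python
import numpy as np

# Hessian of f(x, y) = x^2 - y^2. For this quadratic it is
# constant: d2f/dx2 = 2, d2f/dy2 = -2, mixed partials = 0.
H = np.array([[2.0, 0.0],
              [0.0, -2.0]])

# Eigendecomposition: eigenvalue signs classify the critical point,
# eigenvectors give the corresponding spatial directions.
eigvals, eigvecs = np.linalg.eigh(H)  # eigvals in ascending order

print(eigvals)        # one negative, one positive -> saddle point
print(eigvecs[:, 0])  # direction of eigenvalue -2: the y-axis, where f is concave
print(eigvecs[:, 1])  # direction of eigenvalue +2: the x-axis, where f is convex
```

Mixed signs among the eigenvalues mean a saddle; all positive would mean a local minimum, all negative a local maximum.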

It is useful for function approximation because, by Taylor's theorem, the second-degree coefficients of the polynomial that best fits your function near a point are the entries of the Hessian matrix (up to a factor of 1/2).
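A minimal numerical check of that second-order Taylor expansion, f(x0 + h) ≈ f(x0) + ∇f·h + (1/2) hᵀHh, using an arbitrary smooth test function (the specific f, x0, and h are assumptions for illustration):

```python
import numpy as np

def f(x):
    return np.sin(x[0]) + np.cos(x[1])

x0 = np.array([0.3, 0.5])

# Gradient and Hessian of f at x0, computed analytically.
grad = np.array([np.cos(x0[0]), -np.sin(x0[1])])
H = np.diag([-np.sin(x0[0]), -np.cos(x0[1])])  # mixed partials are zero here

h = np.array([0.01, -0.02])

# Note the 1/2: the quadratic coefficients of the best-fitting
# polynomial are half the Hessian entries.
approx = f(x0) + grad @ h + 0.5 * h @ H @ h

print(abs(f(x0 + h) - approx))  # error is O(|h|^3), so very small here
```

Shrinking h should shrink the error cubically, which is what distinguishes the second-order fit from the linear (gradient-only) one.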


