{"id":424,"date":"2025-10-09T17:50:32","date_gmt":"2025-10-09T21:50:32","guid":{"rendered":"http:\/\/stephendavies.org\/nlp\/?p=424"},"modified":"2025-10-09T17:50:32","modified_gmt":"2025-10-09T21:50:32","slug":"its-the-jacobian-not-the-hessian","status":"publish","type":"post","link":"http:\/\/stephendavies.org\/nlp\/index.php\/2025\/10\/09\/its-the-jacobian-not-the-hessian\/","title":{"rendered":"It&#8217;s the Jacobian, not the Hessian"},"content":{"rendered":"<p>I misspoke today in response to Garrett&#8217;s question about a vector-valued loss function (instead of a scalar loss function). If your loss (or any other) function is a vector of values, then the matrix of partial derivatives of each of those values with respect to each of the function&#8217;s inputs is called the <b>Jacobian<\/b> matrix. It&#8217;s normally denoted as \\( J_{\\!f}(x) \\), and its entries are \\( J_{{\\!f}_{ij}}(x) = \\frac{\\partial f_i}{\\partial x_j} \\).<\/p>\n<p>The <b>Hessian<\/b> matrix, \\( H_{\\!f}(x) \\), is actually similar to the Jacobian, but it contains the <i>second<\/i>-order partial derivatives of a <i>scalar<\/i>-valued function. (In other words, its entries are \\( H_{{\\!f}_{ij}}=\\frac{\\partial^2 f}{\\partial x_i\\,\\partial x_j} \\).)<\/p>\n","protected":false},"excerpt":{"rendered":"<p>I misspoke today in response to Garrett&#8217;s question about a vector-valued loss function (instead of a scalar loss function). If your loss (or any other) function is a vector of values, then the matrix of partial derivatives of each of those values with respect to each of the function&#8217;s inputs is called the Jacobian matrix. 
It&#8217;s normally [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":"","_links_to":"","_links_to_target":""},"categories":[1],"tags":[],"class_list":["post-424","post","type-post","status-publish","format-standard","hentry","category-uncategorized"],"_links":{"self":[{"href":"http:\/\/stephendavies.org\/nlp\/index.php\/wp-json\/wp\/v2\/posts\/424","targetHints":{"allow":["GET"]}}],"collection":[{"href":"http:\/\/stephendavies.org\/nlp\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"http:\/\/stephendavies.org\/nlp\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"http:\/\/stephendavies.org\/nlp\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"http:\/\/stephendavies.org\/nlp\/index.php\/wp-json\/wp\/v2\/comments?post=424"}],"version-history":[{"count":8,"href":"http:\/\/stephendavies.org\/nlp\/index.php\/wp-json\/wp\/v2\/posts\/424\/revisions"}],"predecessor-version":[{"id":432,"href":"http:\/\/stephendavies.org\/nlp\/index.php\/wp-json\/wp\/v2\/posts\/424\/revisions\/432"}],"wp:attachment":[{"href":"http:\/\/stephendavies.org\/nlp\/index.php\/wp-json\/wp\/v2\/media?parent=424"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"http:\/\/stephendavies.org\/nlp\/index.php\/wp-json\/wp\/v2\/categories?post=424"},{"taxonomy":"post_tag","embeddable":true,"href":"http:\/\/stephendavies.org\/nlp\/index.php\/wp-json\/wp\/v2\/tags?post=424"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}
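To make the Jacobian/Hessian distinction concrete, here is a small illustrative sketch, not part of the original post: plain-Python finite-difference approximations, with made-up example functions `f` (vector-valued, so it has a Jacobian) and `g` (scalar-valued, so it has a Hessian). The Hessian is computed here as the Jacobian of the numerical gradient, which mirrors the relationship described above.

```python
import math

def jacobian(f, x, h=1e-6):
    """Finite-difference Jacobian: J[i][j] ~ d f_i / d x_j (central differences)."""
    n = len(x)
    m = len(f(x))
    J = [[0.0] * n for _ in range(m)]
    for j in range(n):
        xp = list(x); xp[j] += h
        xm = list(x); xm[j] -= h
        fp, fm = f(xp), f(xm)
        for i in range(m):
            J[i][j] = (fp[i] - fm[i]) / (2 * h)
    return J

def hessian(g, x, h=1e-4):
    """Finite-difference Hessian of a *scalar* g: H[i][j] ~ d^2 g / (dx_i dx_j),
    computed as the Jacobian of the numerical gradient of g."""
    grad = lambda y: jacobian(lambda z: [g(z)], y, h)[0]
    return jacobian(grad, x, h)

# Example (hypothetical): f maps R^2 -> R^2, so its Jacobian is 2x2.
# Analytically, J = [[2xy, x^2], [5, cos y]].
f = lambda v: [v[0]**2 * v[1], 5 * v[0] + math.sin(v[1])]
J = jacobian(f, [1.0, 2.0])

# g maps R^2 -> R, so its Hessian is 2x2 of second partials.
# Analytically, H = [[2y, 2x], [2x, 0]].
g = lambda v: v[0]**2 * v[1]
H = hessian(g, [1.0, 2.0])
```

At the point (1, 2), the numerical `J` should be close to [[4, 1], [5, cos 2]] and `H` close to [[4, 2], [2, 0]], matching the analytic partials noted in the comments.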