{"id":4372,"date":"2021-02-07T18:00:07","date_gmt":"2021-02-07T18:00:07","guid":{"rendered":"https:\/\/www.aiproblog.com\/index.php\/2021\/02\/07\/function-optimization-with-scipy\/"},"modified":"2021-02-07T18:00:07","modified_gmt":"2021-02-07T18:00:07","slug":"function-optimization-with-scipy","status":"publish","type":"post","link":"https:\/\/www.aiproblog.com\/index.php\/2021\/02\/07\/function-optimization-with-scipy\/","title":{"rendered":"Function Optimization With SciPy"},"content":{"rendered":"<p>Author: Jason Brownlee<\/p>\n<div>\n<p>Optimization involves finding the inputs to an objective function that result in the minimum or maximum output of the function.<\/p>\n<p>The open-source Python library for scientific computing called SciPy provides a suite of optimization algorithms. Many of the algorithms are used as a building block in other algorithms, most notably machine learning algorithms in the scikit-learn library.<\/p>\n<p>These <strong>optimization algorithms<\/strong> can be used directly in a standalone manner to optimize a function. 
Most notable are the algorithms for local search and the algorithms for global search: the two main types of optimization you may encounter on a machine learning project.<\/p>\n<p>In this tutorial, you will discover optimization algorithms provided by the SciPy library.<\/p>\n<p>After completing this tutorial, you will know:<\/p>\n<ul>\n<li>The SciPy library provides a suite of different optimization algorithms for different purposes.<\/li>\n<li>The local search optimization algorithms available in SciPy.<\/li>\n<li>The global search optimization algorithms available in SciPy.<\/li>\n<\/ul>\n<p>Let\u2019s get started.<\/p>\n<div id=\"attachment_11746\" style=\"width: 809px\" class=\"wp-caption aligncenter\">\n<img decoding=\"async\" aria-describedby=\"caption-attachment-11746\" loading=\"lazy\" class=\"size-full wp-image-11746\" src=\"https:\/\/machinelearningmastery.com\/wp-content\/uploads\/2021\/02\/Function-Optimization-With-SciPy.jpg\" alt=\"Function Optimization With SciPy\" width=\"799\" height=\"533\" srcset=\"http:\/\/3qeqpr26caki16dnhd19sv6by6v.wpengine.netdna-cdn.com\/wp-content\/uploads\/2021\/02\/Function-Optimization-With-SciPy.jpg 799w, http:\/\/3qeqpr26caki16dnhd19sv6by6v.wpengine.netdna-cdn.com\/wp-content\/uploads\/2021\/02\/Function-Optimization-With-SciPy-300x200.jpg 300w, http:\/\/3qeqpr26caki16dnhd19sv6by6v.wpengine.netdna-cdn.com\/wp-content\/uploads\/2021\/02\/Function-Optimization-With-SciPy-768x512.jpg 768w\" sizes=\"(max-width: 799px) 100vw, 799px\"><\/p>\n<p id=\"caption-attachment-11746\" class=\"wp-caption-text\">Function Optimization With SciPy<br \/>Photo by <a href=\"https:\/\/www.flickr.com\/photos\/mlemos\/3125484412\/\">Manoel Lemos<\/a>, some rights reserved.<\/p>\n<\/div>\n<h2>Tutorial Overview<\/h2>\n<p>This tutorial is divided into three parts; they are:<\/p>\n<ol>\n<li>Optimization With SciPy<\/li>\n<li>Local Search With SciPy<\/li>\n<li>Global Search With SciPy<\/li>\n<\/ol>\n<h2>Optimization With SciPy<\/h2>\n<p>The Python SciPy 
open-source library for scientific computing provides a suite of optimization techniques.<\/p>\n<p>Many of the algorithms are used as building blocks for other algorithms within the SciPy library, as well as machine learning libraries such as scikit-learn.<\/p>\n<p>Before we review specific techniques, let\u2019s look at the types of algorithms provided by the library.<\/p>\n<p>They are:<\/p>\n<ul>\n<li>\n<strong>Scalar Optimization<\/strong>: Optimization of a convex single-variable function.<\/li>\n<li>\n<strong>Local Search<\/strong>: Optimization of a unimodal multiple-variable function.<\/li>\n<li>\n<strong>Global Search<\/strong>: Optimization of a multimodal multiple-variable function.<\/li>\n<li>\n<strong>Least Squares<\/strong>: Solve linear and non-linear least squares problems.<\/li>\n<li>\n<strong>Curve Fitting<\/strong>: Fit a curve to a data sample.<\/li>\n<li>\n<strong>Root Finding<\/strong>: Find the root (input that gives an output of zero) of a function.<\/li>\n<li>\n<strong>Linear Programming<\/strong>: Linear optimization subject to constraints.<\/li>\n<\/ul>\n<p>All algorithms assume that the objective function being optimized is to be minimized. If your function is to be maximized, it can be converted to a minimization by adding a negative sign to the values returned from your objective function.<\/p>\n<p>In addition to the above list, the library also provides utility functions used by some of the algorithms, as well as the Rosenbrock test problem.<\/p>\n<p>For a good overview of the capabilities of the SciPy library for optimization, see:<\/p>\n<ul>\n<li>\n<a href=\"https:\/\/docs.scipy.org\/doc\/scipy\/reference\/optimize.html\">Optimization and root finding (scipy.optimize) API<\/a>.<\/li>\n<\/ul>\n<p>Now that we have a high-level idea of the types of optimization techniques supported by the library, let\u2019s take a closer look at two groups of algorithms we are more likely to use in applied machine learning. 
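<\/p>\n<p>As a quick aside, the maximization-to-minimization conversion described above can be sketched with a short, self-contained example. The quadratic objective below is invented purely for illustration, and the sketch assumes the <em>minimize_scalar()<\/em> function from SciPy, which minimizes a function of a single variable.<\/p>

```python
# convert a maximization problem to a minimization by negating the objective
from scipy.optimize import minimize_scalar

# hypothetical objective we wish to MAXIMIZE, with a peak at x=2 of value 5
def objective(x):
	return -(x - 2.0)**2 + 5.0

# minimize the negated objective instead
result = minimize_scalar(lambda x: -objective(x))
# report the location of the maximum and the (un-negated) objective value there
print('Maximum at x = %.5f, f(x) = %.5f' % (result.x, objective(result.x)))
```

<p>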
They are local search and global search.<\/p>\n<h2>Local Search With SciPy<\/h2>\n<p><a href=\"https:\/\/en.wikipedia.org\/wiki\/Local_search_(optimization)\">Local search<\/a>, or local function optimization, refers to algorithms that seek the input to a function that results in the minimum or maximum output where the function or constrained region being searched is assumed to have a single optimum, i.e. it is unimodal.<\/p>\n<p>The function that is being optimized may or may not be convex, and may have one or more input variables.<\/p>\n<p>A local search optimization may be applied directly to optimize a function if the function is believed or known to be unimodal; otherwise, the local search algorithm may be applied to fine-tune the result of a global search algorithm.<\/p>\n<p>The SciPy library provides local search via the <a href=\"https:\/\/docs.scipy.org\/doc\/scipy\/reference\/generated\/scipy.optimize.minimize.html\">minimize() function<\/a>.<\/p>\n<p>The <em>minimize()<\/em> function takes as input the objective function that is being minimized and the initial point from which to start the search, and returns an <a href=\"https:\/\/docs.scipy.org\/doc\/scipy\/reference\/generated\/scipy.optimize.OptimizeResult.html\">OptimizeResult<\/a> that summarizes the success or failure of the search and the details of the solution if found.<\/p>\n<pre class=\"urvanov-syntax-highlighter-plain-tag\">...\r\n# minimize an objective function\r\nresult = minimize(objective, point)<\/pre>\n<p>Additional information about the objective function can be provided if known, such as the bounds on the input variables, a function for computing the first derivative of the function (gradient or <a href=\"https:\/\/en.wikipedia.org\/wiki\/Jacobian_matrix_and_determinant\">Jacobian matrix<\/a>), a function for computing the second derivative of the function (<a href=\"https:\/\/en.wikipedia.org\/wiki\/Hessian_matrix\">Hessian matrix<\/a>), and any constraints on the 
inputs.<\/p>\n<p>Importantly, the function provides the \u201c<em>method<\/em>\u201d argument that allows the specific optimization algorithm used in the local search to be specified.<\/p>\n<p>A suite of popular local search algorithms is available, such as:<\/p>\n<ul>\n<li>\n<a href=\"https:\/\/en.wikipedia.org\/wiki\/Nelder%E2%80%93Mead_method\">Nelder-Mead Algorithm<\/a> (method=\u2019Nelder-Mead\u2019).<\/li>\n<li>\n<a href=\"https:\/\/en.wikipedia.org\/wiki\/Newton%27s_method\">Newton\u2019s Method<\/a> (method=\u2019Newton-CG\u2019).<\/li>\n<li>\n<a href=\"https:\/\/en.wikipedia.org\/wiki\/Powell%27s_method\">Powell\u2019s Method<\/a> (method=\u2019Powell\u2019).<\/li>\n<li>\n<a href=\"https:\/\/en.wikipedia.org\/wiki\/Broyden%E2%80%93Fletcher%E2%80%93Goldfarb%E2%80%93Shanno_algorithm\">BFGS Algorithm and extensions<\/a> (method=\u2019BFGS\u2019).<\/li>\n<\/ul>\n<p>The example below demonstrates how to minimize a two-dimensional convex function using the L-BFGS-B local search algorithm.<\/p>\n<pre class=\"urvanov-syntax-highlighter-plain-tag\"># l-bfgs-b algorithm local optimization of a convex function\r\nfrom scipy.optimize import minimize\r\nfrom numpy.random import rand\r\n\r\n# objective function\r\ndef objective(x):\r\n\treturn x[0]**2.0 + x[1]**2.0\r\n\r\n# define range for input\r\nr_min, r_max = -5.0, 5.0\r\n# define the starting point as a random sample from the domain\r\npt = r_min + rand(2) * (r_max - r_min)\r\n# perform the l-bfgs-b algorithm search\r\nresult = minimize(objective, pt, method='L-BFGS-B')\r\n# summarize the result\r\nprint('Status : %s' % result['message'])\r\nprint('Total Evaluations: %d' % result['nfev'])\r\n# evaluate solution\r\nsolution = result['x']\r\nevaluation = objective(solution)\r\nprint('Solution: f(%s) = %.5f' % (solution, evaluation))<\/pre>\n<p>Running the example performs the optimization and reports the success or failure of the search, the number of function evaluations performed, and the input that resulted in the optimum of 
the function.<\/p>\n<pre class=\"urvanov-syntax-highlighter-plain-tag\">Status : b'CONVERGENCE: NORM_OF_PROJECTED_GRADIENT_&lt;=_PGTOL'\r\nTotal Evaluations: 9\r\nSolution: f([3.38059583e-07 3.70089258e-07]) = 0.00000<\/pre>\n<p>Now that we are familiar with using a local search algorithm with SciPy, let\u2019s look at global search.<\/p>\n<h2>Global Search With SciPy<\/h2>\n<p><a href=\"https:\/\/en.wikipedia.org\/wiki\/Global_optimization\">Global search<\/a>, or global function optimization, refers to algorithms that seek the input to a function that results in the minimum or maximum output where the function or constrained region being searched is assumed to have multiple local optima, i.e. it is multimodal.<\/p>\n<p>The function that is being optimized is typically nonlinear, nonconvex, and may have one or more input variables.<\/p>\n<p>Global search algorithms are typically stochastic, meaning that they make use of randomness in the search process and may or may not manage a population of candidate solutions as part of the search.<\/p>\n<p>The SciPy library provides a number of stochastic global optimization algorithms, each via different functions. 
They are:<\/p>\n<ul>\n<li>\n<a href=\"https:\/\/en.wikipedia.org\/wiki\/Basin-hopping\">Basin Hopping Optimization<\/a> via the <a href=\"https:\/\/docs.scipy.org\/doc\/scipy\/reference\/generated\/scipy.optimize.basinhopping.html\">basinhopping()<\/a> function.<\/li>\n<li>\n<a href=\"https:\/\/en.wikipedia.org\/wiki\/Differential_evolution\">Differential Evolution Optimization<\/a> via the <a href=\"https:\/\/docs.scipy.org\/doc\/scipy\/reference\/generated\/scipy.optimize.differential_evolution.html\">differential_evolution()<\/a> function.<\/li>\n<li>\n<a href=\"https:\/\/en.wikipedia.org\/wiki\/Simulated_annealing\">Simulated Annealing<\/a> via the <a href=\"https:\/\/docs.scipy.org\/doc\/scipy\/reference\/generated\/scipy.optimize.dual_annealing.html\">dual_annealing()<\/a> function.<\/li>\n<\/ul>\n<p>The library also provides the <a href=\"https:\/\/docs.scipy.org\/doc\/scipy\/reference\/generated\/scipy.optimize.shgo.html\">shgo()<\/a> function for simplicial homology global optimization and the <a href=\"https:\/\/docs.scipy.org\/doc\/scipy\/reference\/generated\/scipy.optimize.brute.html\">brute()<\/a> function for grid search optimization.<\/p>\n<p>Each algorithm returns an <a href=\"https:\/\/docs.scipy.org\/doc\/scipy\/reference\/generated\/scipy.optimize.OptimizeResult.html\">OptimizeResult<\/a> object that summarizes the success or failure of the search and the details of the solution if found.<\/p>\n<p>The example below demonstrates how to minimize a two-dimensional multimodal function using simulated annealing.<\/p>\n<pre class=\"urvanov-syntax-highlighter-plain-tag\"># simulated annealing global optimization for a multimodal objective function\r\nfrom scipy.optimize import dual_annealing\r\n\r\n# objective function\r\ndef objective(v):\r\n\tx, y = v\r\n\treturn (x**2 + y - 11)**2 + (x + y**2 - 7)**2\r\n\r\n# define range for input\r\nr_min, r_max = -5.0, 5.0\r\n# define the bounds on the search\r\nbounds = [[r_min, r_max], [r_min, r_max]]\r\n# perform the simulated annealing 
search\r\nresult = dual_annealing(objective, bounds)\r\n# summarize the result\r\nprint('Status : %s' % result['message'])\r\nprint('Total Evaluations: %d' % result['nfev'])\r\n# evaluate solution\r\nsolution = result['x']\r\nevaluation = objective(solution)\r\nprint('Solution: f(%s) = %.5f' % (solution, evaluation))<\/pre>\n<p>Running the example performs the optimization and reports the success or failure of the search, the number of function evaluations performed, and the input that resulted in the optimum of the function.<\/p>\n<pre class=\"urvanov-syntax-highlighter-plain-tag\">Status : ['Maximum number of iteration reached']\r\nTotal Evaluations: 4028\r\nSolution: f([-3.77931027 -3.283186 ]) = 0.00000<\/pre>\n<h2>Further Reading<\/h2>\n<p>This section provides more resources on the topic if you are looking to go deeper.<\/p>\n<h3>APIs<\/h3>\n<ul>\n<li>\n<a href=\"https:\/\/docs.scipy.org\/doc\/scipy\/reference\/tutorial\/optimize.html\">Optimization (scipy.optimize) API<\/a>.<\/li>\n<li>\n<a href=\"https:\/\/docs.scipy.org\/doc\/scipy\/reference\/optimize.html\">Optimization and root finding (scipy.optimize) API<\/a>.<\/li>\n<\/ul>\n<h3>Articles<\/h3>\n<ul>\n<li>\n<a href=\"https:\/\/en.wikipedia.org\/wiki\/Local_search_(optimization)\">Local search (optimization), Wikipedia<\/a>.<\/li>\n<li>\n<a href=\"https:\/\/en.wikipedia.org\/wiki\/Global_optimization\">Global optimization, Wikipedia<\/a>.<\/li>\n<\/ul>\n<h2>Summary<\/h2>\n<p>In this tutorial, you discovered optimization algorithms provided by the SciPy library.<\/p>\n<p>Specifically, you learned:<\/p>\n<ul>\n<li>The SciPy library provides a suite of different optimization algorithms for different purposes.<\/li>\n<li>The local search optimization algorithms available in SciPy.<\/li>\n<li>The global search optimization algorithms available in SciPy.<\/li>\n<\/ul>\n<p><strong>Do you have any questions?<\/strong><br \/>\nAsk your questions in the comments below and I will do my best to 
answer.<\/p>\n<p>The post <a rel=\"nofollow\" href=\"https:\/\/machinelearningmastery.com\/function-optimization-with-scipy\/\">Function Optimization With SciPy<\/a> appeared first on <a rel=\"nofollow\" href=\"https:\/\/machinelearningmastery.com\/\">Machine Learning Mastery<\/a>.<\/p>\n<\/div>\n<p><a href=\"https:\/\/machinelearningmastery.com\/function-optimization-with-scipy\/\">Go to Source<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Author: Jason Brownlee Optimization involves finding the inputs to an objective function that result in the minimum or maximum output of the function. The open-source [&hellip;] <span class=\"read-more-link\"><a class=\"read-more\" href=\"https:\/\/www.aiproblog.com\/index.php\/2021\/02\/07\/function-optimization-with-scipy\/\">Read More<\/a><\/span><\/p>\n","protected":false},"author":1,"featured_media":4373,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_bbp_topic_count":0,"_bbp_reply_count":0,"_bbp_total_topic_count":0,"_bbp_total_reply_count":0,"_bbp_voice_count":0,"_bbp_anonymous_reply_count":0,"_bbp_topic_count_hidden":0,"_bbp_reply_count_hidden":0,"_bbp_forum_subforum_count":0,"footnotes":""},"categories":[24],"tags":[],"_links":{"self":[{"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/posts\/4372"}],"collection":[{"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/comments?post=4372"}],"version-history":[{"count":0,"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/posts\/4372\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/media\/4373"}],"wp:att
achment":[{"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/media?parent=4372"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/categories?post=4372"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/tags?post=4372"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}