{"id":4330,"date":"2021-01-24T18:00:14","date_gmt":"2021-01-24T18:00:14","guid":{"rendered":"https:\/\/www.aiproblog.com\/index.php\/2021\/01\/24\/how-to-use-nelder-mead-optimization-in-python\/"},"modified":"2021-01-24T18:00:14","modified_gmt":"2021-01-24T18:00:14","slug":"how-to-use-nelder-mead-optimization-in-python","status":"publish","type":"post","link":"https:\/\/www.aiproblog.com\/index.php\/2021\/01\/24\/how-to-use-nelder-mead-optimization-in-python\/","title":{"rendered":"How to Use Nelder-Mead Optimization in Python"},"content":{"rendered":"<p>Author: Jason Brownlee<\/p>\n<div>\n<p>The <strong>Nelder-Mead optimization<\/strong> algorithm is a widely used approach for non-differentiable objective functions.<\/p>\n<p>As such, it is generally referred to as a pattern search algorithm and is used as a local or global search procedure, addressing challenging nonlinear and potentially noisy and multimodal function optimization problems.<\/p>\n<p>In this tutorial, you will discover the Nelder-Mead optimization algorithm.<\/p>\n<p>After completing this tutorial, you will know:<\/p>\n<ul>\n<li>The Nelder-Mead optimization algorithm is a type of pattern search that does not use function gradients.<\/li>\n<li>How to apply the Nelder-Mead algorithm for function optimization in Python.<\/li>\n<li>How to interpret the results of the Nelder-Mead algorithm on noisy and multimodal objective functions.<\/li>\n<\/ul>\n<p>Let\u2019s get started.<\/p>\n<div id=\"attachment_11694\" style=\"width: 810px\" class=\"wp-caption aligncenter\">\n<img decoding=\"async\" aria-describedby=\"caption-attachment-11694\" loading=\"lazy\" class=\"size-full wp-image-11694\" src=\"https:\/\/machinelearningmastery.com\/wp-content\/uploads\/2021\/01\/How-to-Use-the-Nelder-Mead-Optimization-in-Python.jpg\" alt=\"How to Use the Nelder-Mead Optimization in Python\" width=\"800\" height=\"396\" 
srcset=\"http:\/\/3qeqpr26caki16dnhd19sv6by6v.wpengine.netdna-cdn.com\/wp-content\/uploads\/2021\/01\/How-to-Use-the-Nelder-Mead-Optimization-in-Python.jpg 800w, http:\/\/3qeqpr26caki16dnhd19sv6by6v.wpengine.netdna-cdn.com\/wp-content\/uploads\/2021\/01\/How-to-Use-the-Nelder-Mead-Optimization-in-Python-300x149.jpg 300w, http:\/\/3qeqpr26caki16dnhd19sv6by6v.wpengine.netdna-cdn.com\/wp-content\/uploads\/2021\/01\/How-to-Use-the-Nelder-Mead-Optimization-in-Python-768x380.jpg 768w\" sizes=\"(max-width: 800px) 100vw, 800px\"><\/p>\n<p id=\"caption-attachment-11694\" class=\"wp-caption-text\">How to Use the Nelder-Mead Optimization in Python<br \/>Photo by <a href=\"https:\/\/www.flickr.com\/photos\/23155134@N06\/47126513761\/\">Don Graham<\/a>, some rights reserved.<\/p>\n<\/div>\n<h2>Tutorial Overview<\/h2>\n<p>This tutorial is divided into three parts; they are:<\/p>\n<ol>\n<li>Nelder-Mead Algorithm<\/li>\n<li>Nelder-Mead Example in Python<\/li>\n<li>Nelder-Mead on Challenging Functions\n<ol>\n<li>Noisy Optimization Problem<\/li>\n<li>Multimodal Optimization Problem<\/li>\n<\/ol>\n<\/li>\n<\/ol>\n<h2>Nelder-Mead Algorithm<\/h2>\n<p><a href=\"https:\/\/en.wikipedia.org\/wiki\/Nelder%E2%80%93Mead_method\">Nelder-Mead<\/a> is an optimization algorithm named after the developers of the technique, <a href=\"https:\/\/en.wikipedia.org\/wiki\/John_Nelder\">John Nelder<\/a> and <a href=\"https:\/\/en.wikipedia.org\/wiki\/Roger_Mead\">Roger Mead<\/a>.<\/p>\n<p>The algorithm was described in their 1965 paper titled \u201c<a href=\"https:\/\/academic.oup.com\/comjnl\/article-abstract\/7\/4\/308\/354237\">A Simplex Method For Function Minimization<\/a>\u201d and has become a standard and widely used technique for function optimization.<\/p>\n<p>It is appropriate for one-dimensional or multidimensional functions with numerical inputs.<\/p>\n<p>Nelder-Mead is a <a href=\"https:\/\/en.wikipedia.org\/wiki\/Pattern_search_(optimization)\">pattern search optimization algorithm<\/a>, 
which means it does not require or use function gradient information and is appropriate for optimization problems where the gradient of the function is unknown or cannot be reasonably computed.<\/p>\n<p>It is often used for multidimensional nonlinear function optimization problems, although it can get stuck in local optima.<\/p>\n<blockquote>\n<p>Practical performance of the Nelder\u2013Mead algorithm is often reasonable, though stagnation has been observed to occur at nonoptimal points. Restarting can be used when stagnation is detected.<\/p>\n<\/blockquote>\n<p>\u2014 Page 239, <a href=\"https:\/\/amzn.to\/3lCRqX9\">Numerical Optimization<\/a>, 2006.<\/p>\n<p>A starting point must be provided to the algorithm, which may be the endpoint of another global optimization algorithm or a random point drawn from the domain.<\/p>\n<p>Given that the algorithm may get stuck, it may benefit from multiple restarts with different starting points.<\/p>\n<blockquote>\n<p>The Nelder-Mead simplex method uses a simplex to traverse the space in search of a minimum.<\/p>\n<\/blockquote>\n<p>\u2014 Page 105, <a href=\"https:\/\/amzn.to\/31J3I8l\">Algorithms for Optimization<\/a>, 2019.<\/p>\n<p>The algorithm works by using a shape structure (called a simplex) composed of <em>n<\/em> + 1 points (vertices), where <em>n<\/em> is the number of input dimensions to the function.<\/p>\n<p>For example, on a two-dimensional problem that may be plotted as a surface, the shape structure would be composed of three points represented as a triangle.<\/p>\n<blockquote>\n<p>The Nelder-Mead method uses a series of rules that dictate how the simplex is updated based on evaluations of the objective function at its vertices.<\/p>\n<\/blockquote>\n<p>\u2014 Page 106, <a href=\"https:\/\/amzn.to\/31J3I8l\">Algorithms for Optimization<\/a>, 2019.<\/p>\n<p>The points of the shape structure are evaluated and simple rules are used to decide how to move the points of the shape based on their relative evaluation. 
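For intuition, one such update can be sketched in a few lines of Python. The snippet below is a simplified illustration only (not SciPy's implementation): it performs a single reflection step, replacing the worst vertex with its mirror image through the centroid of the remaining vertices.

```python
# one reflection step of a Nelder-Mead style simplex update
# (a simplified illustration only, not SciPy's actual implementation)
import numpy as np

# the simple convex objective used later in this tutorial
def objective(x):
	return x[0]**2.0 + x[1]**2.0

# a simplex of n + 1 = 3 vertices for a two-dimensional problem
simplex = np.array([[3.0, 2.0], [2.5, 3.0], [1.0, 1.0]])
# order the vertices from best (lowest) to worst (highest) evaluation
simplex = simplex[np.argsort([objective(v) for v in simplex])]
worst = simplex[-1].copy()
# centroid of all vertices except the worst
centroid = simplex[:-1].mean(axis=0)
# reflect the worst vertex through the centroid (reflection coefficient 1)
reflected = centroid + 1.0 * (centroid - worst)
# keep the reflected point if it improves on the worst vertex
if objective(reflected) < objective(worst):
	simplex[-1] = reflected
print(simplex)
```

Repeating such updates moves the whole simplex downhill using only function evaluations, which is why no gradient is ever needed.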
This includes operations such as \u201c<em>reflection<\/em>,\u201d \u201c<em>expansion<\/em>,\u201d \u201c<em>contraction<\/em>,\u201d and \u201c<em>shrinkage<\/em>\u201d of the simplex shape on the surface of the objective function.<\/p>\n<blockquote>\n<p>In a single iteration of the Nelder\u2013Mead algorithm, we seek to remove the vertex with the worst function value and replace it with another point with a better value. The new point is obtained by reflecting, expanding, or contracting the simplex along the line joining the worst vertex with the centroid of the remaining vertices. If we cannot find a better point in this manner, we retain only the vertex with the best function value, and we shrink the simplex by moving all other vertices toward this value.<\/p>\n<\/blockquote>\n<p>\u2014 Page 238, <a href=\"https:\/\/amzn.to\/3lCRqX9\">Numerical Optimization<\/a>, 2006.<\/p>\n<p>The search stops when the points converge on an optimum, when a minimum difference between evaluations is observed, or when a maximum number of function evaluations are performed.<\/p>\n<p>Now that we have a high-level idea of how the algorithm works, let\u2019s look at how we might use it in practice.<\/p>\n<h2>Nelder-Mead Example in Python<\/h2>\n<p>The Nelder-Mead optimization algorithm can be used in Python via the <a href=\"https:\/\/docs.scipy.org\/doc\/scipy\/reference\/optimize.minimize-neldermead.html\">minimize() function<\/a>.<\/p>\n<p>This function requires that the \u201c<em>method<\/em>\u201d argument be set to \u201c<em>nelder-mead<\/em>\u201d to use the Nelder-Mead algorithm. 
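The method also accepts an "options" dictionary for tuning its stopping criteria. A minimal sketch is below; the particular tolerance and iteration values chosen here are arbitrary.

```python
# nelder-mead with explicit stopping criteria via the options dict
from scipy.optimize import minimize

# simple convex objective
def objective(x):
	return x[0]**2.0 + x[1]**2.0

# xatol/fatol are absolute tolerances on the simplex coordinates and
# function values; maxiter caps the iterations (values are arbitrary)
result = minimize(objective, [2.0, 3.0], method='nelder-mead',
		options={'xatol': 1e-6, 'fatol': 1e-6, 'maxiter': 500})
print(result['x'])
```

Tightening "xatol" and "fatol" trades extra function evaluations for a more precise final point.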
It takes the objective function to be minimized and an initial point for the search.<\/p>\n<pre class=\"urvanov-syntax-highlighter-plain-tag\">...\r\n# perform the search\r\nresult = minimize(objective, pt, method='nelder-mead')<\/pre>\n<p>The result is an <a href=\"https:\/\/docs.scipy.org\/doc\/scipy\/reference\/generated\/scipy.optimize.OptimizeResult.html\">OptimizeResult<\/a> object that contains information about the result of the optimization accessible via keys.<\/p>\n<p>For example, the \u201c<em>success<\/em>\u201d boolean indicates whether the search was completed successfully or not, the \u201c<em>message<\/em>\u201d provides a human-readable message about the success or failure of the search, and the \u201c<em>nfev<\/em>\u201d key indicates the number of function evaluations that were performed.<\/p>\n<p>Importantly, the \u201c<em>x<\/em>\u201d key specifies the input values that indicate the optima found by the search, if successful.<\/p>\n<pre class=\"urvanov-syntax-highlighter-plain-tag\">...\r\n# summarize the result\r\nprint('Status : %s' % result['message'])\r\nprint('Total Evaluations: %d' % result['nfev'])\r\nprint('Solution: %s' % result['x'])<\/pre>\n<p>We can demonstrate the Nelder-Mead optimization algorithm on a well-behaved function to show that it can quickly and efficiently find the optima without using any derivative information from the function.<\/p>\n<p>In this case, we will use the x^2 function in two-dimensions, defined in the range -5.0 to 5.0 with the known optima at [0.0, 0.0].<\/p>\n<p>We can define the <em>objective()<\/em> function below.<\/p>\n<pre class=\"urvanov-syntax-highlighter-plain-tag\"># objective function\r\ndef objective(x):\r\n\treturn x[0]**2.0 + x[1]**2.0<\/pre>\n<p>We will use a random point in the defined domain as a starting point for the search.<\/p>\n<pre class=\"urvanov-syntax-highlighter-plain-tag\">...\r\n# define range for input\r\nr_min, r_max = -5.0, 5.0\r\n# define the starting point as a random 
sample from the domain\r\npt = r_min + rand(2) * (r_max - r_min)<\/pre>\n<p>The search can then be performed. We use the default maximum number of function evaluations, set via the \u201c<em>maxiter<\/em>\u201d argument, which defaults to N*200, where N is the number of input variables (two in this case), i.e. 400 evaluations.<\/p>\n<pre class=\"urvanov-syntax-highlighter-plain-tag\">...\r\n# perform the search\r\nresult = minimize(objective, pt, method='nelder-mead')<\/pre>\n<p>After the search is finished, we will report the total function evaluations used to find the optima and the success message of the search, which we expect to be positive in this case.<\/p>\n<pre class=\"urvanov-syntax-highlighter-plain-tag\">...\r\n# summarize the result\r\nprint('Status : %s' % result['message'])\r\nprint('Total Evaluations: %d' % result['nfev'])<\/pre>\n<p>Finally, we will retrieve the input values for the located optima, evaluate the point using the objective function, and report both in a human-readable manner.<\/p>\n<pre class=\"urvanov-syntax-highlighter-plain-tag\">...\r\n# evaluate solution\r\nsolution = result['x']\r\nevaluation = objective(solution)\r\nprint('Solution: f(%s) = %.5f' % (solution, evaluation))<\/pre>\n<p>Tying this together, the complete example of using the Nelder-Mead optimization algorithm on a simple convex objective function is listed below.<\/p>\n<pre class=\"urvanov-syntax-highlighter-plain-tag\"># nelder-mead optimization of a convex function\r\nfrom scipy.optimize import minimize\r\nfrom numpy.random import rand\r\n\r\n# objective function\r\ndef objective(x):\r\n\treturn x[0]**2.0 + x[1]**2.0\r\n\r\n# define range for input\r\nr_min, r_max = -5.0, 5.0\r\n# define the starting point as a random sample from the domain\r\npt = r_min + rand(2) * (r_max - r_min)\r\n# perform the search\r\nresult = minimize(objective, pt, method='nelder-mead')\r\n# summarize the result\r\nprint('Status : %s' % result['message'])\r\nprint('Total Evaluations: %d' % 
result['nfev'])\r\n# evaluate solution\r\nsolution = result['x']\r\nevaluation = objective(solution)\r\nprint('Solution: f(%s) = %.5f' % (solution, evaluation))<\/pre>\n<p>Running the example executes the optimization, then reports the results.<\/p>\n<p><strong>Note<\/strong>: Your <a href=\"https:\/\/machinelearningmastery.com\/different-results-each-time-in-machine-learning\/\">results may vary<\/a> given the stochastic nature of the algorithm or evaluation procedure, or differences in numerical precision. Consider running the example a few times and compare the average outcome.<\/p>\n<p>In this case, we can see that the search was successful, as we expected, and was completed after 88 function evaluations.<\/p>\n<p>We can see that the optima was located with inputs very close to [0,0], which evaluates to the minimum objective value of 0.0.<\/p>\n<pre class=\"urvanov-syntax-highlighter-plain-tag\">Status: Optimization terminated successfully.\r\nTotal Evaluations: 88\r\nSolution: f([ 2.25680716e-05 -3.87021351e-05]) = 0.00000<\/pre>\n<p>Now that we have seen how to use the Nelder-Mead optimization algorithm successfully, let\u2019s look at some examples where it does not perform so well.<\/p>\n<h2>Nelder-Mead on Challenging Functions<\/h2>\n<p>The Nelder-Mead optimization algorithm works well for a range of challenging nonlinear and non-differentiable objective functions.<\/p>\n<p>Nevertheless, it can get stuck on multimodal optimization problems and noisy problems.<\/p>\n<p>To make this concrete, let\u2019s look at an example of each.<\/p>\n<h3>Noisy Optimization Problem<\/h3>\n<p>A noisy objective function is a function that gives different answers each time the same input is evaluated.<\/p>\n<p>We can make an objective function artificially noisy by adding small Gaussian random numbers to the inputs prior to their evaluation.<\/p>\n<p>For example, we can define a one-dimensional version of the x^2 function and use the <a 
href=\"https:\/\/numpy.org\/doc\/stable\/reference\/random\/generated\/numpy.random.randn.html\">randn() function<\/a> to add small Gaussian random numbers to the input with a mean of 0.0 and a standard deviation of 0.3.<\/p>\n<pre class=\"urvanov-syntax-highlighter-plain-tag\"># objective function\r\ndef objective(x):\r\n\treturn (x + randn(len(x))*0.3)**2.0<\/pre>\n<p>The noise will make the function challenging for the algorithm to optimize, and it will very likely not locate the optima at x=0.0.<\/p>\n<p>The complete example of using Nelder-Mead to optimize the noisy objective function is listed below.<\/p>\n<pre class=\"urvanov-syntax-highlighter-plain-tag\"># nelder-mead optimization of noisy one-dimensional convex function\r\nfrom scipy.optimize import minimize\r\nfrom numpy.random import rand\r\nfrom numpy.random import randn\r\n\r\n# objective function\r\ndef objective(x):\r\n\treturn (x + randn(len(x))*0.3)**2.0\r\n\r\n# define range for input\r\nr_min, r_max = -5.0, 5.0\r\n# define the starting point as a random sample from the domain\r\npt = r_min + rand(1) * (r_max - r_min)\r\n# perform the search\r\nresult = minimize(objective, pt, method='nelder-mead')\r\n# summarize the result\r\nprint('Status : %s' % result['message'])\r\nprint('Total Evaluations: %d' % result['nfev'])\r\n# evaluate solution\r\nsolution = result['x']\r\nevaluation = objective(solution)\r\nprint('Solution: f(%s) = %.5f' % (solution, evaluation))<\/pre>\n<p>Running the example executes the optimization, then reports the results.<\/p>\n<p><strong>Note<\/strong>: Your <a href=\"https:\/\/machinelearningmastery.com\/different-results-each-time-in-machine-learning\/\">results may vary<\/a> given the stochastic nature of the algorithm or evaluation procedure, or differences in numerical precision. 
Consider running the example a few times and compare the average outcome.<\/p>\n<p>In this case, the algorithm does not converge and instead uses the maximum number of function evaluations, which is 200.<\/p>\n<pre class=\"urvanov-syntax-highlighter-plain-tag\">Status: Maximum number of function evaluations has been exceeded.\r\nTotal Evaluations: 200\r\nSolution: f([-0.6918238]) = 0.79431<\/pre>\n<p>The algorithm may converge on some runs of the code but will arrive at a point away from the optima.<\/p>\n<h3>Multimodal Optimization Problem<\/h3>\n<p>Many nonlinear objective functions may have multiple optima, referred to as multimodal problems.<\/p>\n<p>The problem may be structured such that it has multiple global optima that have an equivalent function evaluation, or a single global optima and multiple local optima, where algorithms like Nelder-Mead can get stuck in a local optima while searching for the global optima.<\/p>\n<p>The <a href=\"https:\/\/en.wikipedia.org\/wiki\/Ackley_function\">Ackley function<\/a> is an example of the latter. 
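Before implementing it, note that the multiple-restart strategy mentioned earlier is a common practical mitigation for such problems. The sketch below (the seed and restart count are arbitrary choices, and the Ackley definition matches the one used in the examples that follow) runs the local search from several random starting points and keeps the best result.

```python
# nelder-mead with multiple random restarts, keeping the best result
# (a mitigation sketch; the seed and restart count are arbitrary choices)
from scipy.optimize import minimize
from numpy.random import rand, seed
from numpy import exp, sqrt, cos, e, pi

# the ackley multimodal objective function
def objective(v):
	x, y = v
	return -20.0 * exp(-0.2 * sqrt(0.5 * (x**2 + y**2))) - exp(0.5 * (cos(2 * pi * x) + cos(2 * pi * y))) + e + 20

seed(1)
r_min, r_max = -5.0, 5.0
best = None
# restart the local search from several random points, keep the best
for i in range(30):
	pt = r_min + rand(2) * (r_max - r_min)
	result = minimize(objective, pt, method='nelder-mead')
	if best is None or result['fun'] < best['fun']:
		best = result
print('Best: f(%s) = %.5f' % (best['x'], best['fun']))
```

Each restart is independent, so the best of many runs is far more likely to land in the basin of the global optima than any single run. With that mitigation in mind, we return to the Ackley function itself.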
It is a two-dimensional objective function that has a global optima at [0,0] but has many local optima.<\/p>\n<p>The example below implements the Ackley function and creates a three-dimensional plot showing the global optima and multiple local optima.<\/p>\n<pre class=\"urvanov-syntax-highlighter-plain-tag\"># ackley multimodal function\r\nfrom numpy import arange\r\nfrom numpy import exp\r\nfrom numpy import sqrt\r\nfrom numpy import cos\r\nfrom numpy import e\r\nfrom numpy import pi\r\nfrom numpy import meshgrid\r\nfrom matplotlib import pyplot\r\nfrom mpl_toolkits.mplot3d import Axes3D\r\n\r\n# objective function\r\ndef objective(x, y):\r\n\treturn -20.0 * exp(-0.2 * sqrt(0.5 * (x**2 + y**2))) - exp(0.5 * (cos(2 * pi * x) + cos(2 * pi * y))) + e + 20\r\n\r\n# define range for input\r\nr_min, r_max = -5.0, 5.0\r\n# sample input range uniformly at 0.1 increments\r\nxaxis = arange(r_min, r_max, 0.1)\r\nyaxis = arange(r_min, r_max, 0.1)\r\n# create a mesh from the axis\r\nx, y = meshgrid(xaxis, yaxis)\r\n# compute targets\r\nresults = objective(x, y)\r\n# create a surface plot with the jet color scheme\r\nfigure = pyplot.figure()\r\naxis = figure.add_subplot(projection='3d')\r\naxis.plot_surface(x, y, results, cmap='jet')\r\n# show the plot\r\npyplot.show()<\/pre>\n<p>Running the example creates the surface plot of the Ackley function showing the vast number of local optima.<\/p>\n<div id=\"attachment_11693\" style=\"width: 1034px\" class=\"wp-caption aligncenter\">\n<img decoding=\"async\" aria-describedby=\"caption-attachment-11693\" loading=\"lazy\" class=\"size-large wp-image-11693\" src=\"https:\/\/machinelearningmastery.com\/wp-content\/uploads\/2020\/09\/3D-Surface-Plot-of-the-Ackley-Multimodal-Function-1024x768.png\" alt=\"3D Surface Plot of the Ackley Multimodal Function\" width=\"1024\" height=\"768\" srcset=\"http:\/\/3qeqpr26caki16dnhd19sv6by6v.wpengine.netdna-cdn.com\/wp-content\/uploads\/2020\/09\/3D-Surface-Plot-of-the-Ackley-Multimodal-Function-1024x768.png 1024w, 
http:\/\/3qeqpr26caki16dnhd19sv6by6v.wpengine.netdna-cdn.com\/wp-content\/uploads\/2020\/09\/3D-Surface-Plot-of-the-Ackley-Multimodal-Function-300x225.png 300w, http:\/\/3qeqpr26caki16dnhd19sv6by6v.wpengine.netdna-cdn.com\/wp-content\/uploads\/2020\/09\/3D-Surface-Plot-of-the-Ackley-Multimodal-Function-768x576.png 768w, http:\/\/3qeqpr26caki16dnhd19sv6by6v.wpengine.netdna-cdn.com\/wp-content\/uploads\/2020\/09\/3D-Surface-Plot-of-the-Ackley-Multimodal-Function.png 1280w\" sizes=\"(max-width: 1024px) 100vw, 1024px\"><\/p>\n<p id=\"caption-attachment-11693\" class=\"wp-caption-text\">3D Surface Plot of the Ackley Multimodal Function<\/p>\n<\/div>\n<p>We would expect the Nelder-Mead function to get stuck in one of the local optima while in search of the global optima.<\/p>\n<p>Initially, when the simplex is large, the algorithm may jump over many local optima, but as it contracts, it will get stuck.<\/p>\n<p>We can explore this with the example below that demonstrates the Nelder-Mead algorithm on the Ackley function.<\/p>\n<pre class=\"urvanov-syntax-highlighter-plain-tag\"># nelder-mead for multimodal function optimization\r\nfrom scipy.optimize import minimize\r\nfrom numpy.random import rand\r\nfrom numpy import exp\r\nfrom numpy import sqrt\r\nfrom numpy import cos\r\nfrom numpy import e\r\nfrom numpy import pi\r\n\r\n# objective function\r\ndef objective(v):\r\n\tx, y = v\r\n\treturn -20.0 * exp(-0.2 * sqrt(0.5 * (x**2 + y**2))) - exp(0.5 * (cos(2 * pi * x) + cos(2 * pi * y))) + e + 20\r\n\r\n# define range for input\r\nr_min, r_max = -5.0, 5.0\r\n# define the starting point as a random sample from the domain\r\npt = r_min + rand(2) * (r_max - r_min)\r\n# perform the search\r\nresult = minimize(objective, pt, method='nelder-mead')\r\n# summarize the result\r\nprint('Status : %s' % result['message'])\r\nprint('Total Evaluations: %d' % result['nfev'])\r\n# evaluate solution\r\nsolution = result['x']\r\nevaluation = objective(solution)\r\nprint('Solution: f(%s) = 
%.5f' % (solution, evaluation))<\/pre>\n<p>Running the example executes the optimization, then reports the results.<\/p>\n<p><strong>Note<\/strong>: Your <a href=\"https:\/\/machinelearningmastery.com\/different-results-each-time-in-machine-learning\/\">results may vary<\/a> given the stochastic nature of the algorithm or evaluation procedure, or differences in numerical precision. Consider running the example a few times and compare the average outcome.<\/p>\n<p>In this case, we can see that the search completed successfully but did not locate the global optima. It got stuck and found a local optima.<\/p>\n<p>Each time we run the example, we will find a different local optima given the different random starting point for the search.<\/p>\n<pre class=\"urvanov-syntax-highlighter-plain-tag\">Status: Optimization terminated successfully.\r\nTotal Evaluations: 62\r\nSolution: f([-4.9831427 -3.98656015]) = 11.90126<\/pre>\n<h2>Further Reading<\/h2>\n<p>This section provides more resources on the topic if you are looking to go deeper.<\/p>\n<h3>Papers<\/h3>\n<ul>\n<li>\n<a href=\"https:\/\/academic.oup.com\/comjnl\/article-abstract\/7\/4\/308\/354237\">A Simplex Method For Function Minimization<\/a>, 1965.<\/li>\n<\/ul>\n<h3>Books<\/h3>\n<ul>\n<li>\n<a href=\"https:\/\/amzn.to\/31J3I8l\">Algorithms for Optimization<\/a>, 2019.<\/li>\n<li>\n<a href=\"https:\/\/amzn.to\/3lCRqX9\">Numerical Optimization<\/a>, 2006.<\/li>\n<\/ul>\n<h3>APIs<\/h3>\n<ul>\n<li><a href=\"https:\/\/docs.scipy.org\/doc\/scipy\/reference\/tutorial\/optimize.html#nelder-mead-simplex-algorithm-method-nelder-mead\">Nelder-Mead Simplex algorithm (method=\u2019Nelder-Mead\u2019)<\/a><\/li>\n<li>\n<a href=\"https:\/\/docs.scipy.org\/doc\/scipy\/reference\/optimize.minimize-neldermead.html\">scipy.optimize.minimize API<\/a>.<\/li>\n<li>\n<a href=\"https:\/\/docs.scipy.org\/doc\/scipy\/reference\/generated\/scipy.optimize.OptimizeResult.html\">scipy.optimize.OptimizeResult API<\/a>.<\/li>\n<li>\n<a 
href=\"https:\/\/numpy.org\/doc\/stable\/reference\/random\/generated\/numpy.random.randn.html\">numpy.random.randn API<\/a>.<\/li>\n<\/ul>\n<h3>Articles<\/h3>\n<ul>\n<li>\n<a href=\"https:\/\/en.wikipedia.org\/wiki\/Nelder%E2%80%93Mead_method\">Nelder\u2013Mead method, Wikipedia<\/a>.<\/li>\n<li>\n<a href=\"http:\/\/www.scholarpedia.org\/article\/Nelder-Mead_algorithm\">Nelder-Mead algorithm, Scholarpedia<\/a>.<\/li>\n<\/ul>\n<h2>Summary<\/h2>\n<p>In this tutorial, you discovered the Nelder-Mead optimization algorithm.<\/p>\n<p>Specifically, you learned:<\/p>\n<ul>\n<li>The Nelder-Mead optimization algorithm is a type of pattern search that does not use function gradients.<\/li>\n<li>How to apply the Nelder-Mead algorithm for function optimization in Python.<\/li>\n<li>How to interpret the results of the Nelder-Mead algorithm on noisy and multimodal objective functions.<\/li>\n<\/ul>\n<p><strong>Do you have any questions?<\/strong><br \/>\nAsk your questions in the comments below and I will do my best to answer.<\/p>\n<p>The post <a rel=\"nofollow\" href=\"https:\/\/machinelearningmastery.com\/how-to-use-nelder-mead-optimization-in-python\/\">How to Use Nelder-Mead Optimization in Python<\/a> appeared first on <a rel=\"nofollow\" href=\"https:\/\/machinelearningmastery.com\/\">Machine Learning Mastery<\/a>.<\/p>\n<\/div>\n<p><a href=\"https:\/\/machinelearningmastery.com\/how-to-use-nelder-mead-optimization-in-python\/\">Go to Source<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Author: Jason Brownlee The Nelder-Mead optimization algorithm is a widely used approach for non-differentiable objective functions. 
As such, it is generally referred to as a [&hellip;] <span class=\"read-more-link\"><a class=\"read-more\" href=\"https:\/\/www.aiproblog.com\/index.php\/2021\/01\/24\/how-to-use-nelder-mead-optimization-in-python\/\">Read More<\/a><\/span><\/p>\n","protected":false},"author":1,"featured_media":4331,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_bbp_topic_count":0,"_bbp_reply_count":0,"_bbp_total_topic_count":0,"_bbp_total_reply_count":0,"_bbp_voice_count":0,"_bbp_anonymous_reply_count":0,"_bbp_topic_count_hidden":0,"_bbp_reply_count_hidden":0,"_bbp_forum_subforum_count":0,"footnotes":""},"categories":[24],"tags":[],"_links":{"self":[{"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/posts\/4330"}],"collection":[{"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/comments?post=4330"}],"version-history":[{"count":0,"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/posts\/4330\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/media\/4331"}],"wp:attachment":[{"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/media?parent=4330"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/categories?post=4330"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/tags?post=4330"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}