Author: Pinter, Janos D.
Date accessioned: 2016-08-18
Date available: 2016-08-18
Date of issue: 2012-01
ISSN: 0957-4174
URI: http://hdl.handle.net/10679/4441
DOI: https://doi.org/10.1016/j.eswa.2011.06.050
Rights note: Due to copyright restrictions, access to the full text of this article is available only via subscription.
Abstract: Artificial neural networks (ANNs) are used extensively to model unknown or unspecified functional relationships between the input and output of a "black box" system. In order to apply the generic ANN concept to actual system model fitting problems, a key requirement is the training of the chosen (postulated) ANN structure. Such training serves to select the ANN parameters that minimize the discrepancy between the modeled system output and the training set of observations. We consider the parameterization of ANNs as a potentially multi-modal optimization problem, and then introduce a corresponding global optimization (GO) framework. The practical viability of the GO-based ANN training approach is illustrated by finding close numerical approximations of one-dimensional, yet visibly challenging functions. For this purpose, we have implemented a flexible ANN framework and an easily expandable set of test functions in the technical computing system Mathematica. The MathOptimizer Professional global-local optimization software has been used to solve the induced (multi-dimensional) ANN calibration problems.
Language: eng
Access: restrictedAccess
Title: Calibrating artificial neural networks by global optimization
Type: article
Volume: 39
Issue: 1
Pages: 25-32
Web of Science ID: 000296214900004
Keywords: Artificial neural networks; Calibration of ANNs by global optimization; ANN implementation in Mathematica; Lipschitz Global Optimizer (LGO) solver suite; MathOptimizer Professional (LGO linked to Mathematica); Numerical examples
Scopus ID: 2-s2.0-81855207560
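The calibration problem described in the abstract — choosing ANN parameters to minimize the discrepancy between modeled output and training observations, treated as a multi-modal global optimization problem — can be sketched as follows. This is a minimal illustration only: it uses SciPy's `differential_evolution` as a generic global optimizer in place of the paper's Mathematica/MathOptimizer Professional (LGO) toolchain, and the one-hidden-layer tanh network and 1D test function are assumptions for the example, not the paper's actual test set.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Illustrative 1D training data from a multi-extremal target function
# (an assumed example, not one of the paper's test functions).
x = np.linspace(-2.0, 2.0, 60)
y = np.sin(4.0 * x) + 0.5 * x**2

H = 5  # hidden units; the parameter vector packs (w1, b1, w2, b2)

def ann(params, x):
    """One-hidden-layer tanh network: sum_j w2[j] * tanh(w1[j]*x + b1[j]) + b2."""
    w1 = params[:H]
    b1 = params[H:2 * H]
    w2 = params[2 * H:3 * H]
    b2 = params[3 * H]
    return np.tanh(np.outer(x, w1) + b1) @ w2 + b2

def sse(params):
    """Calibration objective: sum-of-squares discrepancy over the training set."""
    r = ann(params, x) - y
    return float(r @ r)

# Box bounds on all 3*H + 1 parameters induce the multi-dimensional,
# potentially multi-modal GO problem; a population-based global search
# with local polishing stands in for LGO's global-local strategy.
bounds = [(-8.0, 8.0)] * (3 * H + 1)
result = differential_evolution(sse, bounds, seed=0, maxiter=300, tol=1e-8, polish=True)
```

After the run, `result.x` holds the calibrated parameter vector and `result.fun` the residual training error; in practice the quality of the fit depends on the chosen network size `H` and the parameter bounds.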