
electronic scientific and technical journal

ИНЖЕНЕРНЫЙ ВЕСТНИК

Publisher: All-Russian public organization "A.M. Prokhorov Academy of Engineering Sciences".

Meta-optimization via parameter tuning. A review

Инженерный вестник, no. 11, November 2015
UDC: 519.6
Article file: Agasiev_T.pdf (963.31 KB)
Authors: T. A. Agasiev; Prof. A. P. Karpenko, Dr. Sci. (Phys.-Math.)

This paper surveys methods for tuning the parameters of optimization algorithms. The surveyed methods are classified by the goal of tuning and by their operating principle. Various approaches to algorithm tuning are described: one-time and continuous parameter tuning, as well as sampling, screening, meta-stochastic, and multi-criteria methods. The survey examines the use of mathematical models of the tuned algorithm's performance metrics, tuning that accounts for the features of the optimization problems being solved and for different performance metrics of the tuned algorithm, and tuning methods that take the configuration of the computing system into account.
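To make the core idea concrete, below is a minimal sketch (not taken from the article itself) of one-time parameter tuning: a meta-level random search selects the parameters F and CR of a simplified differential evolution optimizer by its average performance on a benchmark function, with repeated runs to hedge against stochastic noise. All function and parameter names (tune_parameters, sphere, and so on) are illustrative assumptions, not the article's notation.

```python
import random
import statistics


def sphere(x):
    """Benchmark objective: the sphere function, minimum 0 at the origin."""
    return sum(xi * xi for xi in x)


def differential_evolution(f, dim, pop_size, F, CR, generations, bounds=(-5.0, 5.0)):
    """A simplified DE/rand/1/bin optimizer; F and CR are the parameters being tuned."""
    lo, hi = bounds
    pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    fitness = [f(ind) for ind in pop]
    for _ in range(generations):
        for i in range(pop_size):
            # Three distinct donors, none equal to the target index i.
            a, b, c = random.sample([j for j in range(pop_size) if j != i], 3)
            j_rand = random.randrange(dim)  # guarantees at least one mutated gene
            trial = [
                pop[a][k] + F * (pop[b][k] - pop[c][k])
                if (random.random() < CR or k == j_rand) else pop[i][k]
                for k in range(dim)
            ]
            trial = [min(max(t, lo), hi) for t in trial]  # clamp to the search box
            ft = f(trial)
            if ft <= fitness[i]:  # greedy one-to-one selection
                pop[i], fitness[i] = trial, ft
    return min(fitness)


def tune_parameters(n_trials=30, n_repeats=5):
    """Meta-level random search over (F, CR); the performance metric is the
    mean best objective value over several independent runs of the base algorithm."""
    best_params, best_score = None, float("inf")
    for _ in range(n_trials):
        F = random.uniform(0.1, 1.0)   # candidate differential weight
        CR = random.uniform(0.0, 1.0)  # candidate crossover rate
        runs = [differential_evolution(sphere, dim=5, pop_size=20,
                                       F=F, CR=CR, generations=50)
                for _ in range(n_repeats)]
        score = statistics.mean(runs)  # averaging hedges against run-to-run noise
        if score < best_score:
            best_params, best_score = (F, CR), score
    return best_params, best_score


if __name__ == "__main__":
    (F, CR), score = tune_parameters()
    print(f"tuned F={F:.2f}, CR={CR:.2f}, mean best objective={score:.3g}")
```

The meta-level search here is plain random sampling; the methods covered in the survey replace it with more economical strategies, such as racing, screening designs, or model-based sequential search, and with performance metrics richer than a single mean objective value.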


