[1] HILLIER F S, LIEBERMAN G J. Introduction to Operations Research[M]. 9th ed. New York: McGraw-Hill, 2014: 12-17.
[2] 汪定伟, 王俊伟, 王洪峰, 等. 智能优化方法[M]. 北京: 高等教育出版社, 2007: 1-9. (WANG D W, WANG J W, WANG H F, et al. Intelligent Optimization Methods[M]. Beijing: Higher Education Press, 2007: 1-9.)
[3] HOLLAND J H. Adaptation in Natural and Artificial Systems[M]. Ann Arbor: University of Michigan Press, 1975: 22-30.
[4] KIRKPATRICK S, GELATT C D, VECCHI M P. Optimization by simulated annealing[J]. Science, 1983, 220(4598): 671-680.
[5] COLORNI A, DORIGO M, MANIEZZO V. Distributed optimization by ant colonies[EB/OL]. [2016-03-01]. http://faculty.washington.edu/paymana/swarm/colorni92-ecal.pdf.
[6] KENNEDY J, EBERHART R. Particle swarm optimization[C]//Proceedings of the 1995 IEEE International Conference on Neural Networks. Piscataway, NJ: IEEE, 1995: 1942-1948.
[7] PASSINO K M. Biomimicry of bacterial foraging for distributed optimization and control[J]. IEEE Control Systems Magazine, 2002, 22(3): 52-67.
[8] 李晓磊, 邵之江, 钱积新. 一种基于动物自治体的寻优模式: 鱼群算法[J]. 系统工程理论与实践, 2002, 22(11): 32-38. (LI X L, SHAO Z J, QIAN J X. An optimizing method based on autonomous animats: fish-swarm algorithm[J]. Systems Engineering-Theory & Practice, 2002, 22(11): 32-38.)
[9] EUSUFF M, LANSEY K. Optimization of water distribution network design using the shuffled frog leaping algorithm[J]. Journal of Water Resources Planning and Management, 2003, 129(3): 210-215.
[10] KARABOGA D. An idea based on honey bee swarm for numerical optimization[R]. Kayseri, Turkey: Erciyes University, 2005.
[11] YANG X S. Nature-Inspired Metaheuristic Algorithms[M]. London: Luniver Press, 2008: 38-50.
[12] YANG X S, DEB S. Cuckoo search via Lévy flights[C]//NaBIC 2009: Proceedings of the 2009 World Congress on Nature & Biologically Inspired Computing. Piscataway, NJ: IEEE, 2009: 210-214.
[13] YANG X S, GANDOMI A H. Bat algorithm: a novel approach for global engineering optimization[J]. Engineering Computations, 2012, 29(5): 464-483.
[14] GANDOMI A H, ALAVI A H. Krill herd: a new bio-inspired optimization algorithm[J]. Communications in Nonlinear Science and Numerical Simulation, 2012, 17(12): 4831-4845.
[15] TANG R, FONG S, YANG X S, et al. Wolf search algorithm with ephemeral memory[C]//Proceedings of the Seventh International Conference on Digital Information Management. Piscataway, NJ: IEEE, 2012: 165-172.
[16] YAZDANI M, JOLAI F. Lion Optimization Algorithm (LOA): a nature-inspired metaheuristic algorithm[J]. Journal of Computational Design and Engineering, 2016, 3(1): 24-36.
[17] WOLPERT D H, MACREADY W G. No free lunch theorems for optimization[J]. IEEE Transactions on Evolutionary Computation, 1997, 1(1): 67-82.
[18] 曾子林, 张宏军, 张睿. 基于元学习思想的算法选择问题综述[J]. 控制与决策, 2014, 29(6): 961-968. (ZENG Z L, ZHANG H J, ZHANG R. Summary of algorithm selection based on meta-learning[J]. Control and Decision, 2014, 29(6): 961-968.)
[19] CRUZ-REYES L, GÓMEZ-SANTILLÁN C, PÉREZ-ORTEGA J. Algorithm selection: from meta-learning to hyper-heuristics[EB/OL]. [2016-03-01]. http://cdn.intechweb.org/pdfs/30653.pdf.
[20] RICE J R. The algorithm selection problem[J]. Advances in Computers, 1976, 15: 65-118.
[21] COELHO J, VANHOUCKE M. Multi-mode resource-constrained project scheduling using RCPSP and SAT solvers[J]. European Journal of Operational Research, 2011, 213(1): 73-82.
[22] WATANABE S. Knowing and Guessing: A Quantitative Study of Inference and Information[M]. New York: Wiley, 1969: 376-377.
[23] SONG Q B, WANG G T, WANG C. Automatic recommendation of classification algorithms based on data set characteristics[J]. Pattern Recognition, 2012, 45(7): 2672-2689.
[24] CUI C, HU M Q, WEIR J D, et al. A recommendation system for meta-modeling: a meta-learning based approach[J]. Expert Systems with Applications, 2016, 46: 33-44.
[25] BRODLEY C E. Recursive automatic bias selection for classifier construction[J]. Machine Learning, 1995, 20(1): 63-94.
[26] DE SOUZA B F, CARVALHO A, SOARES C. Metalearning for gene expression data classification[C]//HIS 2008: Proceedings of the 2008 Eighth International Conference on Hybrid Intelligent Systems. Piscataway, NJ: IEEE, 2008: 441-446.
[27] LAN Z, GU J, ZHENG Z. A study of dynamic meta-learning for failure prediction in large-scale systems[J]. Journal of Parallel and Distributed Computing, 2010, 70(6): 630-643.
[28] ZHOU S, LAI K K, YEN J. A dynamic meta-learning rate-based model for gold market forecasting[J]. Expert Systems with Applications, 2012, 39(6): 6168-6173.
[29] GIRAUD-CARRIER C. Metalearning - a tutorial[EB/OL]. [2016-03-10]. https://pdfs.semanticscholar.org/5794/1a4891f673cadf06fba02419372aad85c3bb.pdf.
[30] ROSSI A L D, CARVALHO A, SOARES C, et al. MetaStream: a meta-learning based method for periodic algorithm selection in time-changing data[J]. Neurocomputing, 2014, 127: 52-64.
[31] KING R D, FENG C, SUTHERLAND A. STATLOG: comparison of classification algorithms on large real-world problems[J]. Applied Artificial Intelligence, 1995, 9(3): 289-333.
[32] PFAHRINGER B, BENSUSAN H, GIRAUD-CARRIER C. Meta-learning by landmarking various learning algorithms[C]//ICML 2000: Proceedings of the Seventeenth International Conference on Machine Learning. San Francisco: Morgan Kaufmann Publishers, 2000: 743-750.
[33] RENDELL L, CHO H. Empirical learning as a function of concept character[J]. Machine Learning, 1990, 5(3): 267-298.
[34] AHA D. Generalizing from case studies: a case study[C]//ICML 1992: Proceedings of the 9th International Conference on Machine Learning. San Francisco: Morgan Kaufmann Publishers, 1992: 1-10.
[35] SMITH-MILES K A. Cross-disciplinary perspectives on meta-learning for algorithm selection[J]. ACM Computing Surveys, 2008, 41(1): 1-25.
[36] MISIR M. Intelligent hyper-heuristics: a tool for solving generic optimization problems[D]. Leuven, Belgium: Katholieke Universiteit Leuven, 2012.
[37] MESSELIS T, DE CAUSMAECKER P. An automatic algorithm selection approach for the multi-mode resource-constrained project scheduling problem[J]. European Journal of Operational Research, 2014, 233(3): 511-528.
[38] VILALTA R, DRISSI Y. A perspective view and survey of meta-learning[J]. Artificial Intelligence Review, 2002, 18(2): 77-95.
[39] MIRANDA P B C, PRUDENCIO R B C, CARVALHO A, et al. A hybrid meta-learning architecture for multi-objective optimization of SVM parameters[J]. Neurocomputing, 2014, 143: 27-43.
[40] SMITH-MILES K, LOPES L. Measuring instance difficulty for combinatorial optimization problems[J]. Computers & Operations Research, 2012, 39(5): 875-889.
[41] LEYVA E, CAISES Y, GONZÁLEZ A, et al. On the use of meta-learning for instance selection: an architecture and an experimental study[J]. Information Sciences, 2014, 266: 16-30.
[42] BHATT N, THAKKAR A, GANATRA A. A survey & current research challenges in meta-learning approaches based on dataset characteristics[J]. International Journal of Soft Computing and Engineering, 2012, 2(1): 239-247.
[43] LEMKE C, BUDKA M, GABRYS B. Meta-learning: a survey of trends and technologies[J]. Artificial Intelligence Review, 2015, 44(1): 117-130.
[44] KOLISCH R, SPRECHER A. PSPLIB - a project scheduling problem library[J]. European Journal of Operational Research, 1997, 96(1): 205-216.
[45] 陈龙, 韩兆兰, 崔建双. 求解多模式资源约束的项目调度问题的离散粒子群算法[J]. 计算机应用, 2015, 35(增刊2): 101-105. (CHEN L, HAN Z L, CUI J S. Discrete particle swarm optimization for solving multi-mode resource-constrained project scheduling problem[J]. Journal of Computer Applications, 2015, 35(Suppl 2): 101-105.)