Development and Investigation of Forecasting Methods Based on SVM Models
Dissertation
Approbation of the work. The main results of the dissertation were discussed at the All-Russian scientific and practical conferences «Информационные технологии в профессиональной деятельности и научной работе» (Yoshkar-Ola, 2007 and 2009), the international scientific and technical conferences «Интеллектуальные системы» (AIS'07) and «Интеллектуальные САПР» (CAD-2007) (Divnomorsk, 2007), and the conference «Технологии Microsoft в теории…»
References
- Айвазян, С. А. Прикладная статистика и основы эконометрики: Учебник для вузов. — М.: Юнити, 1998. — 1022 с.
- Афанасьев, В. Н. Анализ временных рядов и прогнозирование: Учебник / В. Н. Афанасьев, М. М. Юзбашев. — М.: Финансы и статистика, 2001. — 228 с.
- Бокс, Дж. Анализ временных рядов. Прогноз и управление / Дж. Бокс, Г. Дженкинс. М.: Мир. — 1974. — 604 с.
- Бурдо, А. И. К вопросу систематизации методов и алгоритмов прогнозирования // Материалы межрегиональной конференции «Студенческая наука экономике научно-технического прогресса». -Ставрополь: СевКав ГТУ, 2001. — С. 33−34.
- Буч, Г. Объектно-ориентированный анализ и проектирование с примерами приложений на C++ / Г. Буч. — М.: Бином, 1999. — 720 с.
- Вапник, В. Н. Теория распознавания образов / В. Н. Вапник, А. Я. Червоненкис. -М.: Наука, 1974. 416 с.
- Вапник, В.Н. Восстановление зависимостей по эмпирическим данным / В. Н. Вапник. М.: Наука, 1979. — 448 с.
- Дуброва, Т. А. Статистические методы прогнозирования в экономике / Т. А. Дуброва. — М.: Московский международный институт эконометрики, информатики, финансов и права, 2003. — 50 с.
- Дюк, В. Data Mining: учебный курс / В. Дюк, А. Самойленко. — СПб.: Питер, 2001. — 367 с.
- Льюнг, Л. Идентификация систем. Теория для пользователя / Л. Льюнг. — М.: Наука, 1991. — 320 с.
- Кремер, Н. Ш. Эконометрика / Н. Ш. Кремер, Б. А. Путко. М.: ЮНИТИ, 2002.-311 с.
- Орлова, И. В. Экономико-математические методы и модели. Выполнение расчетов в среде EXCEL: Практикум: Учебное пособие для вузов. — М.: ЗАО «Финстатинформ», 2000. — 136 с.
- Отнес, Р. Прикладной анализ временных рядов / Р. Отнес, Л. Эноксон. — М.: Мир, 1982. — 429 с.
- Петерс, Э. Хаос и порядок на рынках капитала. Новый аналитический взгляд на циклы, цены и изменчивость рынка / Э. Петерс. — М.: Мир, 2000. — 333 с.
- Стадник, М. П. Модификация критерия Мэллоуза-Акаике для подбора порядка регрессионной модели / М. П. Стадник // Автоматика и телемеханика. 1988. — № 4. — С. 44−45.
- Тихонов, Э. Е. Методы прогнозирования в условиях рынка: учебное пособие / Э. Е. Тихонов. — Невинномысск, 2006. — 221 с.
- Тюрин, Ю. Н. Анализ данных на компьютере / Ю. Н. Тюрин, А. А. Макаров. М.: ИНФРА-М, 2003. — 544 с.
- Цыпкин, Я. 3. Основы информационной теории идентификации / Я. 3. Цыпкин. М.: Наука, 1984. — 320 с.
- Шумков, Д. С. Метод прогнозирования временных рядов с использованием кусочно-линейной аппроксимации / Д. С. Шумков, И. Г. Сидоркина // Вестник Чувашского университета. Чебоксары, 2008. — № 2. — С. 199-203.
- Шумков, Д. С. Философия информационной безопасности: прогнозирование событий на основе накопленной информации / Д. С. Шумков, А. В. Егошин, И. Г. Сидоркина // Материалы региональной научно-практической конференции студентов и молодых ученых. — Йошкар-Ола: Марийский государственный технический университет, 2007. — С. 257-261.
- Abarbanel, H. D. I. Analysis of observed chaotic data / H. D. I. Abarbanel. — first ed. — New York: Springer, 1996.
- Abe, S. Support vector machines for pattern classification / S. Abe. -New York: Springer, 2005. 350 p.
- Angeline, P. J. Evolving predictors for chaotic time series / P. J. Angeline; S. Rogers, D. Fogel, J. Bezdek, B. Bosacchi, eds. // Proceedings of SPIE: Application and Science of Computational Intelligence, vol. 3390. — 1998. — p. 170-180.
- Anguita, D. Evaluating the generalization ability of Support Vector Machines through the Bootstrap / D. Anguita, A. Boni, S. Ridella // Neural Processing Letters. 2000. — № 11. — p. 51-58.
- Anguita, D. Hyperparameter design criteria for support vector classifiers / D. Anguita, S. Ridella, F. Rivieccio, R. Zunino // Neurocomputing. 2003. — № 51. — p. 109-134.
- Anthony, M. Cross-validation for binary classification by real-valued functions: theoretical analysis / M. Anthony, S. B. Holden // Proc. of the 11th Conf. on Computational Learning Theory. 1998. — p. 218-229.
- Aussem, A. Dynamical recurrent neural networks towards prediction and modeling of dynamical systems / A. Aussem // Neurocomputing. 1999. -№ 28.-p. 207−232.
- Bartlett, P. Model selection and error estimation / P. Bartlett, S. Boucheron, G. Lugosi // Machine Learning. 2001. — № 48. — p. 85−113.
- Bengio, Y. No unbiased estimator of the variance of K-fold cross validation / Y. Bengio, Y. Grandvalet // Advances in Neural Information Processing Systems. — The MIT Press, 2004. — № 16.
- Blum, A. Beating the hold-out: bounds for K-fold and progressive cross-validation / A. Blum, A. Kalai, J. Langford // Proc. of the 12th Conf. on Computational Learning Theory. 1999. — p. 203-208.
- Bontempi, G. A multi-step-ahead prediction method based on local dynamic properties / G. Bontempi, M. Birattari // ESANN 2000: Proceedings of the European Symposium on Artificial Neural Networks. — 2000. — p. 311-316.
- Boser, B. A training algorithm for optimal margin classifiers / B. Boser, I. Guyon, V. Vapnik // Proceedings of the 5th Annual ACM Workshop on Computational Learning Theory. — ACM Press, 1992. — p. 144-152.
- Bousquet, O. Introduction to Statistical Learning Theory / O. Bousquet, S. Boucheron, G. Lugosi // Advanced Lectures on Machine Learning Lecture. -Germany, 2004. p. 169−207.
- Bousquet, O. Stability and generalization / O. Bousquet, A. Elisseeff // Journal of Machine Learning Research. 2002. — № 2. — p. 499-526.
- Breiman, L. Bagging Predictors / L. Breiman // Machine Learning. -1996. -№ 24. -p. 123−140.
- Burges, C. J. C. A tutorial on support vector machines for pattern recognition / C. J. C. Burges // Data Mining and Knowledge Discovery. 1998. — № 2. — p. 121-167.
- Cao, L. Support vector machines experts for time series forecasting / L. Cao // Neurocomputing. 2003. — № 51. — p. 321-339.
- Castillo, E. A minimax method for learning functional networks / J. M. Gutierrez, A. Cobo, C. Castillo // Neural Process. Lett. 11. 2000. — № 1. -p. 39−49.
- Cauwenberghs, G. Incremental and decremental support vector machine learning / G. Cauwenberghs, T. Poggio // Advances in Neural Information Processing Systems (NIPS 2000). -2001. -№ 13. p. 409−415.
- Chalimourda, A. Experimentally optimal ν in support vector regression for different noise models and parameter settings / A. Chalimourda, B. Scholkopf, A. Smola // Neural Networks. 2004. — № 17 (1). — p. 127-141.
- Chang, C.-C. LIBSVM: a Library for Support Vector Machines / C.-C. Chang, C.-J. Lin // Dept. of Computer Science and Information Engineering, National Taiwan University. — Режим доступа: http://www.csie.ntu.edu.tw/~cjlin. — 10.10.2007 г.
- Chapelle, O. Choosing multiple parameters for support vector machines / O. Chapelle, V. Vapnik, O. Bousquet, S. Mukherjee // Machine Learning. 2002. — № 46 (1-3). — p. 131-159.
- Cherkassky, V. Practical selection of svm parameters and noise estimation for svm regression / V. Cherkassky, Y. Ma // Neural Networks. 2004. — № 17 (1). — p. 113-126.
- Corona, F. Variable scaling for time series prediction / F. Corona, A. Lendasse // Proc. ESTSP 2007. 2007. — p. 69−76.
- Cortes, C. Support vector networks / C. Cortes, V. Vapnik // Machine Learning. 1995. -№ 20. — p. 1−25.
- Cristianini, N. An introduction to support vector machines and other kernel-based learning methods / N. Cristianini, J. Shawe-Taylor. — Cambridge University Press, 2001. — 160 p.
- De Coste, D. Alpha seeding for support vector machines / D. De Coste, K. Wagstaff// Proc. of the 6th ACM SIGKDD Int. Conf. on Knowledge Discovery and Data Mining. 2000. — p. 345−349.
- Duan, K. Evaluation of simple performance measures for tuning svm hyperparameters / K. Duan, S. Keerthi, A. Poo // Neurocomputing. 2003. — № 51. — p. 41-59.
- Dudul, S.V. Prediction of a lorenz chaotic attractor using two-layer perceptron neural network / S.V. Dudul // Applied Soft Computing. 2004.
- Dudley, R. Central limit theorems for empirical measures // Annals of Probability. 1978. — № 6. — p. 899−929.
- Efron, B. An introduction to the Bootstrap / B. Efron, R. Tibshirani. -Chapman and Hall, 1993.
- Engel, Y. The kernel recursive least-squares algorithm / Y. Engel, S. Mannor, R. Meir // IEEE Transaction on Signal Processing 52. 2004. — № 8. -p. 2275−2285.
- Floyd, S. Sample compression, learnability, and the Vapnik-Chervonenkis dimension / S. Floyd, M. Warmuth // Machine Learning. 1995. — № 21-p. 269−304.
- Farmer, J. D. Predicting chaotic time series / J. D. Farmer, J. J. Sidorowich // Physical Review Letters. 1987. — № 8. — p. 845-848.
- Fernandez, R. Predicting time series with a local support vector regression machine / R. Fernandez. Springer, 1999. -170 p.
- Fletcher, R. Practical methods of optimization / R. Fletcher. 2nd ed. -John Wiley & Sons Ltd., Chichester, 1987. — 425 p.
- Gers, F. A. Applying LSTM to time series predictable through time-window approaches / F. A. Gers, D. Eck, J. Schmidhuber // Lecture Notes in Computer Science. 2001. — p. 669-676.
- Gine, E. Some limit theorems for empirical processes / E. Gine, J. Zinn // Annals of Probability. 1984. -№ 12. — p. 929−989.
- Grassberger, P. Measuring the strangeness of strange attractors / P. Grassberger, I. Procaccia // Physica D. 1983. — № 9. — p. 189−208.
- Gunn, S. Support vector machines for classification and regression / S. Gunn. Tech. report, Department of Electronics and Computer Science, University of Southampton, 1998. — 56 p.
- Han, M. Prediction of chaotic time series based on the recurrent predictor neural network / M. Han, J. Xi, F. Yin // IEEE Transactions on Signal Processing 52. 2004. — № 12. — p. 3409−3416.
- Hastie, T. The Elements of Statistical Learning / T. Hastie, R. Tibshirani, J. Friedman. Springer, 2001. — 533 p.
- Hegger, R. Practical implementation of nonlinear time series methods: The tisean package / R. Hegger, H. Kantz, T. Schreiber // Chaos 9. 1999. — № 2. -p. 413−435.
- Henon, M. A two-dimensional mapping with a strange attractor / M. Henon. Comm.Math.Phys. 50. — 1976. — № 1. — p. 69−77.
- Herbrich, R. Learning Kernel Classifiers / R. Herbrich. — The MIT Press, 2002. — 160 p.
- Huber, P. Robust estimation of a location parameter / P. Huber // Annals of Mathematical Statistics 35. — 1964. — p. 73-101.
- Joachims, T. Making large-scale svm learning practical / T. Joachims, B. Scholkopf, C. Burges, A. Smola eds. // Advances in Kernel Methods Support Vector Learning. — 1999. — p. 169−184.
- Joachims, T. The maximum-margin approach to learning text classifiers: method, theory and algorithms: Ph.D. thesis / T. Joachims. University of Dortmund, 2001.
- Kohavi, R. A study of cross-validation and bootstrap for accuracy estimation and model selection / R. Kohavi // Proc. of the Int. Joint Conf. on Artificial Intelligence. 1995. — p. 1137-1143.
- Kohlmorgen, J. Data set A is a pattern matching problem / J. Kohlmorgen, K.-R. Müller // Neural Process. Lett. 7. 1998. — № 1. — p. 43-47.
- Koltchinskii, V. Rademacher penalties and structural risk minimization // IEEE Transactions on Information Theory. 2001. — № 47. — p. 1902−1914.
- Kugiumtzis, D. Chaotic time series: Part i. estimation of invariant properties in state space, / D. Kugiumtzis, B. Lillekjendlie, N. Christophersen // Modeling, Identification and Control 15. 1994. — № 4. — p. 205−224.
- Kwok, J. Linear dependency between ε and the input noise in ε-support vector regression / J. Kwok // Artificial Neural Networks — ICANN 2001. — 2001. — p. 405-410.
- Langford, J. Quantitatively tight sample bounds / J. Langford. — Carnegie Mellon University, 2002.
- Lapedes, A. How neural nets work / A. Lapedes, R. Farber. Neural Information Processing Systems, 1987. — p. 442−456.
- Lillekjendlie, B. Chaotic time series: System identification and prediction / B. Lillekjendlie, D. Kugiumtzis, N. Christophersen // Modeling, Identification and Control 15. 1994. — № 4. — p. 225−243.
- Lin, Y. Statistical properties and adaptive tuning of support vector machines/ Y. Lin, G. Wahba, H. Zhang, and Y. Lee // Machine Learning. 2002. -№ 48.-p. 115−136.
- Lorenz, E. N. Deterministic nonperiodic flow / E. N. Lorenz // Journal of the Atmospheric Sciences. 1963. — № 20. — p. 130-141.
- Luxburg, U. A compression approach to support vector model selection / U. Luxburg, O. Bousquet, B. Scholkopf // The Journal of Machine Learning Research. 2004. — № 5. — p. 293−323.
- Mackey, M.C. Oscillation and chaos in physiological control systems / M.C. Mackey and L. Glass. Science, 1977. — p. 287−289.
- McNames, J. Local averaging optimization for chaotic time series prediction / J. McNames // Neurocomputing. 2002. — № 4. — p. 279−297.
- Mendelson, S. A few notes on statistical learning theory / S. Mendelson, A. Smola // Advanced Lectures in Machine Learning. LNCS 2600. Springer, 2003.-p. 1−40.
- Müller, K. An introduction to kernel-based learning algorithms / K. Müller, S. Mika, G. Ratsch, K. Tsuda, B. Scholkopf // IEEE Transactions on Neural Networks. 2001. — p. 181-201.
- Müller, K. Predicting time series with support vector machines / K. Müller, A. Smola, G. Ratsch, B. Scholkopf, J. Kohlmorgen, V. Vapnik // Artificial Neural Networks — ICANN'97. — Springer, 1997. — p. 999-1004.
- Oliveira, K. A. Using artificial neural networks to forecast chaotic time series / K. A. Oliveira, A. Vannucci, E. C. Silva // Physica A. 2000. — p. 393-404.
- Omidvar, A.E. Configuring radial basis function network using fractal scaling process with application to chaotic time series prediction / A.E. Omidvar // Chaos, Sol. and Fract. -2004. № 4. — p.757−766.
- Small, M. Optimal embedding parameters: A modelling paradigm / M. Small, C. K. Tse // Physica D. 2004. — p. 283-296.
- Parker, T.S. Practical numerical algorithms for chaotic systems / T.S. Parker, L.O. Chua. -first ed. -Springer, New York, 1989. 425 p.
- Platt, J. Fast training of support vector machines using sequential minimal optimization / J. Platt // Advances in Kernel Methods: Support Vector Learning / B. Scholkopf, C. J. C. Burges, A. Smola, eds. — The MIT Press, 1999. — p. 185-208.
- Quinonero-Candela, J. Time series prediction based on the relevance vector machine with adaptive kernels / J. Quinonero-Candela, L. K. Hansen. -International Conference on Acoustics, Speech, and Signal Processing, 2002. -p.985−988.
- Ralaivola, L. Dynamical modeling with kernels for nonlinear time series prediction / L. Ralaivola, F. d'Alche-Buc // Modeling, Identification and Control 15. — 2004. — № 5. — p. 125-138.
- Ratsch, G. Soft margins for AdaBoost / G. Ratsch, T. Onoda, K.-R. Muller // Machine Learning. 2001. — № 42. — p. 287−320.
- Rosipal, R. Prediction of chaotic time-series with a resource-allocating RBF network / R. Rosipal, M. Koska, I. Farkas // Neural Processing Letters 7. — 1998. — № 3. — p. 185-197.
- Russell, D. A. Dimension of strange attractors / D. A. Russell, J. D. Hanson, E. Ott // Phys. Rev. Lett. 45. — 1980. — p. 1175-1178.
- Rüping, S. SVM kernels for time series analysis / S. Rüping // Technical report, CS Department, University of Dortmund. — Dortmund, 2001. — p. 43-50.
- Sauer, T. Embedology / T. Sauer, J. Yorke, M. Casdagli // J. Stat. Phys. 65. — 1991. — p. 579-616.
- Sauer, T. Time series prediction by using delay coordinate embedding / T. Sauer; A. S. Weigend, N. A. Gershenfeld, eds. // Time Series Prediction: Forecasting the Future and Understanding the Past. — Addison-Wesley, 1994.
- Scholkopf, B. Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond (Adaptive Computation and Machine Learning) / B. Scholkopf, A. Smola. The MIT Press, 2001 — 644 p.
- Scholkopf, B. New support vector algorithms / B. Scholkopf, P. Bartlett, A. Smola, R. Williamson // Neural Computation. 2000. — № 12. -p.1207−1245.
- Scholkopf, B. Statistical learning and kernel methods / B. Scholkopf // Machine Learning. 2006. — № 98. — p.63−95.
- Scholkopf, B. Support vector regression with automatic accuracy control / B. Scholkopf, P. Bartlett, A. Smola, R. Williamson // Proceedings of ICANN'98: Perspectives in neural computing (Berlin). Springer, 1998. -p. 111−116.
- Small, M. Minimum description length neural networks for time series prediction / M. Small, C. K. Tse // Physical Review E (Statistical, Nonlinear, and Soft Matter Physics). 2002. — № 6. — 066701.
- Smola, A. A tutorial on support vector regression / A. Smola, B. Scholkopf // Statistics and Computing. 2004. — № 14. — p. 199-222.
- Smola, A. Generalization bounds and learning rates for regularized principal manifolds / A. Smola, R. Williamson, B. Scholkopf. Tech. report, Royal Holloway, University of London, 1998.
- Smola, A. Learning with kernels / A. Smola. Tech. report, GMD Forschungszentrum Informationstechnik. — St. Augustin, 1998.
- Smola, A. On a Kernel-based Method for Pattern Recognition, Regression, Approximation, and Operator Inversion / A. Smola, B. Scholkopf, J. Lemm and others. Algorithmica, 1997.
- Smola, A. Regression estimation with support vector learning machines / A. Smola. Tech. report, Physik Department, Technische Universitat Munchen, 1996.
- Takens, F. Detecting strange attractors in turbulence / F. Takens; D. A. Rand, L.-S. Young, eds. // Dynamical Systems and Turbulence (Berlin). — Vol. 898 of Lecture Notes in Mathematics. — Springer, 1981. — p. 366-381.
- Tay, F. Modified support vector machines in financial time series forecasting / F. Tay, L. Cao // Neurocomputing. 2002. — № 14. — p. 847−861.
- Thissen, U. Using support vector machines for time series prediction / U. Thissen, R. van Brakel, A. P. de Weijer, W. J. Melssen, L. M. C. Buydens // Chemometrics and Intelligent Laboratory Systems. 2003. — № 1. — p. 35-49.
- Vapnik, V. An overview of statistical learning theory / V. Vapnik // IEEE Transactions on Neural Networks. 1999. — № 10. — p. 988-999.
- Vapnik, V. Estimation of dependencies based on empirical data / V. Vapnik. Springer Verlag, New York, 1982.
- Vapnik, V. Bounds on the error expectation for support vector machines / V. Vapnik, O. Chapelle // Neural Computation. 2000. — № 12. -p. 2013−2036.
- Vapnik, V. Necessary and sufficient conditions for the uniform convergence of means to their expectations / V. Vapnik, A. Chervonenkis // Theory of Probability and its Applications. 1981. — № 26. — p. 821−832.
- Vapnik, V. Statistical Learning Theory / V. Vapnik. — John Wiley, New York, 1998. — 760 p.
- Vapnik, V. The Nature of Statistical Learning Theory / V. Vapnik. -Springer Verlag, New York, 1995. 315 p.
- Wah, B. W. Violation guided neural-network learning for constrained formulations in time-series predictions / B. W. Wah, M. Qian // Int'l Journal on Computational Intelligence and Applications. — 2001. — № 4. — p. 383-398.
- Wan, E. A. Time series prediction by using a connectionist network with internal delay lines / E. A. Wan; A. S. Weigend, N. A. Gershenfeld, eds. // Time Series Prediction: Forecasting the Future and Understanding the Past. — Addison-Wesley, 1994. — p. 195-217.
- Wang, L. Support Vector Machines: Theory and Applications / L. Wang. — Springer, 2005. — 435 p.
- Wang, X. Time-line hidden markov experts for time series prediction / X. Wang, P. Whigham, D. Deng, M. Purvis // Neural Information Processing — Letters and Reviews. 2004. — № 2. — p. 39-48.
- Weigend, A.S. Time series prediction: Forecasting the future and understanding the past /A.S. Weigend, N.A. Gershenfeld. Addison-Wesley, 1994.
- Wendt, H. Support vector machines for regression estimation and their application to chaotic time series prediction / H. Wendt. Finkenweg, 2005. -103 p.
- Williamson, R. Generalization performance of regularization networks and support vector machines via entropy numbers of compact operators / R. Williamson, A. Smola, B. Scholkopf // IEEE Transactions on Information Theory. 2001. — № 6. — p. 2516−2532.