Speaker: Zhang Xinyu, Research Professor
Title: Optimal Weighted Random Forests
Time: Friday, July 14, 2023, 10:30-11:30 a.m.
Venue: Lecture Hall 1506, Jingyuan Building
Hosts: Institute of Mathematics, School of Mathematics and Statistics, Institute of Science and Technology Research
About the speaker:
Zhang Xinyu is a research professor at the Center for Forecasting Science, Academy of Mathematics and Systems Science, Chinese Academy of Sciences. His research covers the theory and applications of statistics and econometrics, with particular interests in model averaging, machine learning, and forecast combination. He has published more than 80 papers, several of which appeared in top journals in econometrics and statistics. He serves as an area editor of the SCI journal JSSC and sits on the editorial boards of five other major domestic and international journals. He is an executive council member of the Society of Management Science and Engineering of China and an elected member of the International Statistical Institute. He has been the principal investigator of NSFC Excellent Young Scientists and Distinguished Young Scholars projects and received the China Youth Science and Technology Award.
Abstract:
The random forest (RF) algorithm has become a very popular prediction method owing to its great flexibility and promising accuracy. In RF, it is conventional to put equal weights on all the base learners (trees) when aggregating their predictions. However, the predictive performances of different trees within the forest can differ considerably because of the randomization in the embedded bootstrap sampling and feature selection. In this paper, we focus on RF for regression and propose two optimal weighting algorithms, namely the 1 Step Optimal Weighted RF (1step-WRFopt) and the 2 Steps Optimal Weighted RF (2steps-WRFopt), which combine the base learners through weights determined by weight choice criteria. Under some regularity conditions, we show that these algorithms are asymptotically optimal in the sense that the resulting squared loss and risk are asymptotically identical to those of the infeasible but best possible model averaging estimator. Numerical studies on real-world data sets indicate that, in most cases, these algorithms outperform the equal-weight forest and two other weighted RFs proposed in the existing literature.
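The core idea of the abstract, choosing combination weights by minimizing a squared-loss criterion rather than weighting all base learners equally, can be illustrated with a minimal pure-Python sketch. This is not the authors' 1step-WRFopt or 2steps-WRFopt algorithm; it is a hypothetical toy with just two base learners (standing in for trees), where the simplex-constrained least-squares weight has a simple closed form:

```python
import random

random.seed(0)

# Toy regression data: y = 2x + noise.
xs = [random.uniform(0, 1) for _ in range(200)]
ys = [2 * x + random.gauss(0, 0.1) for x in xs]

# Two hypothetical "base learners" with different error levels,
# standing in for individual trees of a random forest.
pred1 = [2 * x + random.gauss(0, 0.05) for x in xs]  # accurate learner
pred2 = [2 * x + random.gauss(0, 0.5) for x in xs]   # noisy learner

# Weight choice criterion: pick w in [0, 1] minimizing the squared loss
# of the combination w * pred1 + (1 - w) * pred2.  For two learners the
# unconstrained minimizer has a closed form; clip it to the simplex.
num = sum((p1 - p2) * (y - p2) for p1, p2, y in zip(pred1, pred2, ys))
den = sum((p1 - p2) ** 2 for p1, p2 in zip(pred1, pred2))
w = min(1.0, max(0.0, num / den))

def sse(pred):
    """Sum of squared errors against the observed responses."""
    return sum((p - y) ** 2 for p, y in zip(pred, ys))

combined = [w * p1 + (1 - w) * p2 for p1, p2 in zip(pred1, pred2)]
equal = [0.5 * (p1 + p2) for p1, p2 in zip(pred1, pred2)]

print("optimal weight on learner 1:", round(w, 3))
print("weighted SSE <= equal-weight SSE:", sse(combined) <= sse(equal))
```

Because the squared loss is a convex quadratic in `w`, the clipped minimizer is the exact optimum over [0, 1], so the weighted combination can never do worse in-sample than the equal-weight average; the accurate learner receives most of the weight.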