<?xml version="1.0" encoding="UTF-8"?><xml><records><record><source-app name="Biblio" version="7.x">Drupal-Biblio</source-app><ref-type>47</ref-type><contributors><authors><author><style face="normal" font="default" size="100%">Zhang, Guanyu</style></author><author><style face="normal" font="default" size="100%">Li, Feng</style></author><author><style face="normal" font="default" size="100%">Kang, Yanfei</style></author></authors></contributors><titles><title><style face="normal" font="default" size="100%">Probabilistic Forecast Reconciliation with Kullback-Leibler Divergence Regularization</style></title><secondary-title><style face="normal" font="default" size="100%">2023 IEEE International Conference on Data Mining Workshops (ICDMW)</style></secondary-title></titles><keywords><keyword><style face="normal" font="default" size="100%">Computer Science - Artificial Intelligence</style></keyword><keyword><style face="normal" font="default" size="100%">Computer Science - Machine Learning</style></keyword><keyword><style face="normal" font="default" size="100%">Conferences</style></keyword><keyword><style face="normal" font="default" size="100%">data mining</style></keyword><keyword><style face="normal" font="default" size="100%">Deep learning</style></keyword><keyword><style face="normal" font="default" size="100%">forecasting</style></keyword><keyword><style face="normal" font="default" size="100%">Fuses</style></keyword><keyword><style face="normal" font="default" size="100%">Hierarchical time series</style></keyword><keyword><style face="normal" font="default" size="100%">Probabilistic forecast reconciliation</style></keyword><keyword><style face="normal" font="default" size="100%">Probabilistic logic</style></keyword><keyword><style face="normal" font="default" size="100%">Time series analysis</style></keyword></keywords><dates><year><style face="normal" font="default" size="100%">2023</style></year></dates><urls><web-urls><url><style face="normal" 
font="default" size="100%">https://ieeexplore.ieee.org/abstract/document/10411589</style></url></web-urls></urls><pages><style face="normal" font="default" size="100%">601–607</style></pages><language><style face="normal" font="default" size="100%">eng</style></language><abstract><style face="normal" font="default" size="100%">As hierarchical point forecast reconciliation methods grow in popularity, interest in probabilistic forecast reconciliation is also increasing. Many studies have applied machine learning or deep learning techniques to probabilistic forecast reconciliation and have made notable progress. However, these methods treat reconciliation as a fixed, hard post-processing step, leading to a trade-off between accuracy and coherency. In this paper, we propose a new approach to probabilistic forecast reconciliation. Unlike existing approaches, ours fuses the prediction step and the reconciliation step into a single deep learning framework, making reconciliation flexible and soft by introducing a Kullback-Leibler divergence regularization term into the loss function. The approach is evaluated on three hierarchical time series datasets, demonstrating its advantages over other probabilistic forecast reconciliation methods.</style></abstract></record></records></xml>