With this work we show that the maximum entropy approach has important applications in risk management for the density estimation of aggregated risk events, and even for the recovery of the individual losses underlying the aggregated data (a decompounding process), when the available information consists only of an observed sample about whose underlying process we usually have no information.
Through the knowledge of a few fractional moments, maxentropic methodologies provide good representations of densities when the data is scarce, or when the data presents correlation, heavy tails, or multimodal characteristics. In this work we present an extension of the maxent approach that allows us to incorporate the effect of sampling errors, which in most cases improves the convergence of the method and the quality of the results; in addition, it provides an estimator of the additive error in the data. For this procedure the input is the set of sample moments E[e^{-αS}] = μ(α) for eight values of the Laplace transform parameter α, together with an interval that encloses the difference between the true value of μ(α) and the sample moments. This interval is related to the uncertainty (error) in the data, and its width may be adjusted by convenience. Through a simulation study we analyze the quality of the results, considering the size of the gradient, the time to convergence, and the differences with respect to the true density, comparing against the Standard Method of Maximum Entropy (SME) and an extension of that methodology that incorporates additional information through a reference measure.
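As a minimal sketch of the inputs described above, the following Python snippet computes eight sample fractional moments μ(α_k) = E[e^{-α_k S}] from a simulated aggregate-loss sample, together with an interval enclosing the sampling error of each moment. The compound Poisson–lognormal data, the choice α_k = k/8, and the CLT-based 95% band are illustrative assumptions, not the paper's prescribed settings; the band width could be tuned "by convenience" as the text indicates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical aggregate-loss sample: compound Poisson frequencies with
# lognormal severities (illustrative data-generating process, an assumption).
N = 5000
counts = rng.poisson(3.0, size=N)
S = np.array([rng.lognormal(0.0, 0.5, size=n).sum() for n in counts])

# Eight fractional moments of the Laplace transform: mu(alpha_k) = E[exp(-alpha_k * S)].
alphas = np.arange(1, 9) / 8.0            # alpha_k = k/8 (assumed grid)
E = np.exp(-np.outer(alphas, S))          # shape (8, N): exp(-alpha_k * S_i)
moments = E.mean(axis=1)                  # sample estimates of mu(alpha_k)

# Interval enclosing the difference between the true mu(alpha) and the sample
# moment; here a CLT-based 95% band serves as the adjustable uncertainty interval.
std_err = E.std(axis=1, ddof=1) / np.sqrt(N)
intervals = np.column_stack([moments - 1.96 * std_err, moments + 1.96 * std_err])

for a, m, (lo, hi) in zip(alphas, moments, intervals):
    print(f"alpha={a:.3f}  mu={m:.4f}  interval=({lo:.4f}, {hi:.4f})")
```

These moments and intervals would then be passed to the maxent solver in place of the exact moment constraints used by the SME.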
Although our motivating examples come from the field of operational risk analysis, the developed methodology may be applied in any branch of the applied sciences.
Keywords: maximum entropy; decompounding; density reconstruction.