Quantifying Biometric Life Insurance Risks With Non-Parametric Smoothing Methods.

Publication date
2013
Publication type
Thesis
Summary
Life tables describe the one-year probability of death within a well-defined population as a function of attained age and calendar year. These probabilities play an important role in the determination of premium rates and reserves in life insurance. The crude estimates on which life tables are based may be regarded as a sample from a larger population and are, as a result, subject to random fluctuations. In most cases, however, the actuary wishes to smooth these quantities in order to highlight the characteristics of the mortality of the group under consideration, which is believed to be relatively regular.

This dissertation aims to provide a comprehensive and detailed description of non-parametric methods for graduating experience data originating from life insurance. The term non-parametric refers to the flexible functional form of the regression curve. Like parametric methods, non-parametric methods are liable to give biased estimates, but in such a way that it is possible to balance an increase in bias against a decrease in sampling variation. In the actuarial literature, the process of smoothing a mortality table is known as graduating the data: the little hills and valleys of the rough data are to be graded into smoothness, just as in building a road over rough terrain. Smoothing alone, however, is not graduation. Graduated rates must be representative of the underlying data, and graduation will often turn out to be a compromise between optimal fit and optimal smoothness.

Local polynomial regression and local kernel-weighted log-likelihood are discussed extensively. Important issues concerning the choice of the smoothing parameters, the statistical properties of the estimators, the criteria used for model selection, the construction of confidence intervals and comparisons between the models are covered with numerical and graphical illustrations. Local non-parametric techniques combine excellent theoretical properties with the conceptual simplicity and flexibility needed to find structure in many datasets. Considerable attention is devoted to the influence of the boundaries on the choice of the smoothing parameters. These considerations illustrate the need for more flexible approaches. Adaptive local kernel-weighted log-likelihood methods are therefore introduced: the amount of smoothing varies in a location-dependent manner, and the methods allow adjustments based on the reliability of the data. These methodologies adapt neatly to the complexity of the mortality surface, largely because of the appropriate data-driven choice of the adaptive smoothing parameters.

Finally, this manuscript deals with some important topics for practitioners. These concern the construction and validation of portfolio-specific prospective mortality tables, the assessment of model risk and, to a lesser extent, the risk of expert judgment related to the choice of the external data used.
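As an illustration of the kind of local kernel-weighted log-likelihood graduation described in the summary, the sketch below fits, at each target age, a local linear Poisson model for the force of mortality with Gaussian kernel weights and a fixed bandwidth. This is a minimal sketch under assumed inputs, not the method as implemented in the thesis: the arrays of ages, death counts and central exposures, the Gaussian kernel, the fixed bandwidth and the function name local_loglik_graduation are all hypothetical choices made for this example.

```python
import numpy as np

def local_loglik_graduation(ages, deaths, exposures, bandwidth, grid=None, n_iter=25):
    """Local linear, kernel-weighted Poisson log-likelihood graduation.

    At each target age x0 the force of mortality is modelled as
    exp(b0 + b1 * (age - x0)), with each observation weighted by a
    Gaussian kernel of the given bandwidth.  Returns the graduated
    force of mortality on the grid (defaults to the observed ages).
    """
    ages = np.asarray(ages, dtype=float)
    deaths = np.asarray(deaths, dtype=float)
    exposures = np.asarray(exposures, dtype=float)
    if grid is None:
        grid = ages
    fitted = np.empty(len(grid))

    for j, x0 in enumerate(grid):
        z = ages - x0
        w = np.exp(-0.5 * (z / bandwidth) ** 2)      # Gaussian kernel weights
        X = np.column_stack([np.ones_like(z), z])    # local linear design matrix
        beta = np.array([np.log(deaths.sum() / exposures.sum()), 0.0])

        for _ in range(n_iter):                      # Newton-Raphson on the weighted likelihood
            mu = exposures * np.exp(X @ beta)        # expected deaths under current fit
            grad = X.T @ (w * (deaths - mu))         # weighted score vector
            hess = X.T @ (w[:, None] * mu[:, None] * X)  # weighted information matrix
            step = np.linalg.solve(hess, grad)
            beta = beta + step
            if np.max(np.abs(step)) < 1e-10:
                break

        fitted[j] = np.exp(beta[0])                  # graduated hazard at the target age
    return fitted
```

With deaths and central exposures observed by single year of age, a call such as local_loglik_graduation(ages, deaths, exposures, bandwidth=3.0) would return graduated one-year forces of mortality at the observed ages. In an adaptive variant of the kind discussed above, the fixed bandwidth would be replaced by a location-dependent, data-driven one, so that less smoothing is applied where the data are plentiful and more where they are sparse.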