TOMAS Julien

Affiliations
  • 2012 - 2016
    Laboratoire de sciences actuarielle et financière
  • 2012 - 2013
    University of Amsterdam
Publications
  • Climate Change and Insurance.

    Arthur CHARPENTIER, Anne EYRAUD-LOISEL, Alexis HANNART, Julien TOMAS
    Variances | 2016
    No summary available.
  • A credibility approach of the Makeham mortality law.

    Yahia SALHI, Pierre-Emmanuel THEROND, Julien TOMAS
    European Actuarial Journal | 2016
    The present article illustrates a credibility approach to mortality. Interest from life insurers in assessing their portfolios' mortality risk has increased considerably. The new regulation and norms, Solvency II, shed light on the need for life tables that best reflect the experience of insured portfolios in order to quantify the underlying mortality risk reliably. In this context, and following the work of Bühlmann and Gisler (2005) and Hardy and Panjer (1998), we propose a credibility approach which consists in revising, as new observations arrive, the assumption on the mortality curve. Unlike the methodology considered in Hardy and Panjer (1998), which updates the aggregate deaths, we have chosen to add an age structure to these deaths. Formally, we use a Makeham graduation model. Such an adjustment adds structure to the mortality pattern, which is useful when portfolios are of limited size, so as to ensure a good representation over the entire age bands considered. We investigate the divergences in the mortality forecasts generated by the classical credibility approaches to mortality, including Hardy and Panjer (1998) and the Poisson-Gamma model, on portfolios originating from various French insurance companies.
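    As a rough illustration of the two ingredients named in the abstract — a Makeham graduation of the mortality curve and a credibility blend between portfolio experience and a reference — the following sketch may help. It is not the authors' implementation; the parameter values and the simple Bühlmann-style credibility factor z = n / (n + k) are hypothetical.

    ```python
    import math

    def makeham_mu(x, A=0.0002, B=3.5e-5, c=1.09):
        """Makeham force of mortality: mu(x) = A + B * c**x
        (A, B, c are illustrative values, not fitted parameters)."""
        return A + B * c**x

    def makeham_q(x, A=0.0002, B=3.5e-5, c=1.09):
        """One-year death probability implied by the Makeham law:
        q_x = 1 - exp(-integral of mu over [x, x+1])."""
        integral = A + (B * c**x) * (c - 1.0) / math.log(c)
        return 1.0 - math.exp(-integral)

    def credibility_blend(q_portfolio, q_reference, n, k):
        """Bühlmann-style credibility update: the larger the portfolio
        exposure n relative to the structural constant k, the more weight
        goes to the portfolio's own experience."""
        z = n / (n + k)
        return z * q_portfolio + (1.0 - z) * q_reference
    ```

    With no exposure (n = 0) the blend returns the reference rate unchanged; as n grows, the estimate moves toward the observed portfolio rate.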
  • Prospective mortality tables: Taking heterogeneity into account.

    Julien TOMAS, Frederic PLANCHET
    Insurance: Mathematics and Economics | 2015
    The present article illustrates an approach to constructing prospective mortality tables when the available data consist of heterogeneous groups observed during different periods. Without explicit consideration of heterogeneity, it is necessary to reduce the period of observation to the intersection of the different populations' observation periods. This reduction of the available history can harm the determination of the mortality trend and its extrapolation. We propose a model that takes heterogeneity explicitly into account, so as to preserve the entire history available for all populations. We use local kernel-weighted log-likelihood techniques to graduate the observed mortality. The extrapolation of the smoothed surface is performed by identifying the mortality components and their importance over time using singular value decomposition; time-series methods are then used to extrapolate the time-varying coefficients. We investigate the divergences in the mortality surfaces generated by a number of previously proposed models on three levels: the proximity between the observations and the model, the regularity of the fit, and the plausibility and consistency of the mortality trends.
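    The extrapolation step described above — extract mortality components by singular value decomposition, then project the time-varying coefficient — can be sketched in a few lines. This is a minimal Lee-Carter-flavoured illustration on a synthetic log-mortality surface, not the paper's model; the surface and the random-walk-with-drift forecast are assumptions for the example.

    ```python
    import numpy as np

    # Hypothetical smoothed log-mortality surface: rows = ages, cols = years.
    ages, years = np.arange(60, 90), np.arange(2000, 2015)
    trend = -0.01 * (years - years[0])                       # improvement over time
    log_mu = (-9.0 + 0.09 * (ages[:, None] - 60)) + trend[None, :]

    # Identify mortality components via singular value decomposition.
    a_x = log_mu.mean(axis=1)                                # average age profile
    U, s, Vt = np.linalg.svd(log_mu - a_x[:, None], full_matrices=False)
    b_x, k_t = U[:, 0], s[0] * Vt[0, :]                      # first component

    # Extrapolate the time-varying coefficient with a random walk with drift,
    # the simplest of the time-series methods one might use here.
    drift = np.diff(k_t).mean()
    k_future = k_t[-1] + drift * np.arange(1, 11)
    log_mu_future = a_x[:, None] + b_x[:, None] * k_future[None, :]
    ```

    On this synthetic surface the first singular component reconstructs the data exactly, and the projected surface simply continues the historical improvement trend.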
  • Constructing entity specific projected mortality table: adjustment to a reference.

    Julien TOMAS, Frederic PLANCHET
    European Actuarial Journal | 2014
    This article presents an operational framework for constructing and validating projected mortality tables specific to an insurer. We describe several methods of increasing complexity, together with a validation process, allowing an insurer to adjust a baseline mortality so as to approach a best-estimate assessment of its mortality and longevity risk. These give the insurer some latitude of choice while preserving the simplicity of implementation of the basic methodology. The methods are articulated around an iterative procedure that allows the most satisfactory one to be chosen parsimoniously. Validation is assessed on three levels: the proximity between the observations and the model, the regularity of the fit, and the plausibility and consistency of the mortality trends. Finally, the procedure is illustrated with experience data originating from a French insurance portfolio.
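    The simplest member of such a family of adjustment methods is a single multiplicative factor applied to the reference table. The sketch below shows that level of the idea only — under a Poisson model for deaths, the maximum-likelihood factor is total observed deaths over total expected deaths — and is an assumption-laden illustration, not the paper's full framework.

    ```python
    def adjust_to_reference(deaths, exposures, q_ref):
        """One-factor adjustment of a reference table: under
        D_x ~ Poisson(E_x * f * q_ref_x), the MLE of f is
        sum of observed deaths / sum of expected deaths."""
        f = sum(deaths) / sum(e * q for e, q in zip(exposures, q_ref))
        return [min(f * q, 1.0) for q in q_ref]
    ```

    More refined levels (age-dependent factors, smoothed adjustments) would replace the single scalar f, at the cost of the simplicity this baseline preserves.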
  • Uncertainty on survival probabilities and solvency capital requirement: application to long-term care insurance.

    Frederic PLANCHET, Julien TOMAS
    Scandinavian Actuarial Journal | 2014
    In this paper, we focus on uncertainty in the survival probabilities of disabled lives among LTC insurance policyholders and its consequences for the solvency capital requirement. Among the risks affecting long-term care portfolios, special attention is paid to the table risk, i.e. the risk of unanticipated aggregate mortality arising from the uncertainty in modeling the survival law of LTC claimants. The table risk can be thought of as the risk of systematic deviations, referring not only to a parameter risk but also to any other source leading to a misinterpretation of the life table, resulting for example from an evolution of medical techniques or a change in acceptance rules. Ultimately, the idea is to introduce directly the risk of systematic deviations arising from the uncertainty on the death probabilities of disabled lives. We analyze the consequences of an error of appreciation of the disabled lives' survival probabilities in terms of the level of reserves and describe a framework within an Own Risk and Solvency Assessment (ORSA).
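    The sensitivity of reserves to an error on the disabled lives' survival probabilities can be made concrete with a toy calculation. The survival probabilities, the 5% shock, and the discount rate below are hypothetical; this only illustrates the mechanism, not the paper's framework.

    ```python
    def annuity_reserve(p_survive, rate=0.02):
        """Present value of a unit annual LTC annuity, paid while the
        claimant survives; p_survive lists one-year survival probabilities."""
        pv, surv, v = 0.0, 1.0, 1.0 / (1.0 + rate)
        for t, p in enumerate(p_survive, start=1):
            surv *= p
            pv += surv * v**t
        return pv

    base = [0.70, 0.65, 0.60, 0.55, 0.50]            # hypothetical survival probs
    shocked = [min(p * 1.05, 1.0) for p in base]     # 5% underestimation error
    print(annuity_reserve(base), annuity_reserve(shocked))
    ```

    Underestimating claimant survival understates the reserve: the shocked curve produces a strictly larger present value, and it is this kind of systematic deviation that feeds into the capital requirement.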
  • Multidimensional smoothing by adaptive local kernel-weighted log-likelihood: Application to long-term care insurance.

    Julien TOMAS, Frederic PLANCHET
    Insurance: Mathematics and Economics | 2013
    We are interested in modeling the mortality of long-term care (LTC) claimants with the same level of severity (heavy claimants). Practitioners often use empirical methods that rely heavily on expert opinion. We propose approaches that do not depend on an expert's advice. We analyze mortality as a function of both the age at occurrence of the claim and the duration of care. LTC claimants exhibit a relatively complex mortality pattern. Hence, rather than using parametric approaches or models relying on expert opinion, adaptive local likelihood methods allow us to extract the information from the data more pertinently. We characterize a locally adaptive pointwise smoothing method using the intersection of confidence intervals rule, as well as a global method using local bandwidth correction factors. The latter extends the adaptive kernel method proposed by Gavin et al. (1995) to likelihood techniques. We vary the amount of smoothing in a location-dependent manner and allow adjustments based on the reliability of the data. Tests and single indices summarizing the lifetime probability distribution are used to compare the graduated series obtained by adaptive local kernel-weighted log-likelihoods to P-spline and local likelihood models.
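    A minimal, local-constant version of kernel-weighted log-likelihood graduation may clarify the idea: at each age, the Poisson weighted maximum-likelihood estimate is the kernel-weighted deaths over the kernel-weighted exposure. The Gaussian kernel and the fixed bandwidth are assumptions for this sketch; the papers' adaptive methods instead vary the bandwidth per point (e.g. via the intersection of confidence intervals rule).

    ```python
    import math

    def graduate(ages, deaths, exposures, bandwidth=2.0):
        """Local-constant kernel-weighted Poisson graduation: at each age x,
        q_hat(x) = sum(w_i * D_i) / sum(w_i * E_i) with Gaussian weights
        w_i = exp(-0.5 * ((a_i - x) / bandwidth)**2)."""
        graduated = []
        for x in ages:
            w = [math.exp(-0.5 * ((a - x) / bandwidth) ** 2) for a in ages]
            graduated.append(sum(wi * d for wi, d in zip(w, deaths)) /
                             sum(wi * e for wi, e in zip(w, exposures)))
        return graduated
    ```

    A local-linear (or higher-order) fit and a location-dependent bandwidth would replace the constant model and the fixed `bandwidth` in the adaptive versions.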
  • Quantifying Biometric Life Insurance Risks With Non-Parametric Smoothing Methods.

    Julien TOMAS
    2013
    Life tables are used to describe the one-year probability of death within a well-defined population as a function of attained age and calendar year. These probabilities play an important role in the determination of premium rates and reserves in life insurance. The crude estimates on which life tables are based may be considered as a sample from a larger population and are, as a result, subject to random fluctuations. Most of the time, however, the actuary wishes to smooth these quantities to highlight the characteristics of the mortality of the group considered, which is believed to be relatively regular. This dissertation aims at providing a comprehensive and detailed description of non-parametric graduation methods for experience data originating from life insurance. The term non-parametric refers to the flexible functional form of the regression curve. Like parametric methods, non-parametric methods are liable to give biased estimates, but in such a way that it is possible to balance an increase in bias with a decrease in sampling variation. In the actuarial literature, the process of smoothing a mortality table is known as graduating the data. The little hills and valleys of the rough data are to be graded into smoothness, just as in building a road over rough terrain. Smoothing alone, however, is not graduation. Graduated rates must be representative of the underlying data, and graduation will often turn out to be a compromise between optimal fit and optimal smoothness. Local polynomial regression and local kernel-weighted log-likelihood are discussed extensively. Important issues concerning the choice of the smoothing parameters, the statistical properties of the estimators, the criteria used for model selection, the construction of confidence intervals and comparisons between the models are covered with numerical and graphical illustrations.
Local non-parametric techniques combine excellent theoretical properties with conceptual simplicity and flexibility to find structure in many datasets. Considerable attention is devoted to the influence of the boundaries on the choice of the smoothing parameters. These considerations illustrate the need for more flexible approaches. Adaptive local kernel-weighted log-likelihood methods are introduced. The amount of smoothing varies in a location-dependent manner, and the methods allow adjustments based on the reliability of the data. These methodologies adapt neatly to the complexity of the mortality surface, largely because of the appropriate data-driven choice of the adaptive smoothing parameters. Finally, this manuscript deals with some important topics for practitioners. These concern the construction and validation of portfolio-specific prospective mortality tables, the assessment of model risk and, to a lesser extent, the risk of expert judgment related to the choice of the external data used.
Affiliations are detected from the signatures of publications identified in scanR. An author can therefore appear to be affiliated with several structures or supervisors according to these signatures. The dates displayed correspond only to the dates of the publications found. For more information, see https://scanr.enseignementsup-recherche.gouv.fr