BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//hacksw/handcal//NONSGML v1.0//EN
CALSCALE:GREGORIAN
BEGIN:VEVENT
SUMMARY:YRD : Young Researchers Day | March 09\, 2018
DTSTART;VALUE=DATE:20180309
DTEND;VALUE=DATE:20180310
DESCRIPTION:Programme YRD 9/3/2018
09h00 : Pauline Ngugnie Diffouo
"Static risk measure of life annuity products"
09h25 : Manon Martin
“Combining ASCA and mixed models to analyse high dimensional designed data”
09h50 : Vincent Bremhorst
“Inclusion of time-varying covariates in cure survival models with an application in fertility studies”
10h15-10h35 : Coffee break
10h40 : Joris Chau
“2D Wavelet regression for surfaces of Hermitian positive definite matrices”
11h05 : Nicolas Tavernier\, KUL
"Simple nonlinear shrinkage estimators for large-dimensional covariance matrices”
11h30 : Xavier Piulachs
“Joint Models for Longitudinal and Time-to-Event Data with Applications to Health Insurance”
12h15 : Lunch
Abstracts
Pauline Ngugnie Diffouo
Static risk measure of life annuity products
In this work\, we compute the solvency capital (SC) of an insurer based on the investment strategy applied to the initial premium paid by the policyholders. By solvency capital we mean the amount the insurer has to put aside in order to remain solvent until the end of the contract\, in accordance with the regulations set by the authorities. Our general target is to compute the SC of a life annuity\; that is\, a series of payments at fixed intervals\, paid while the annuitant is alive. We hence extend the work done by Adrien Lebègue in his PhD thesis [1]\, in which he computes the solvency capital for a single cash flow (or lump sum) at retirement. This is achieved using the static risk measure VaR (i.e. the worst loss over a time horizon that will not be exceeded with a given level of confidence) for three different investment strategies. We find that it is almost impossible to obtain an explicit formula for the solvency capital\, and we thus make use of the Monte Carlo method to compute the SC. Our framework is based on some strong assumptions\, which are then relaxed in order to come closer to a real situation.
References:
[1] Lebègue\, A. (2016). Time-consistent risk measures for long-term life insurances and pension products. PhD thesis\, UCL\, Belgium.
[2] Ben Hcine\, M. and Bouallegue\, R. On the approximation of the sum of lognormals by a log skew normal distribution. Sup'Com\, Higher School of Communication\, University of Carthage\, Ariana\, Tunisia.
[3] Tassa\, H. (2012). Solvabilité des plans de pension. PhD thesis\, UCL\, Belgium.
Manon Martin
Combining ASCA and mixed models to analyse high dimensional designed data
Chemometrics and especially Omics data produce high-dimensional multivariate data matrices\, here defined as the response matrices\, usually with a larger number of variables than samples and a high biological/instrumental variability. Moreover\, they often involve an advanced experimental design that must be considered during their analysis.
Current multivariate projection methods (PLS\, OPLS\, etc.) cannot handle advanced experimental designs\, while linear models (linear regression\, ANOVA\, MANOVA\, etc.) cannot deal with data matrices having more variables than observations. To extract meaningful information from these high-dimensional data matrices\, two popular approaches\, namely ASCA(+) and APCA(+) [1-3]\, combine the strengths of statistical modelling and PCA in this context. They have two main steps: (1) the decomposition of the response matrix using ANOVA or GLM estimators\, leading to the fixed-effect matrices\, and (2) the multivariate analysis (PCA) of those (residual-augmented) effect matrices.
In order to incorporate random effects from the design\, the scope of this research is to further extend the ASCA/APCA methodology using mixed models. The main modifications of the suggested extension are the following: an initial PCA is applied to the response matrix\; the parallel ANOVA or GLM decomposition is replaced by a parallel mixed model decomposition\; a global measure of factor importance for both fixed and random effects is quantified and their statistical significance is tested based on a likelihood ratio statistic and/or on a parametric bootstrap procedure.
This extended methodology has been successfully applied to 1H-NMR metabolomic data to study spectral repeatability/reproducibility with one fixed and two random factors. It made it possible to quantify the importance of the fixed and random effects as well as their statistical significance. Further research will focus on the application of this methodology to longitudinal studies.
References:
[1] Jansen\, J. J. et al. (2005). ASCA: analysis of multivariate data obtained from an experimental design. Journal of Chemometrics 19: 469–481.
[2] Zwanenburg\, G. et al. (2011). ANOVA-principal component analysis and ANOVA-simultaneous component analysis: a comparison. Journal of Chemometrics 25: 561–567.
[3] Thiel\, M. et al. (2017). ASCA+ and APCA+: extensions of ASCA and APCA in the analysis of unbalanced multifactorial designs. Journal of Chemometrics 31.
Vincent Bremhorst
Inclusion of time-varying covariates in cure survival models with an application in fertility studies
Cure survival models are used when one wishes to explicitly acknowledge that an unknown proportion of the studied population will never experience the event of interest. In this talk\, we present an extension of the promotion time cure model that directly includes time-varying covariates as regressors when modelling simultaneously the probability and the timing of the monitored event. Our proposal can handle non-monotone population hazard functions without specific parametric assumptions on the baseline hazard. This extension is motivated and illustrated on data from the German Socio-Economic Panel (GSOEP) by studying the transition to second and third births in West Germany.
Joris Chau
Intrinsic 2D wavelet regression for surfaces of Hermitian positive definite matrices
In multivariate nonstationary time series analysis\, a non-degenerate time-varying spectrum is a surface of Hermitian positive definite (HPD) matrices across time and frequency. Preserving these HPD constraints is an important aspect of any spectral estimation procedure\, not only for the interpretation of the estimate as a spectrum\, but also for subsequent inference\, which may require e.g. the inverse spectrum. In this talk\, we outline the construction of genuinely intrinsic 2D wavelet transforms for surfaces in the non-Euclidean space of HPD matrices\, including wavelet coefficient decay and linear wavelet thresholding convergence rates for intrinsically smooth surfaces of HPD matrices. The intrinsic wavelet transforms are computationally fast\, and nonlinear wavelet shrinkage or thresholding captures localized features\, such as peaks or troughs in the matrix-valued surfaces\, while always guaranteeing an HPD spectral estimate. As an illustrative example\, we estimate the multivariate spectrum and coherence of nonstationary multivariate local field potential time series obtained from a rat brain during seizure.
Nicolas Tavernier
Simple nonlinear shrinkage estimators for large-dimensional covariance matrices
An optimal rule is derived for shrinking large-dimensional sample covariance matrices under Frobenius loss. The rule generalizes Ledoit and Wolf's optimal linear shrinkage rule to broader parametric families of rules\, including\, for example\, polynomial and spline rules. The oracle version of the optimal rule is very simple and attains the lower bound on the Frobenius loss in finite samples. A feasible version is derived and approximates the lower bound under large-dimensional asymptotics where p/n → c > 0. In settings that have been studied earlier\, nonlinear shrinkage is found to substantially reduce the Frobenius loss compared to linear shrinkage. Nonlinear shrinkage is conceptually easy\, does not require non-convex optimization in high dimensions\, and allows p > n.
Xavier Piulachs
Joint Models for Longitudinal and Time-to-Event Data with Applications to Health Insurance
Population aging in most industrialized societies has led to a widespread reconsideration of the customer-provider relationship in the health insurance sector. Elderly policyholders need to be provided with fair premiums based on their individual health status\, whereas insurance companies want to plan for the potential costs of lifetimes above mean expectations. In this presentation\, we focus on a large cohort of policyholders in Barcelona (Spain)\, aged above 65 years. A shared-parameter joint model is proposed to analyze the relationship between the annual demand for emergency claims and time-until-death outcomes\, which are subject to left truncation. We compare different functional forms of the association between the two processes and\, furthermore\, illustrate how the fitted model provides time-dynamic predictions of survival probabilities. Parameter estimation is performed under the Bayesian framework using Markov chain Monte Carlo methods.
DTSTAMP:20230330T000000Z
UID:642549014d851
END:VEVENT
END:VCALENDAR