• Media type: E-Book
  • Title: Robust Observation-Driven Models Using Proximal-Parameter Updates
  • Contributor: Lange, Rutger-Jan [Author]; van Os, Bram [Author]; van Dijk, Dick J. C. [Author]
  • Imprint: [S.l.]: SSRN, 2022
  • Extent: 1 online resource (46 p.)
  • Language: English
  • DOI: 10.2139/ssrn.4227958
  • Keywords: Implicit gradient ; Proximal point method ; Robust filters ; Score-driven methods ; Time-varying parameter models
  • Footnote: According to information from SSRN, the original version of the document was created on September 21, 2022
  • Description: We propose a novel observation-driven modeling framework that allows for time variation in the model’s parameters using a proximal-parameter (ProPar) update. The ProPar update is the solution to an optimization problem that maximizes the logarithmic observation density with respect to the parameter, while penalizing the squared distance of the parameter from its one-step-ahead prediction. The associated first-order condition has the form of an implicit stochastic-gradient update; replacing this implicit update with its explicit counterpart yields the popular class of score-driven models. Key advantages of the ProPar setup are stronger invertibility properties (especially under model misspecification) as well as extended (global rather than local) optimality properties. For the class of postulated observation densities whose logarithm is concave, ProPar’s robustness is evident from its (i) muted response to large shocks in endogenous and exogenous variables, (ii) stability under poorly specified learning rates, and (iii) global contractivity towards a pseudo-truth—in all cases, even under model misspecification. We illustrate the general applicability and the practical usefulness of the ProPar framework for time-varying regressions, volatility, and quantiles. (A minimal sketch of the ProPar update follows below.)
  • Access State: Open Access
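
The abstract describes the ProPar update as the maximizer of the log observation density penalized by the squared distance from the one-step-ahead prediction, with the explicit version of the resulting first-order condition recovering score-driven models. The Python sketch below illustrates that contrast for a single time-varying location parameter; the logistic density, the random-walk prediction, the learning rate, and all function names are illustrative assumptions, not the paper's implementation.

    # A minimal, hypothetical sketch of a proximal-parameter (ProPar) update for a
    # single time-varying location parameter mu under a logistic observation
    # density (chosen because its log-density is concave in mu, the class the
    # abstract refers to). Names, the random-walk prediction, the learning rate,
    # and the density choice are illustrative assumptions, not the authors' code.
    import numpy as np
    from scipy.optimize import minimize_scalar
    from scipy.stats import logistic

    def log_density(y, mu, scale=1.0):
        # Log observation density log p(y | mu).
        return logistic.logpdf(y, loc=mu, scale=scale)

    def propar_step(y, mu_pred, learning_rate=0.5):
        # Implicit (proximal) update: maximize log p(y | mu) minus a squared
        # distance penalty from the one-step-ahead prediction mu_pred.
        objective = lambda mu: -(log_density(y, mu)
                                 - (mu - mu_pred) ** 2 / (2.0 * learning_rate))
        return minimize_scalar(objective).x

    def score_step(y, mu_pred, learning_rate=0.5, scale=1.0):
        # Explicit counterpart: one score (gradient) step evaluated at mu_pred,
        # i.e. the score-driven recursion mentioned in the abstract.
        score = np.tanh((y - mu_pred) / (2.0 * scale)) / scale
        return mu_pred + learning_rate * score

    # Compare the two filtered paths on a toy series containing one large shock.
    rng = np.random.default_rng(0)
    y = rng.logistic(loc=0.0, scale=1.0, size=50)
    y[25] += 20.0
    mu_prox = mu_expl = 0.0
    for obs in y:
        mu_prox = propar_step(obs, mu_prox)   # prediction = previous filtered value
        mu_expl = score_step(obs, mu_expl)
    print(mu_prox, mu_expl)

The implicit step is obtained by numerically solving the penalized problem at each observation, whereas the explicit step applies the score once at the prediction; this is the mechanical difference the abstract draws between ProPar and score-driven models.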