What would you be reading if you were interested in volatility modeling?
In the 90s one would read the papers from Derman and Dupire.
In the 00s books like Gatheral’s “The Volatility Surface” and Rebonato’s “Volatility and Correlation: The Perfect Hedger and the Fox” were mandatory.
In this decade one must know the Bergomi-Guyon expansion ( https://papers.ssrn.com/sol3/papers.cfm?abstract_id=1967470 ) and read Bergomi’s “Stochastic Volatility Modeling” for volatility surface modeling, and the recent developments in rough volatility ( the literature is collected here https://sites.google.com/site/roughvol/home ).
All of the researchers above were here in Lisbon, and the opportunity of being able to discuss and learn from them is what makes this event special.
And one talk in particular, Jim Gatheral’s “Diamonds: A quant’s best friend”, was a great way to learn what’s happening in the field.
With 5 tracks it’s hard to attend every interesting talk, and I spent most of the day on volatility modeling.
While John Hull offered three examples of machine learning applied to finance (clustering to classify country risk, neural networks to model movements of the implied volatility surface, and the different methodologies available for credit risk classification), Rebonato described an adaptation of Heston’s model for FX options that keeps the same set of parameters over time while providing useful information about the market price of volatility risk. This is interesting because, if such a model is a good enough approximation of the co-movements of the surface as a whole (with market implied volatilities presenting a greater volatility of volatility than their actuarial counterparts), it makes a useful risk management tool.
Jessica James discussed the changes in borrowing, lending and hedging; the basis is not only a reality for quants, but it’s volatile enough to be monitored by the FT: https://ftalphaville.ft.com/2018/03/23/1521832181000/Cross-currency-basis-feels-the-BEAT/ . With her experience in interest rates and FX modeling, Jessica showed why some apparent opportunities can persist: not everything can be traded in a way that enables these funding-related spreads to be captured.
Pierre Henry-Labordère provided a solid theoretical background for the application of neural networks as universal approximators, with applications to stochastic control problems and examples from the Uncertain Volatility Model, CVA and Initial Margin calculations.
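As a throwaway illustration of the universal approximation property (the function, network size and library below are my choices, nothing to do with the architectures in the talk), a single hidden layer already fits a smooth nonlinear function well:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Toy universal-approximation demo: one hidden layer fitting a smooth
# nonlinear target; everything here is illustrative, not from the talk.
x = np.linspace(-3, 3, 400).reshape(-1, 1)
y = np.sin(2 * x).ravel() + 0.1 * x.ravel() ** 2

net = MLPRegressor(hidden_layer_sizes=(64,), activation="tanh",
                   max_iter=5000, random_state=1)
net.fit(x, y)
print("max abs error:", np.abs(net.predict(x) - y).max())
```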
Luca Capriotti discussed advanced AAD applications for PDE and Monte Carlo pricing; it was not even 10am, and the number of highly regarded experts who had already spoken gives you an idea of the level of this conference.
Jim Gatheral starts from the (exact) Itô Decomposition Formula of Alòs, which decomposes option prices as the sum of (i) the classical Black–Scholes formula with volatility parameter equal to the root-mean-square future average volatility, (ii) a term due to correlation and (iii) a term due to the volatility of volatility.
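For reference, writing $BS(x, w)$ for the Black–Scholes value as a function of log-spot $x$ and total variance $w$, $w_t = \int_t^T \mathbb{E}_t[v_s]\,ds$ for the expected total variance and $M_s = \mathbb{E}_s[\int_0^T v_u\,du]$ for its martingale, the decomposition has the schematic form (my notation; see the paper for the exact statement):

$$\mathbb{E}_t\big[H(X_T)\big] \;=\; \underbrace{BS(X_t, w_t)}_{\text{RMS future vol}} \;+\; \underbrace{\mathbb{E}_t\!\int_t^T \partial_x \partial_w BS\, d\langle X, M\rangle_s}_{\text{correlation}} \;+\; \underbrace{\tfrac{1}{2}\,\mathbb{E}_t\!\int_t^T \partial_w^2 BS\, d\langle M, M\rangle_s}_{\text{vol-of-vol}}$$

The drift terms cancel because $BS$ satisfies the total-variance form of the Black–Scholes equation, $\partial_w BS = \tfrac12(\partial_x^2 - \partial_x) BS$.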
By freezing the derivatives in the Alòs formula, the expected value of a derivative contract can be expressed as an exact expansion of covariance functionals; the diamond notation is used to simplify the expression of these functionals.
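The diamond product itself is just the conditionally expected covariation still to accrue between two continuous semimartingales (if I transcribed the definition correctly):

$$(A \diamond B)_t(T) \;=\; \mathbb{E}_t\big[\langle A, B\rangle_T - \langle A, B\rangle_t\big]$$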
The relationship between these autocovariance functionals (expressed directly from the models written in forward-variance form; e.g. Bergomi-Guyon) and the covariances of terminal quantities (easily computed from simulations) allows for an easy calibration of the models.
The stochasticity ξt(T) is then defined as the difference between the conditional variance of the log of the asset price at T and the value of the static hedge portfolio (the log-strip) of a variance swap (a tradable asset); it can be estimated from the volatility surface (a suitable parametrization that allows for sensible interpolations and extrapolations is still needed, but an expression that depends on the Black–Scholes d2 formula is available). But this function is also equal to a simple expression with two diamond operators, and because these are functions of the model parameters, the model can be calibrated directly to tradable assets without having to rely on expansions.
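As an example of how such expressions arise, splitting $X_T$ into drift and martingale parts (my own back-of-the-envelope reconstruction from the definitions above, so treat the constants with care) gives the stochasticity as exactly two diamond terms:

$$\mathrm{Var}_t(X_T) \;-\; w_t(T) \;=\; \tfrac14\,(M \diamond M)_t(T) \;-\; (X \diamond M)_t(T)$$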
Trees are then used to help compute the diamond terms (indexing the terms and determining when a factor of 1/2 appears); they are collected into forests, leading up to the Exponential Theorem, which expresses the expected value at the terminal time T of H (a solution of the Black–Scholes equation) as the exponential of the (infinite) sum of the forests applied to the value of H today.
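Schematically, and with the caveat that this is my shorthand for the statement in the paper:

$$\mathbb{E}_t\big[H_T\big] \;=\; e^{\,\mathbb{F}_1 + \mathbb{F}_2 + \mathbb{F}_3 + \cdots}\, H_t$$

where each forest $\mathbb{F}_k$ is a finite sum of trees (nested diamond products of $X$ and $M$, acting on $H$); truncating the exponent at a given order recovers expansions like Bergomi-Guyon.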
An important result is that, writing a stochastic volatility model in forward-variance form and using its characteristic function as H, its moments can be calculated regardless of the existence of a closed-form expression for the characteristic function.
The particular cases of the Bergomi-Guyon and rough Heston models are then studied, leading to a closed-form expression for the leverage swap in the rough Heston model and allowing for a fast calibration of the model; the leverage swap is the difference between a gamma swap and a variance swap, and a gamma swap is a weighted variance swap, with weight given by the underlying level (normalized by the initial underlying level).
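In fair-strike terms, with $S_0$ the initial underlying level:

$$\mathcal{G}_T = \frac{1}{T}\,\mathbb{E}\!\left[\int_0^T \frac{S_t}{S_0}\, v_t\, dt\right], \qquad \mathcal{V}_T = \frac{1}{T}\,\mathbb{E}\!\left[\int_0^T v_t\, dt\right], \qquad \mathcal{L}_T = \mathcal{G}_T - \mathcal{V}_T$$

Both legs can be replicated with strips of vanilla options, which is why a closed-form expression for $\mathcal{L}_T$ translates into calibration directly against quoted surfaces.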
The paper is available at https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2983180 and it is a must read.
Marcos López de Prado presented “Advances in financial machine learning”; the gist of it is also available in his recent book, and the traps in strategy selection are worth going over, even if (like Michael Harris) you think that his methodology might bury strategies that are interesting.
Marco Bianchetti and Marco Scaringi used Genetic Algorithms as an optimization metaheuristic in “Learning the Optimal Risk”, and were followed by Tore Opsahl (“Forecasting loan utilization using neural networks”).
While Lorenzo Bergomi continued his theta analysis for the case of barrier options, Peter Jaeckel presented interesting results on the use of MultiQuadric methods in “Cluster Induction”; if Peter says something about numerical methods, one should listen.
Massimo Morini presented an overview of blockchain technology and discussed its practical usage in derivatives collateralization, the differences between private and public blockchains, and many other related topics. He’s been among the first to discuss DLT at the Global Derivatives events.
I really enjoyed Jesper Andreasen’s “Tough vol” presentation. After watching the 5 rough volatility presentations last year, he embarked on a journey to understand it better, going over: (i) fractional Brownian motion, rough paths, the Hurst parameter and connections to interest rate models; (ii) rough volatility and its motivation; (iii) asymptotic results; (iv) Hawkes processes and market microstructure; (v) numerical implementation.
On the motivation for rough volatility, he listed the empirical observations consistent with rough volatility and touched on a favourite topic of mine: the min variance delta, showing how rough volatility models imply a min variance delta closer to the empirical one than the standard diffusive models.
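In the usual regression form (the textbook definition, not something specific to the talk), the min variance delta tilts the Black–Scholes delta by the vega times the spot-vol covariation:

$$\Delta_{\mathrm{mv}} \;=\; \Delta_{\mathrm{BS}} \;+\; \mathcal{V}_{\mathrm{BS}}\,\frac{\mathrm{Cov}(dS,\, d\sigma_{\mathrm{imp}})}{\mathrm{Var}(dS)}$$

so a model’s min variance delta is only as good as the spot-vol dynamics it implies.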
Expanding both price and implied volatility (using Alòs), he gets the ATM limits for the relevant variables, including the expected result for the time decay of the skew (T raised to H − 1/2), slower convergence for ATM deltas and digitals, and flattening of the risk reversal for short maturities.
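That is, the ATM skew follows the rough volatility power law

$$\psi(\tau) := \big|\partial_k\, \sigma_{\mathrm{BS}}(k, \tau)\big|_{k=0} \;\sim\; \tau^{\,H - 1/2}, \qquad \tau \to 0,$$

which for $H \approx 0.1$ reproduces the explosive short-end skew observed in equity markets.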
After that he showed how Hawkes processes (self-exciting point processes) give rise to the behavior described in the motivation, even without an exogenous stochastic volatility factor, and how they might be a better approach for simulation than simulating a fractional Brownian motion.
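For concreteness, here is a minimal, generic sketch (my own, not code from the talk) of simulating a Hawkes process with exponential kernel by Ogata’s thinning algorithm; all parameters are illustrative:

```python
import numpy as np

# Hawkes intensity: lambda(t) = mu + sum over past events of
# alpha * exp(-beta * (t - t_i)), with alpha/beta < 1 for stationarity.
def simulate_hawkes(mu=1.0, alpha=0.8, beta=1.2, T=100.0, seed=0):
    rng = np.random.default_rng(seed)
    events, t = [], 0.0
    while True:
        # Between events the intensity only decays, so its current value
        # is a valid upper bound for the thinning step.
        lam_bar = mu + sum(alpha * np.exp(-beta * (t - s)) for s in events)
        t += rng.exponential(1.0 / lam_bar)          # candidate arrival
        if t >= T:
            return np.array(events)
        lam_t = mu + sum(alpha * np.exp(-beta * (t - s)) for s in events)
        if rng.uniform() < lam_t / lam_bar:          # accept with prob lambda(t)/lam_bar
            events.append(t)

times = simulate_hawkes()
# Self-excitation boosts the count: E[N_T] = mu*T/(1 - alpha/beta) = 300 here,
# versus 100 for a plain Poisson(mu) process on the same horizon.
print(len(times), "events")
```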
It is very interesting to see how an experienced practitioner approaches and learns a new topic; interesting enough that I skipped Google’s Tyler Ward’s presentation on regression and information criteria, which covered the problems of applying the common regularization methods without taking into account the increase in entropy error.
Stream C had Michael Pykhtin and Diana Iercosan, both from the Fed, discussing the Basel CVA framework and the Volcker rule respectively, followed by Vincent Spain from the National Bank of Belgium.
Mikko Pakkanen discussed rough volatility and its application to FX markets. Here the main problem with the rough Bergomi model is fitting the skews at large expiries; a model that decouples skew and smile is necessary, implemented using Monte Carlo with a hybrid scheme (exact for the first slices of the approximation of the process). The Monte Carlo approach can be made faster by the authors’ variance reduction techniques (“Turbocharging Monte Carlo pricing for the rough Bergomi model”, available here: https://arxiv.org/abs/1708.02563 and with code here: https://github.com/ryanmccrickerd/rough_bergomi ) or by machine learning; this last approach can be controlled in a Python notebook using widgets.
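To make the discretization issue concrete, here is a deliberately crude sketch (mine, with made-up parameters) of the rough Bergomi variance process using a plain left-point Riemann sum; the hybrid scheme instead treats the slices near the kernel singularity exactly, and the paper above adds variance reduction on top:

```python
import numpy as np

H, eta, xi0 = 0.10, 1.9, 0.04          # Hurst, vol-of-vol, flat forward variance
T, n, paths = 1.0, 256, 10000
dt = T / n
t = dt * np.arange(1, n + 1)

rng = np.random.default_rng(42)
dW = rng.standard_normal((paths, n)) * np.sqrt(dt)

# Volterra process W~_t = sqrt(2H) * int_0^t (t-s)^(H-1/2) dW_s, discretized
# with a lower-triangular kernel matrix (biased near the singularity at s = t).
lag = np.arange(n)[:, None] - np.arange(n)[None, :]
K = np.tril(np.sqrt(2 * H) * ((np.clip(lag, 0, None) + 1) * dt) ** (H - 0.5))
W_tilde = dW @ K.T

# v_t = xi0 * exp(eta * W~_t - eta^2/2 * t^(2H)), so E[v_t] = xi0 exactly in
# continuous time (Var(W~_t) = t^(2H)).
v = xi0 * np.exp(eta * W_tilde - 0.5 * eta**2 * t ** (2 * H))

# Sanity check: variance swap fair strike ~ xi0, up to discretization bias.
print("E[(1/T) int_0^T v_t dt] =", v.mean(axis=0).sum() * dt / T, "vs xi0 =", xi0)
```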
At the same time, Wim Schoutens presented on derivative pricing by machine learning (please read his post here: https://knect365.com/quantminds/article/04976242-e05d-42f9-983f-09f1e78e3e82/quant-vs-machine-derivative-pricing-by-machine-learning ).
After lunch, the volatility modeling stream continued with Peter Friz on Stepping Stochvol, introducing a singular parametric Local Stochastic Volatility model (modifying Heston to allow extreme skews at the short end), with an easy implementation, followed by Laura Ballotta (“Volatility by Jumps”).
Among the other presentations covering machine learning:
(i) “Learning curve dynamics with Artificial Neural Networks” by Alexei Kondratyev was interesting (especially the detection of abnormal curve shapes with autoencoders; see the toy sketch after this list)
(ii) “Machine learning and complex networks for stock market research” by Juho Kanniainen introduced the Big Data Finance project ( http://bigdatafinance.eu/ ), and “Predicting Jump Arrivals in Stock Prices Using Neural Networks with Limit Order Book Data” ( https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3165408 ) looks interesting
(iii) “From artificial intelligence to machine learning, from logic to probability” by Paul Bilokon discussed the origins of AI and machine learning, presented the frequentist and Bayesian approaches to probability, and introduced domain theory as a necessary tool in the construction of partial stochastic processes (there’s much more; it’s worth signing up for the Thalesians presentations).
(iv) “Deep Hedging” by Lukas Gonon: a very interesting study on learning hedging strategies under transaction costs, including LSTMs for barrier options ( https://arxiv.org/pdf/1802.03042.pdf )
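On item (i), a toy sketch of the autoencoder idea (entirely my own construction on synthetic curves; the architecture and threshold are arbitrary): train the network to reproduce historical curve snapshots, then flag shapes it reconstructs poorly:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
tenors = np.linspace(0.25, 30, 12)

# Synthetic "normal" curves: smooth level/slope moves with tiny noise,
# standing in for real historical yield-curve snapshots.
n = 2000
level = rng.normal(0.02, 0.005, (n, 1))
slope = rng.normal(0.01, 0.003, (n, 1))
curves = level + slope * (1 - np.exp(-tenors / 5)) + rng.normal(0, 1e-4, (n, 12))

# Autoencoder via a bottlenecked MLP trained to reproduce its input.
ae = MLPRegressor(hidden_layer_sizes=(8, 3, 8), max_iter=2000, random_state=0)
ae.fit(curves, curves)

recon_err = ((ae.predict(curves) - curves) ** 2).mean(axis=1)
threshold = np.quantile(recon_err, 0.99)     # flag the worst 1% as abnormal

kinked = curves[0].copy()
kinked[5] += 0.01                            # inject an implausible kink at one tenor
err = ((ae.predict(kinked[None, :]) - kinked) ** 2).mean()
print(f"kinked-curve error {err:.2e} vs threshold {threshold:.2e}")
```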
The day closed with four luminaries: Bruno Dupire with an overview of his work on volatility modeling, Damiano Brigo on risk measures for rogue traders (check his podcast here: https://www.risk.net/comment/5389041/podcast-damiano-brigo-on-derivatives-ai-machine-learning-and-more ), Fabio Mercurio on the application of the Itô chaos expansion to GARCH processes, and Julien Guyon on the joint calibration of SPX and VIX options.
I missed the hacker’s presentation from Day 2, so I’ll discuss it on Day 3.
As always, any errors in the interpretation of the presentations/papers are the responsibility of Marcos. The delay is due to the sheer amount of information presented on the day, and the time needed to do justice to Jim’s presentation, which I think was the main presentation of the event if you’re interested in volatility modeling.