Time future contained in time past – QuantMinds 2018 Main Conference Day 3

Day 3 opened with Emanuel Derman presenting “The Past and Future of Quantitative Finance”:

In “Derivatives without Diffusion”, Derman shows how Spinoza defined the emotions as derivatives of the underlying primitive affects of Desire, Pleasure and Pain. The composition of Schadenfreude (literally “harm-joy”) is an example of how these affects can be combined; but most of his definitions are static, with only Vacillation involving volatility. The answer to why Spinoza had no Anxiety in his system (Vacillation between Hope and Fear? Not a Passion? No anxiety in the 17th century?) was left as an exercise to the reader/audience. My take is that increased social mobility has increased both the Hope of progression and the Fear of regression.

In “Diffusion without Derivatives”, physicists are shown to have understood diffusion in the 19th and early 20th centuries but, with the exception of Bachelier, they never extended this understanding to derivatives of the underlying particles.

In “The Idea of Replication”, Derman shows how this strategy opens up new avenues for valuation and risk management:

How to avoid risk?

(i) Dilution: Combine security with a riskless bond

(ii) Diversification: Combine security with many uncorrelated securities

(iii) Hedging: Combine security with a correlated security (see the sketch after this list)
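To make the three routes concrete, here is a toy simulation (my own illustration, not from the talk) where a stock carries a common market risk plus idiosyncratic noise; each route removes a different part of the variance:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

market = rng.normal(0.0, 0.20, n)          # common (correlated) risk factor
stock = market + rng.normal(0.0, 0.10, n)  # stock = market + idiosyncratic noise
print("stock vol:      ", stock.std())          # ~0.224

# (i) Dilution: 50% stock, 50% riskless bond simply scales the risk down
print("diluted vol:    ", (0.5 * stock).std())  # ~0.112

# (ii) Diversification: averaging 100 stocks with independent idiosyncratic
# noise washes out the noise but keeps the common market risk
basket = market + rng.normal(0.0, 0.10, (100, n)).mean(axis=0)
print("diversified vol:", basket.std())         # ~0.20 (market risk remains)

# (iii) Hedging: shorting the correlated market factor removes the common
# risk and leaves only the idiosyncratic part
print("hedged vol:     ", (stock - market).std())  # ~0.10
```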

Then, in the style of Feynman, modern finance is synthesized in one sentence:

“If you can hedge away all correlated risk

And you can then diversify over all uncorrelated risk

Then you should expect only to earn the riskless rate.”

This leads to CAPM: equal (unavoidable) risk expects equal return; but CAPM is not realistic.
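For reference, the standard CAPM statement being compressed here (textbook notation, not from the slides): the expected excess return of a security is proportional to its exposure to the undiversifiable market risk,

```latex
\mathbb{E}[R_i] - r_f = \beta_i \,\bigl(\mathbb{E}[R_m] - r_f\bigr),
\qquad
\beta_i = \frac{\operatorname{Cov}(R_i, R_m)}{\operatorname{Var}(R_m)}
```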

There is a counterfactual illusion of probability:

It’s worth reading Peters and Gell-Mann ( https://arxiv.org/abs/1405.0585 ) and Taleb ( https://medium.com/incerto/the-logic-of-risk-taking-107bf41029d3 ) on this concept (ensemble vs time averages; the sequence of returns matters). This is probably one of the most important topics to study in finance.
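A minimal simulation of the multiplicative bet Peters uses makes the point (numbers are mine, chosen for illustration): the ensemble average grows 5% per round, yet almost every individual player is ruined, because the time-average growth rate is below 1.

```python
import numpy as np

rng = np.random.default_rng(42)
players, rounds = 100_000, 100
# 50/50 bet each round: wealth is multiplied by 1.5 (win) or 0.6 (loss)
factors = rng.choice([1.5, 0.6], size=(players, rounds))
wealth = factors.prod(axis=1)  # final wealth, starting from 1

print("ensemble growth/round:", 0.5 * 1.5 + 0.5 * 0.6)           # 1.05 > 1
print("time-avg growth/round:", np.exp(np.log(factors).mean()))  # ~0.95 < 1
print("median final wealth:  ", np.median(wealth))               # ~0.005: sequence matters
```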

Even though options were being priced and hedged in the 60s, it was only in the 70s that every piece of the puzzle (diffusion + volatility + hedging + replication) was put together in the BSM framework.

A stock is highly correlated with an option written on it; this connection is much stronger (and more reliable) than the statistical connection between two stocks, or between a stock and the market. But volatility itself might be random, so the option becomes a true derivative of the stock only at expiration.

Because hedged option traders were indifferent to the return of the stock, what they were actually forecasting and trading was the future volatility of the stock. But implied volatilities change every day; they are not a consequence of the past. They are raised and twisted and bent by customer demand and risk perception. In trying to extract information from them, the market started calibrating models: fitting parameters to traded prices. Derman highlights the publication of “Standard Deviations of Stock Price Ratios Implied in Option Prices” by Latané and Rendleman (The Journal of Finance, May 1976) as critical in the rise of calibration. And now we get volatility as an asset class, with options as a means to an end.
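The Latané-Rendleman “implied standard deviation” is easy to state in code: find the volatility that makes the BSM formula reproduce a traded price. A minimal sketch (European call, no dividends):

```python
from math import exp, log, sqrt

from scipy.optimize import brentq
from scipy.stats import norm

def bsm_call(S, K, T, r, sigma):
    """Black-Scholes-Merton price of a European call."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm.cdf(d1) - K * exp(-r * T) * norm.cdf(d2)

def implied_vol(price, S, K, T, r):
    """Root-find the volatility that matches the observed option price."""
    return brentq(lambda sigma: bsm_call(S, K, T, r, sigma) - price, 1e-6, 5.0)

print(implied_vol(price=10.45, S=100, K=100, T=1.0, r=0.05))  # ~0.20
```

Calibration in general is this same inversion writ large: instead of one price and one parameter, you fit a model’s parameters to a whole set of traded prices.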

Quantitative finance started to develop by attracting physicists (like Derman himself), but while in physics the future is always a consequence of the present, in finance you calibrate the future using present prices, and then come back to the present to predict other present prices (“And the end of all our exploring / Will be to arrive where we started”).

As markets and models grow more complex, calibration still continues to update unstable parameters at every iteration, as we trade volatility of volatility without a theory (or even a consensus model). As Rebonato pointed out during the event, when Pat Hagan was asked about the fit of SABR to the markets, he said: “In the beginning it was not perfect, but after everyone started using it, it fit perfectly”. That is perhaps the great difference between physics and finance: electrons didn’t change their behaviour when our models of the atom changed.
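For the curious, the formula behind the quote, in its simplest (at-the-money) form: Hagan et al.’s 2002 lognormal SABR expansion, here as a sketch (parameter values are made up):

```python
def sabr_atm_vol(F, T, alpha, beta, rho, nu):
    """Hagan et al. (2002) lognormal SABR implied vol at the money (K = F)."""
    correction = (
        (1 - beta) ** 2 / 24 * alpha**2 / F ** (2 - 2 * beta)
        + rho * beta * nu * alpha / (4 * F ** (1 - beta))
        + (2 - 3 * rho**2) / 24 * nu**2
    )
    return alpha / F ** (1 - beta) * (1 + correction * T)

print(sabr_atm_vol(F=0.03, T=1.0, alpha=0.01, beta=0.5, rho=-0.3, nu=0.4))  # ~0.058
```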

As for the present and future, Derman pointed out the shift from:

(i) sell-side to buy-side

(ii) q quants (structural/derivative) to p quants (econometric/statistical)

He also warned about the risks of wrong models and the lack of reproducibility in the social sciences.

Market microstructure, enabled by the electronification of markets and the availability of high-frequency data, is among the fields where modeling has shifted back to the underlying assets and to statistical approaches.

And now, curiously, a stock is no longer a single entity; if today each food item is seen by some as a basket of nutrients, financial assets like stocks are now traded according to (statistical) factors (see the sketch below), and even who holds them is used as a predictor of returns. Again, the Earth did not change its orbit as we went from epicycles to Kepler to Newton, but today everyone is invited to add their money to levers that will push the market onto a new path that will not resemble the past. As the final slide points out, there are no reliable theorems in finance; it’s not math, it’s the world.
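The factor view mentioned above is, mechanically, just a regression of the stock’s returns on factor returns. A toy sketch with simulated data (real applications would use e.g. market, size, value and momentum series):

```python
import numpy as np

rng = np.random.default_rng(1)
T = 1_000
factor_returns = rng.normal(0.0, 0.01, (T, 3))  # three factor return series
true_loadings = np.array([1.2, 0.5, -0.3])
stock_returns = factor_returns @ true_loadings + rng.normal(0.0, 0.005, T)

# OLS recovers the stock's "nutrient content": its factor loadings
loadings, *_ = np.linalg.lstsq(factor_returns, stock_returns, rcond=None)
print(loadings)  # ~[1.2, 0.5, -0.3]
```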

Quite a long commentary, but Derman’s ideas about the limitations of models are always worth considering, especially as a new generation arrives with powerful tools and a new hope for building a system of the world.

The plenary session was treated to another kind of model reevaluation with the anecdotes of freakyclown, co-founder of Redacted Firm and a social engineer / ethical hacker. Those who have read Bruce Schneier know how he shifted his point of view on security from the technological to the holistic, paying special attention to the human element. In this presentation, freakyclown focused on the physical/human element of security, with an eye-opening collection of incidents that elicited laughs but, hopefully, also reflection. About two years ago I read “A Burglar’s Guide to the City” by Geoff Manaugh, and this presentation should likewise make one incorporate an attacker’s point of view into daily life.

Damiano Brigo and Peter Carr also presented on Day 3, and Alexander Lipton pointed out the way for new financial services companies to disrupt the existing business of banks (e.g. WeChat).

Saeed Amen (Cuemacro) chaired Stream A, where a presentation on machine learning applied to FX trading by Patrik Karlsson drew interesting comments from Santander’s Francesca Fiorilli on how to account for your actions and restrictions. The following presentation (“Implied volatility with bivariate Chebyshev interpolation” by Kathrin Glau) showed interesting results (fast evaluation within a chosen precision) and also benefited from comments (Peter Jaeckel was present). Saeed and his colleague Shih-Hau Tan discussed FX trading (news, TCA and Python libraries – check them out here: https://github.com/cuemacro/finmarketpy – and check Saeed’s comments on the event here: http://www.cuemacro.com/2018/05/21/quantminds-lisbon-2018/ ).
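The idea in Kathrin Glau’s talk, as I understood it: approximate a smooth two-dimensional surface (such as implied vol in strike and maturity) by tensor-product Chebyshev interpolation, getting high accuracy at low evaluation cost. A sketch on a toy function (not her implementation):

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# Chebyshev nodes in [-1, 1] along each axis
n = 16
x = np.cos(np.pi * (np.arange(n) + 0.5) / n)
X, Y = np.meshgrid(x, x, indexing="ij")

def f(x, y):
    """Smooth toy function standing in for an implied-vol surface."""
    return np.exp(-(x**2 + y**2)) * np.sin(3 * x + y)

# Fit tensor-product Chebyshev coefficients on the node grid
V = C.chebvander2d(X.ravel(), Y.ravel(), [n - 1, n - 1])
coef = np.linalg.lstsq(V, f(X, Y).ravel(), rcond=None)[0].reshape(n, n)

# Evaluate the interpolant off-grid and check the error
pts = np.random.default_rng(3).uniform(-1, 1, (100, 2))
err = np.abs(C.chebval2d(pts[:, 0], pts[:, 1], coef) - f(pts[:, 0], pts[:, 1]))
print("max abs error:", err.max())  # tiny: spectral accuracy for smooth functions
```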

An appropriate closure came with “Learning Rough Volatility” by Aitor Gonzalez. He finds that the most convincing argument for rough volatility is its forecasting power (based on past realised volatility), and that by combining the historical data with the market’s variance swap quotes we can infer the market risk premium. Adding an exponential term allows for mean reversion, and a quite good parametrization (5 factors) of the whole surface, with an efficient simulation leading to fast calibration.
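For context, the basic roughness diagnostic behind this line of work (following Gatheral-Jaisson-Rosenbaum, not the talk’s own code): the q-th moment of log-volatility increments scales as lag^(qH), so a log-log regression estimates the Hurst exponent H (around 0.1 in real data). A sketch that checks the estimator on a simulated path with known H:

```python
import numpy as np

def fgn(n, H, rng):
    """Fractional Gaussian noise via Cholesky of its covariance matrix."""
    lags = np.abs(np.subtract.outer(np.arange(n), np.arange(n))).astype(float)
    cov = 0.5 * ((lags + 1) ** (2 * H) + np.abs(lags - 1) ** (2 * H) - 2 * lags ** (2 * H))
    return np.linalg.cholesky(cov) @ rng.standard_normal(n)

rng = np.random.default_rng(7)
log_vol = np.cumsum(fgn(1_000, H=0.1, rng=rng))  # rough log-volatility path

# Second moment of increments vs lag: slope of the log-log fit is 2H
lags = np.array([1, 2, 4, 8, 16])
m2 = [np.mean((log_vol[lag:] - log_vol[:-lag]) ** 2) for lag in lags]
slope, _ = np.polyfit(np.log(lags), np.log(m2), 1)
print("estimated H:", slope / 2)  # ~0.1
```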

Overall, rough volatility seems like a promising way to find the time future contained in time past.

And that concludes my analysis of the event (I was not there for the seminars).

As always, any errors in the interpretation of the presentations/papers are the responsibility of Marcos.