Wednesday, December 16, 2020

Global currency lifetime (or on probability of USD replacement)

Introduction

The idea for this analysis emerged from the statement that the currencies used in the global economy are replaced periodically. Such a statement is usually accompanied by a graph of the lifetimes of these global currencies, like the one from the "Medium" article "On Free Markets For Money". Hence, it is interesting to estimate the lifetime of global currencies and the associated probabilities of their survival.

Such a statement has many practical applications. One of them relates to the discounting process, where the USD overnight FED rate is often referred to as the so-called "risk-free rate". It is an abstraction from quantitative finance that is used to price derivative contracts. Obviously, if there is a risk of USD losing its status as the top-liquidity currency, then it should lose its "risk-free" label too. This risk has to contribute to a spread on top of the "risk-free" discounting rate.

The memo is laid out in a discussion manner, with the aim to demonstrate the 'loose ends' of the arguments and to stimulate further counter-arguments and developments.

Data

First, the data are taken from the plot within the mentioned article, typed in, and plotted for cross-checking. A few remarks are important, though:

  • the definition of a global currency as the 'legal tender' used to set the notional of international trading contracts implies that the associated volume should be traceable through the entire historical period in question. No such data exist. Hence, the 'global currency' classification is somewhat weak;
  • the dates of the start and end of the periods are not rigorously re-checked; however, they make some sense:
    1. The Portuguese Empire began to expand around 1450 after a series of great discoveries (new sea routes) and the establishment of new colonies;
    2. The Spanish Empire lagged due to the necessity to liberate the Iberian peninsula from Islamic rule. Once that was completed, Spain conquered Portugal and the two formed the Iberian Union. The year 1530 is set to the crowning of Charles V as Emperor, who envisioned 'the empire on which the Sun never sets';
    3. The start of the 'Dutch period' is set to 1640, after the Holland states liberated themselves and the 'Vereenigde Oostindische Compagnie' (VOC) established a monopoly over trade with Japan. The development of science and technology (e.g. water pumping, ship-building, globe and map making), the arts, and a great ability to negotiate special relationships added to the significant growth of its trading networks and wealth;
    4. The second Spanish period (data point 'spanish2') was triggered by pan-European wars (Spain against the France-England union and Russia against Sweden), which lasted until 1720. Despite these events, Spain became the top economy with the largest network of colonies;
    5. At the end of this war France reached the same level as Spain. In fact, during this period French philosophers ignited and led the 'Age of Enlightenment', which later led to the rise of a new class, the French Revolution, and the industrialization of England. French became the language of the elites due to the large trading network and also to the great contribution of philosophers, the first Encyclopedia, and scientific research;
    6. Britain gained its supremacy after the self-destructive French Revolution, the subsequent Napoleonic wars, and Napoleon's defeat at Waterloo. Before that, Britain had prepared itself through industrialization, trade, and the expansion of colonies. Despite the loss of the United States, it strengthened its power in every other colony. This stimulated the development of science and made English a global language;
    7. The American success became possible due to the weakening of the European powers during the First World War, the development of financial technologies, technological advantage, and the special relationship with Britain. The English language shared with the former metropole contributed to America's further cultural, scientific, and technological expansion. The end of the First World War and the peaceful settlements in Europe and the Near East happened around 1920; they were negotiated mostly with the participation of the United States of America. At that time, a flexible financial system and investment technology powered by stock markets became an important driver of a successful economy. Due to the decline and weakening of the European financial system, the American dollar took over and started its current period of reign.

The main drawback of this division is that it is purely subjective and is not supported by a historical analysis of the relative sizes of competing economies. Nevertheless, let’s analyze the sample of 7 points:

\{T_i, \Delta T_i\}, i=1...7

where T_i is the beginning of a period and \Delta T_i is the length of that period.



Assumptions

A few assumptions are used.

Assumption-1 : Censoring data. Select 'modern times'

Data before ~1500 ('portuguese') are cut off. The reason for doing this is two-fold:

  • the regimes of global currency change are visibly different. If ancient times are retained, then a two-regime model has to be invented, which is difficult to validate. The reason for this regime change is the beginning of the 'Age of Great Discoveries', when the discovery of the Americas by Christopher Columbus suddenly opened up seemingly unlimited resources. This also led to social changes through a series of revolutions and to a change of economic conditions;
  • too little is known about alternative ancient monetary systems, such as those of Persia, China or India. For example, by all measures the size of the Chinese economy was quite significant even in Roman times, yet communication between these economies was near zero.

An additional argument for this data censoring is that the presented analysis is very simplistic and is based on only 7 data points. A more complex analysis could be validated by including more data about the status of the global economies participating in inter-state competition at those times.

Assumption-2 : Independence of periods

The periods of global currencies are chained to each other (with some exceptions). This is a result of global competition, where contenders work hard to get to the top of the global scene and where the current 'ruling currency' weakens due to the decay of passionary forces within its state. States rise and fall in a regular pattern, just like human beings. This phenomenon deserves its own study; however, in this analysis we assume that all 'periods' are independent of each other.

Assumption-3 : Same underlying process

This whole analysis is driven by a simple fact: two different economies, such as 16th-century Spain and 19th-century Britain, were able to hold global leadership over roughly the same period of time, 110 and 105 years, respectively. The 7 data points range from 80 to 110 years. That is actually an amazing fact. Perhaps these data points share the same dynamics, such as the geopolitical competition of economies. In this short memo it is impossible to investigate whether this is true or not. Therefore, another assumption is that there is indeed a shared underlying process which drives the change of leaders with a certain regularity. The discussion about the nature of this process is left outside of this memo.

Today, after a period of relative stability, the global economy once again experiences drastic changes. As mentioned at the beginning, these changes are perceived as possible evidence of another global currency change, from USD to something else like the Yuan, the SDR (Special Drawing Rights) or maybe even Bitcoin. However, the 'reigning period' of USD is still ongoing. Assuming that the switch is inevitable, let’s estimate the switch time from the available data. The analysis is done using a few methods.


Analysis

Data analysis is performed with two basic methods:

  • a polynomial regression model;
  • survival analysis, which estimates the survival process and applies it to the current period.

As a 'spoiler', due to the nature of the data used, the first method turns out to be less correct. The information about a period proxies the lifetime over which an economy remains competitive, which is related to a survival process. The change of leader is always accompanied by a challenge coming from contenders. It is a reflection of two processes: the gradual decline and self-destruction of the leader, and the rise of a contender due to its ambition and hunger for power. However, the switch does not happen immediately once the leader becomes weaker than the upcoming contender: the inertia of other countries that use the currency for their international trade has a certain prolongation effect.

1. Polynomial regression models

Regressions with 1st and 2nd order polynomials are used:

\Delta T(\vec{\alpha},T_i) = \alpha_0 + \alpha_1\cdot T_i + \alpha_2 \cdot T^2_i

The objective function is minimized to obtain the optimal \tilde{\vec{\alpha}} with equal errors (weights):

Q(\tilde{\vec{\alpha}}) = \min_{\vec{\alpha}} \sum_{i=1}^{N} \left(\Delta T_i - \Delta T(\vec{\alpha}, T_i)\right)^2

We can also assume that the errors are normally distributed (in the econometric fashion). Hence, the projected distribution of the current lifetime, \Delta T_{us}, is:

F(\Delta T_{us}) = \frac{1}{\sigma_{us}\sqrt{2\pi}} \int_{-\infty}^{\Delta T_{us}} e^{-\frac{(\tau-\mu_{us})^2}{2\sigma_{us}^2}} d\tau

where \mu_{us} = \Delta T_{us}(\tilde{\vec{\alpha}}) and \sigma_{us}^2 = Q(\tilde{\vec{\alpha}})/N

Note: the constant model corresponds to the 0th order polynomial; its optimum is simply the sample mean, so no minimization procedure is involved.

The results of this model are summarized in the plots below. The survival probability is estimated as:

SP(1920+\tilde{\Delta T_{us}}) = 1 - F(\tilde{\Delta T_{us}})

The table with the results is shown at the end.
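As a minimal sketch of this procedure (the least-squares fit of \Delta T on T and the Normal survival probability), assuming placeholder data, since the original points are only available as a plot:

```python
# Illustrative sketch of the polynomial-regression method described above.
# The exact (T_i, dT_i) values are read off a plot in the cited article,
# so the numbers below are placeholders, not the original data.
import math
import numpy as np

T  = np.array([1530.0, 1640.0, 1720.0, 1810.0, 1920.0])  # period starts (placeholders)
dT = np.array([110.0,   80.0,   90.0,  105.0,  100.0])   # period lengths (placeholders)

def fit_and_project(degree, t_new=1920.0):
    """Least-squares polynomial fit of dT on T; project the lifetime at t_new."""
    coeffs = np.polyfit(T, dT, degree)                     # minimizes Q(alpha)
    mu = float(np.polyval(coeffs, t_new))                  # projected mean lifetime
    resid = dT - np.polyval(coeffs, T)
    sigma = math.sqrt(float(np.sum(resid ** 2)) / len(T))  # sigma^2 = Q / N
    return mu, sigma

def survival_prob(year, mu, sigma, start=1920.0):
    """SP(year) = 1 - F(year - start) under the Normal error assumption."""
    z = (year - start - mu) / (sigma * math.sqrt(2.0))
    return 0.5 * (1.0 - math.erf(z))                       # 1 - Normal CDF

mu, sigma = fit_and_project(degree=1)
print(round(survival_prob(2030.0, mu, sigma), 3))
```

With the real data points typed in, the same two functions reproduce the Constant/Linear/Parabolic rows of the summary table.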

Discussion

As mentioned above, the main determinant of the moment of a global currency change is when its strength and the trust of others fall below some threshold, causing the switch. So the dynamics are shaped by the interaction of global economic and geopolitical players through their competition. The leader gains its status as the most competitive state in terms of advances in politics, state governance, economic policies, access to resources (commodities and human capital), science, technology, etc. However, because of internal processes similar to those of a living organism, the decline of its internal stability is inevitable. Coupled with global competition, where everyone competes with everyone by making temporary coalitions, and with the pressure from 'younger' contenders, the rotation of the leader follows the same pattern, perhaps with the same replacement mechanism.

These interaction dynamics should be understood through a detailed (structural) analysis of all endogenous variables (and maybe more), but they can also be analyzed through observation of external variables, as we do now.

This discussion leads to the conclusion that regressing on some threshold values is not a good approach. The better alternative is survival (also called time-to-hit) analysis. This kind of analysis is widely used in biology, medicine, event history (like this one), and engineering. In finance and economics it is used in insurance, credit, trading, etc.

2. Survival analysis

In this analysis we aggregate all data into a survival probability distribution, \textit{SP}(\Delta T_i). To do this we ignore the times T_i and use only the \Delta T_i. \textit{SP} can be measured in many ways; the most simple and widely used empirical indicator is the Kaplan-Meier estimator, adapted to our data, which are sorted in increasing order:

\textit{SP}(\Delta t) = \prod_{i : \Delta T_i \le \Delta t} \left(1 - \frac{d_i}{n_i}\right)

where n is the total sample size, n_i is the number of periods with length \ge \Delta T_i (i.e. still 'surviving' just before \Delta T_i), and d_i is the number of periods ending exactly at \Delta T_i.

The empirical \textit{SP} relates to the cumulative distribution function via \textit{SP} = 1 - F. Here, instead of estimating the 'mean' and 'std' of the sample, we build the empirical \textit{SP} and then estimate the distribution through an optimization (fitting) procedure. As in the previous section, the Normal distribution is chosen.

Below, the K-M estimator is plotted for each point as \{\Delta T_i + 1920\} ("Data"), with the fitted Normal \textit{SP}-function on top ("Model").

Figure: empirical SP with the Kaplan-Meier estimator and the fitted Normal SP as a model.

Note that at 80 years we have 2 data points ('portuguese' and 'dutch').
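A minimal sketch of this estimator, with tied durations handled via the product-limit step. The text fixes only 110 ('spanish'), 105 ('british') and the two ties at 80; the remaining durations are illustrative placeholders within the stated 80-110 year range:

```python
# Kaplan-Meier product-limit estimator for an uncensored sample with ties.
def kaplan_meier(durations):
    """Return a list of (t, SP(t)) points over the sorted event times."""
    times = sorted(set(durations))
    at_risk = len(durations)            # periods still 'surviving' before t
    sp, curve = 1.0, []
    for t in times:
        d = durations.count(t)          # events (periods ending) at t
        sp *= 1.0 - d / at_risk         # product-limit step (1 - d_i/n_i)
        at_risk -= d
        curve.append((t, sp))
    return curve

# Durations in years; only 80, 80, 105 and 110 are fixed by the text,
# the remaining three are illustrative placeholders.
durations = [80, 80, 85, 90, 95, 105, 110]
for t, sp in kaplan_meier(durations):
    print(t, round(sp, 3))
```

Fitting a Normal SP-function to this step curve then gives the "SP: K-M with Normal" row of the table.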

Table. Summary of the expected \mu_{us} = 1920+\Delta T_{us}, the respective standard deviation \sigma_{us}, and the survival probabilities at 2020, 2030 and 2040.

Model               | \mu_{us}, year | \sigma_{us}, year | SP(2020), % | SP(2030), % | SP(2040), %
--------------------|----------------|-------------------|-------------|-------------|------------
Constant            | 2015           | 12                | 30.2        | 8.3         | 1.5
Linear              | 2025           | 11.5              | 64.1        | 30.6        | 9.9
Parabolic           | 2028           | 12                | 74.0        | 41.2        | 15.8
SP: K-M with Normal | 2002           | 14.9              | 11.9        | 3.2         | 0.6

Conclusions

Only a few conclusions can be drawn:

  • the assumption of a persistent underlying process of currency switches is weak; more studies are needed;
  • the input data used in this analysis, although they make sense, have not been properly validated even at the expert level, which would be interesting to see;
  • the regression model is not fit for the analysis of lifetime data, while survival analysis is more appropriate;
  • the data points are few; therefore, the statistical precision of the obtained distribution is, of course, lacking;
  • the KM+Normal model expects the switch away from USD very soon, with high probability even within the next 10 years or so; in other words, "the game is not yet over".

Thursday, March 26, 2020

Validation items of Algorithmic trading

Given the interest in algorithmic trading validation coming from regulators, here is a (non-exhaustive) list of items to be checked when this activity is validated.

List of input requirements:
PRA - Consultation Paper | CP5/18 , "Algorithmic trading". February 2018
PRA - Supervisory Statement 5/18 , Algorithmic trading
MiFID II - Algorithmic trading - AFS summary
FCA : Algorithmic and High Frequency Trading (HFT) Requirements

Acronyms and terminology:
MTF - Multilateral Trading Facility
OTF - Organized Trading Facility
HFT - High Frequency Trading
DMA/DEA - Direct Market Access / Direct Electronic Access
PnL - Profit and Loss
Trader - owner of trading system
Algorithm - model interpreting the market and own performance
System - embeds market, own portfolio views, algorithm and order execution module

Validation of algorithmic trading is an exercise to prove the integrity of the following items:
  • Stress resiliency of the algorithm (protection against singularities or internal self-generating order flow) with respect to different scenarios:
    • simulation of quote flooding
    • physical disconnection
      • develop sequence of actions
        • check risk reports
        • focus on biggest risks 
          • close position or
          • hedge
        • turn to alternative connections, brokers
  • Compensating controls
    • Pre-trade control includes estimations of:
      • profit margin (alternatively distribution of)
      • fee
      • risk margin charged by exchange
    • Kill-switch ('Red button') is a combination of 
      • hardware based blocks (circuit breaker disconnects from the market)
      • software based blocks (control over single order gate, like order flow internalizer)
      • foreseen implementation of post-disaster scenarios, like 
        • shut down all orders outflow
        • recall/cancel all orders outstanding
        • neutralize unnecessary exposure (to the best knowledge of the system)
      • develop and test the system against such post-disaster scenarios:
        • What if the whole market goes UP or DOWN, 
        • All orders move away from the mid-price
        • etc.
  • Backtesting algorithm:
    • portfolio (PnL) performance
      • price prediction
      • order flow prediction vs realized LOB dynamics
      • order execution vs market reaction 
    • system performance due to:
      • latency
      • memory capacity
      • internal responsiveness
      • algorithm performance during periods of stress
        • requires development of time/flow-pressure simulator
        • algorithm may work differently under sequential tick injection and under time pressure, if/when speed has priority over smart result
  • Feed integrity (completeness and reliability of available information)
    • how many messages are lost
    • how does it impact the decision flow
  • Risk controls:
    • realtime control of own view vs exchange view over
      • order mismatch 
      • position mismatch 
    • same for end-of-day settlements (alignment with Settlements)
    • impact of variability of latency (if not co-located, but still even with collocation there might be a problem)
    • when all above is working, check Market risk
  • Resilience of trading systems:
    • reliable connectivity
    • presence of counter-balancing mechanisms embedded into the system allowing for compensation of the damage - so-called feedback mechanisms
The validator has to check algorithm performance keeping in mind that the following must be avoided:
  • disorderly market - a voluntary or accidental destabilizing impact is equally damaging to the reputation of the Trader
  • market abuse:
    • a huge inventory or a huge competitive technical advantage may destabilize the market to the profit of the algorithm's owner. In case of such an event it would be difficult to prove innocence.
    • intentional abuse of price formation versus the behaviour of market players
  • business risk arises if the Trader is a member of an exchange with obligations to maintain the market and the system's performance is imperfect:
    • How must such a trader react to a failure of his system? Can he/she fulfill the obligations?
Ensure that the responsible Trader knows and understands the behavior of the system. All the above items require reflection in policies.
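As an illustration of the pre-trade control listed under 'Compensating controls' (profit margin vs fee and risk margin), a minimal sketch; all names and threshold values are hypothetical:

```python
# Minimal sketch of a pre-trade control: an order is allowed only if its
# expected profit margin covers fees and the risk margin charged by the
# exchange. All names and numbers are hypothetical.
from dataclasses import dataclass

@dataclass
class Order:
    quantity: float        # units to trade
    expected_edge: float   # expected profit per unit, in currency
    fee_per_unit: float    # exchange/broker fee per unit
    risk_margin: float     # risk margin charged by the exchange, total

def pre_trade_check(order: Order, min_margin: float = 0.0) -> bool:
    """Return True if the expected net profit exceeds the minimum margin."""
    gross = order.quantity * order.expected_edge
    costs = order.quantity * order.fee_per_unit + order.risk_margin
    return gross - costs > min_margin

order = Order(quantity=100, expected_edge=0.05, fee_per_unit=0.01, risk_margin=2.0)
print(pre_trade_check(order))   # 100*0.05 - (100*0.01 + 2.0) = 2.0 > 0 -> True
```

A production control would of course work with distributions of the profit margin, as the checklist suggests, rather than with point estimates.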

Monday, June 24, 2019

IS LIBRA FOR LIBERTE?

If we ignore #Libra's functionality, it offers the same as central banks: centralisation, surveillance, inflation (here, a proxy of the inflation of the underlying basket of currencies). That's why CBs are so worried. Last year they discussed, postponed and finally avoided issuing their own sovereign digital money, afraid to destroy the banking industry.
Now Libra will damage this strategy, because from the user's perspective it will have the same properties. Yet Libra will be global and will compete even with foreign exchange. For example, a Cuban FB-user in Spain can send money to her mom in Cuba without paying for currency exchange and a high transaction fee. The same can be done with #Bitcoin, but the transaction fee is much higher (then use the #LightningNetwork!).
All in all, this creates a mess and will push CBs to get defensive. It will be very hard, taking into account that #crypto satisfies the average desires of money customers (which we all are):
- to avoid intermediary risk takers
- to avoid inflation of the monetary base
- to be in control of one's own deposit (dug under a "crypto-tree" in the "crypto-garden")
PS. Remember the old motto: "Liberté, Égalité, Fraternité". Still missing: EgalCoin and FrereCoin. 

Wednesday, June 19, 2019

Stress testing

Some thoughts aloud:

Banks employ a rich set of scenarios to test the resilience of a portfolio and to deliver various measures of risk. These models range from:

  • Type-I: scenarios with attached probabilities (weights), as used in Value-at-Risk (VaR) or Monte Carlo (MC) types of models, to
  • Type-II: stress-test-based models, where possible market states are scanned over a wide range, portfolio performance is inter-/extrapolated, and the worst-case scenarios are used as the risk measure.

However, thinking more about stress tests, they should serve as a model-independent addition to the usual probability-measure-based tools (Type-I).

Another method is to reverse-engineer the risk factors of the portfolio in terms of its weakest points (with no likelihood/probability attached), but that serves the purpose only to some extent, because the exercise depends on the specific in-house (pricing) model. Some model independence can be achieved by building a market-wide (AI?) model which can detect pockets of instability (~singularities), whose scenarios then have to be injected into the pricing of the portfolio.

The discrepancy or consistency between the measures used in pricing and in risk modelling is a similar topic, because if a singularity becomes certain (realises), it changes/shifts all pricing (via the price of risk, e.g. optionality or #XVA-type modelling items).
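The Type-I/Type-II distinction above can be sketched with toy numbers; all scenarios and probabilities below are hypothetical:

```python
# Toy contrast of the two scenario types: Type-I attaches probabilities
# (a VaR-style tail quantile), Type-II scans states and takes the worst case.

# Hypothetical portfolio P&L per scenario: (pnl, probability in percent).
scenarios = [(-120.0, 1), (-60.0, 4), (-10.0, 25),
             (5.0, 40), (20.0, 25), (60.0, 5)]

def var_type1(scenarios, tail_percent=5):
    """Smallest loss L such that P(pnl <= -L) reaches the tail probability."""
    tail = 0
    for pnl, p in sorted(scenarios):    # worst pnl first
        tail += p
        if tail >= tail_percent:
            return -pnl
    return 0.0

def stress_type2(pnl_scan):
    """Worst-case loss over a scanned range of states, no probabilities."""
    return -min(pnl_scan)

print(var_type1(scenarios))                         # probability-weighted tail loss
print(stress_type2([pnl for pnl, _ in scenarios]))  # pure worst case
```

The point of the sketch: the Type-II number ignores the probability weights entirely, which is exactly what makes it a model-independent addition to the Type-I tools.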

Friday, June 14, 2019

Libor replacement - 2

Due to directions from the FED, ECB, BoE, etc., there is hype around #LiborReplacement, which is due in 2021. Some argue that this will flow "by itself", similar to the change from national currencies to the EURO in 1999-2002; some think that it will be a "major disaster".

Indeed, many banking systems rely on Libor. It starts with pricing and risk models, which may use Libor as a core rate. Although this approach has changed since ~2007, when the OIS rate was introduced as the central rate for modelling, there might still be some entities that use Libor as the central quote.

Libor is often used as a reference rate, e.g. Libor+X%, to price commercial products for corporate and retail (mortgage) markets. After 2021, all these contracts must be rolled. The change from Libor to the new rate will perhaps be done under a zero-profit condition. That has to be calculated. Those who will do the calculations should remember to account for the "in-arrears" settlement condition during the transition period.
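The zero-profit condition can be sketched as solving for a fallback spread that leaves the present value of the floating leg unchanged. All forwards and discount factors below are hypothetical illustrations, not market data, and the in-arrears subtlety is deliberately left out:

```python
# Solve for the fallback spread s such that the PV of (new rate + s) coupons
# equals the PV of the original Libor coupons (zero-profit switch).
# All inputs are hypothetical illustrations.

libor_fwd = [0.020, 0.022, 0.025, 0.027]  # projected Libor per period
rfr_fwd   = [0.017, 0.019, 0.022, 0.024]  # projected replacement rate
df        = [0.98, 0.95, 0.93, 0.90]      # discount factors per payment date
accrual   = 0.5                           # year fraction per period

def zero_profit_spread(libor_fwd, rfr_fwd, df, accrual):
    """Spread s solving sum df*(rfr+s)*accrual = sum df*libor*accrual."""
    pv_libor = sum(d * r * accrual for d, r in zip(df, libor_fwd))
    pv_rfr   = sum(d * r * accrual for d, r in zip(df, rfr_fwd))
    annuity  = sum(d * accrual for d in df)
    return (pv_libor - pv_rfr) / annuity

s = zero_profit_spread(libor_fwd, rfr_fwd, df, accrual)
print(round(s, 6))
```

With a single basis-point-style spread applied to every coupon, the PV of the new leg matches the old one by construction; handling in-arrears compounding would change how each coupon accrues, not the zero-profit idea itself.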

To add few words about causes of #Libor problem and possible solutions:


  • Libor was quoted by a few closely-related banks. It was tempting for them to manipulate the rate, so they did.
  • One solution to avoid manipulation is to invite a #thirdparty who will honestly and independently monitor the market and quote it. However, there are famous negative examples, when rating agencies were part of the deal too.
  • Governments can take the role into their own hands and declare that publicly traded short rates will be the new Libor. This is regarded as not the most optimal solution and might turn out to be a new handle for corruption or market manipulation, as when they tried to save those who were "too big to fail".
  • Another solution can be in the hands of #CCPs. By definition, many banks today are obliged to process a large portion of vanilla IR contracts through a CCP. Hence, these CCPs are able to calculate the all-balancing rate from the inherent cash flows. For the sake of market stability it would be useful to publish aggregated distributions of cash flows within the financial system. By the way, a CCP is also able to calculate implied contractual rates from these flows, hence it is possible to construct a more reliable "new Libor" rate. That would embed a useful informational feedback. 
  • Yet another possibility is to build a distributed (blockchain) register for quotes, delivered by all participants. An open-source algorithm would calculate and publish the rate based on the information delivered by the participants. A read-only backdoor can be given to regulators for audit purposes. The design of such a system can be elaborated further to ensure stability and avoid manipulation. 


Monday, April 15, 2019

Consolidated Basel-4 framework (part-2)

Also seen in https://www.innovaest.org/blog

Previous post was about:
  • Border between Banking and Trading books, 
  • Alignment (mostly data) problems and solutions of Credit risk measures used in these books
  • Structural changes in Trading book capital measurement
  • Alignment between EL (Banking book) and CVA (Trading book)
  1. Changes in Banking book:
    1. Change in Credit Risk measurement is driven by IFRS9 requirements about classification of portfolio items, measurements (Fair Valuation vs Amortization vs FVOCI (Fair Value Other Comprehensive Income))
    2. Changes from TTC (Through The Cycle) to PIT (Point In Time), which is closer to the risk-neutral measure used in the Trading book. This helps reduce the delay in the recognition of asset impairments
    3. The (Current) Expected (Credit) Loss (CECL) model reacts more smoothly and faster to the state of the counterpart. Compare with IAS39, where loss is expected and accounted for only in the pre-default and default states. 
    4. With regard to what has been discussed in the previous part, the questions are: 
      1. Can we compare CECL and CVA directly? 
      2. Hence, can we compare PD(PIT) and PD(Risk Neutral)?
      3. Also, can we compare LGDs? Trading book credit related items are mostly governed by ISDA, while Banking book is governed by local legislation and loan agreements with lender (be it mortgage or plain customer loan)
    5. Remark: changes in the BigData industry have led to openness of SME financial data, better predictability of economic data, and also increased research on predicting economic activity from satellite surveillance (by the way, this is interesting for a separate discussion)
  2. Balance sheet structural effects, related to ALM:
    1. IRRBB - Net Interest Income (NII) risk
      1. Main driver of confusion and problem here is the existence of different methods of accounting interest:
        1. accrual as in contract method vs 
        2. aligned with IR instruments, like swaps. 
      2. This difference in methods is the main reason for the gap.
      3. Consistent simulation of IR-scenarios across entire book can be a source for more optimal resource allocation.
    2. Liquidity
      1. The HQLA requirement creates demand for government bonds, which are required to be supported with deposits. The Liquidity Coverage Ratio (LCR) implicitly demands equalisation of HQLA with run-off Cash Flows within 1 month, where deposits are the most vulnerable in this regard. The Net Stable Funding Ratio (NSFR) is closely related to this ratio
    3. Structural FX risk 
      1. This one comes from the necessity to protect the capital adequacy ratio at the aggregated level (denominated in the base currency) from changes of capital in branches (denominated in other currencies) due to movements in foreign exchange rates (see "Minimum capital requirements for market risk", January 2016, item 4, page 5). 
  3. In summary, overall balance sheet optimization must be done within the following regulatory constraints:
    1. Capital adequacy ratio
    2. Leverage ratio
    3. LCR and NSFR

Consolidated Basel-4 framework (Part-1)

Also seen in https://www.innovaest.org/blog

BIS has published their Consolidated B4 framework for Banks.

Summary of themes:
  1. Introduce stronger border between Banking and Trading books.
  2. Better alignment between Credit Risk measures within the Trading and Banking books:
    1. Trading Credit Risk was measured over risk neutral measures (i.e. implied from traded instruments, such as CDS, Bonds etc.), while
    2. Banking Credit Risk was simpler, but had more complicated structure. It was a blend of
      1. Ratings coming from major rating agencies
      2. Statistics from the sample of default events
      3. Fundamental information coming from business (market size, accounts etc) of the counterpart
    3. Main difficulty of implementation of such alignment lies in the search of equivalent measure between Risk Neutral and Fundamental/Structured Credit risks present in the Trading and Banking books. 
  3. Trading book (Market and Trading Credit risks):
    1. It fixes the capital arbitrage problem, a main concern of regulators. In the past, it was possible to shuffle instruments between those books in order to optimize (reduce) the capital requirement. FRTB (Market risk) sets two restrictions:
      1. on product definitions, where items are either "held to maturity" (banking book) or "available for trade" (trading book); 
      2. it is not possible to change the capital model for items that changed books
    2. Capital for the Trading book must be calculated with the Standardized Approach. This is done for a better and more homogeneous capital benchmark between banks.
    3. Traded Credit risk formerly accounted in IRC (Incremental Risk Charge) now moves into DRC (Default Risk Charge).
      1. IRC included both, credit spread (tradeable diffusion-type series) and default events (jump)
      2. DRC moves capital from default event into banking book.
    4. Cross-border (banking/trading) hedges are disallowed.
    5. The replacement of VaR with Expected Shortfall seems not to be much of a problem.
    6. NMRF (Non-Modellable Risk Factors) are very close to those mentioned as RNIV (Risk Not In VaR) by PRA (UK).
    7. New regulation requires approval at desk level. 
    8. Overall, the structure has changed:
      1. Basel-2: Regulatory Market risk RWA was a blend of IMA and Standardized approaches. Regulators encouraged banks to develop Economic Capital to allow regulators to benchmark both numbers. 
      2. Basel-3: Market risk RWA is calculated ultimately by SA, while IMA becomes the "new Economic Capital" and will be used for regulatory benchmarking.
  4. Cost of Credit - Expected Loss and CVA (Credit Valuation Adjustment)
    1. Within Basel-2, Expected Loss (EL) was provisioned within the annual budget. Any Unexpected Loss was covered from the Capital buffer. Trading Credit Risk (also Counterparty Credit Risk) accounted for EL similarly.
    2. After and during the crisis of 2007-2009, CVA became important as a measure aligned with other instruments in the Trading Book, together with a resolution of the problems related to hedging rules. 
Continued here