
High-Performance Computing in Finance

Problems, Methods,

and Solutions


Edited by

M. A. H. Dempster, Juho Kanniainen, John Keane, and Erik Vynckier


does not warrant the accuracy of the text or exercises in this book. This book's use or discussion of MATLAB® software or related products does not constitute endorsement or sponsorship by The MathWorks of a particular pedagogical approach or particular use of the MATLAB® software.

CRC Press

Taylor & Francis Group

6000 Broken Sound Parkway NW, Suite 300

Boca Raton, FL 33487-2742

© 2018 by Taylor & Francis Group, LLC

CRC Press is an imprint of Taylor & Francis Group, an Informa business

No claim to original U.S. Government works

Printed on acid-free paper

International Standard Book Number-13: 978-1-4822-9966-3 (Hardback)

This book contains information obtained from authentic and highly regarded sources. Reasonable efforts have been made to publish reliable data and information, but the author and publisher cannot assume responsibility for the validity of all materials or the consequences of their use. The authors and publishers have attempted to trace the copyright holders of all material reproduced in this publication and apologize to copyright holders if permission to publish in this form has not been obtained. If any copyright material has not been acknowledged please write and let us know so we may rectify in any future reprint.

Except as permitted under U.S. Copyright Law, no part of this book may be reprinted, reproduced, transmitted, or utilized in any form by any electronic, mechanical, or other means, now known or hereafter invented, including photocopying, microfilming, and recording, or in any information storage or retrieval system, without written permission from the publishers.

For permission to photocopy or use material electronically from this work, please access www.copyright.com (http://www.copyright.com/) or contact the Copyright Clearance Center, Inc. (CCC), 222 Rosewood Drive, Danvers, MA 01923, 978-750-8400. CCC is a not-for-profit organization that provides licenses and registration for a variety of users. For organizations that have been granted a photocopy license by the CCC, a separate system of payment has been arranged.

Trademark Notice: Product or corporate names may be trademarks or registered trademarks, and are used only for identification and explanation without intent to infringe.

Library of Congress Cataloging-in-Publication Data

Names: Dempster, M. A. H. (Michael Alan Howarth), 1938- editor. | Kanniainen, Juho, editor. | Keane, John, editor. | Vynckier, Erik, editor.
Title: High-performance computing in finance : problems, methods, and solutions / [edited by] M.A.H. Dempster [and three others].
Description: Boca Raton, FL : CRC Press, 2018.
Identifiers: LCCN 2017052035 | ISBN 9781482299663 (hardback) | ISBN 9781315372006 (ebook)
Subjects: LCSH: Finance--Mathematical models. | Finance--Data processing.
Classification: LCC HG106 .H544 2018 | DDC 332.01/5118--dc23
LC record available at https://lccn.loc.gov/2017052035

Visit the Taylor & Francis Web site at

http://www.taylorandfrancis.com

and the CRC Press Web site at

http://www.crcpress.com


I Computationally Expensive Problems

1 Computationally Expensive Problems in Investment Banking 3
Jonathan Rosen, Christian Kahl, Russell Goyder, and Mark Gibbs

2 Using Market Sentiment to Enhance Second-Order Stochastic Dominance Trading Models 25
Gautam Mitra, Christina Erlwein-Sayer, Cristiano Arbex Valle, and Xiang Yu

3 The Alpha Engine: Designing an Automated Trading Algorithm
Anton Golub, James B. Glattfelder, and Richard B. Olsen

4 Portfolio Liquidation and Ambiguity Aversion 77
Álvaro Cartea, Ryan Donnelly, and Sebastian Jaimungal

5 Challenges in Scenario Generation: Modeling Market and Non-Market Risks in Insurance 115
Douglas McLean

II Numerical Methods in Financial HPC

6 Finite Difference Methods for Medium- and High-Dimensional Derivative Pricing PDEs 175
Christoph Reisinger and Rasmus Wissmann

7 Multilevel Monte Carlo Methods for Applications in Finance
Michael B. Giles and Lukasz Szpruch

8 Fourier and Wavelet Option Pricing Methods 249
Stefanus C. Maree, Luis Ortiz-Gracia, and Cornelis W. Oosterlee

9 A Practical Robust Long-Term Yield Curve Model 273
M. A. H. Dempster, Elena A. Medova, Igor Osmolovskiy, and Philipp Ustinov

10
Uwe Naumann, Jonathan Hüser, Jens Deussen, and Jacques du Toit

11 Case Studies of Real-Time Risk Management via Adjoint Algorithmic Differentiation (AAD) 339
Luca Capriotti and Jacky Lee

12 Tackling Reinsurance Contract Optimization by Means of Evolutionary Algorithms and HPC 371
Omar Andres Carmona Cortes and Andrew Rau-Chaplin

13 Evaluating Blockchain Implementation of Clearing and Settlement at the IATA Clearing House 391
Sergey Ivliev, Yulia Mizgireva, and Juan Ivan Martin

III HPC Systems: Hardware, Software, and Data

14
Peter Schober

15 Multiscale Dataflow Computing in Finance 441
Oskar Mencer, Brian Boucher, Gary Robinson, Jon Gregory, and Georgi Gaydadjiev

16 Manycore Parallel Computation 471
John Ashley and Mark Joshi

17 Practitioner's Guide on the Use of Cloud Computing in Finance
Binghuan Lin, Rainer Wehkamp, and Juho Kanniainen

18 Blockchains and Distributed Ledgers in Retrospective and Perspective
Alexander Lipton

19 Optimal Feature Selection Using a Quantum Annealer 561
Andrew Milne, Mark Rounds, and Peter Goddard


Michael Dempster is Professor Emeritus, Centre for Financial Research, University of Cambridge. He has held research and teaching appointments at leading universities globally and is founding editor-in-chief of Quantitative Finance. His numerous papers and books have won several awards, and he is Honorary Fellow of the IFoA, Member of the Accademia dei Lincei, and managing director of Cambridge Systems Associates.

Juho Kanniainen is Professor of Financial Engineering at Tampere University of Technology, Finland. He has served as coordinator of two international EU programs: HPC in Finance (www.hpcfinance.eu) and Big Data in Finance (www.bigdatafinance.eu). His research is broadly in quantitative finance, focusing on computationally expensive problems and data-driven approaches.

John Keane is Professor of Data Engineering in the School of Computer Science at the University of Manchester, UK. As part of the UK government's Foresight Project, The Future of Computer Trading in Financial Markets, he co-authored a commissioned economic impact assessment review. He has been involved in both the EU HPC in Finance and Big Data in Finance programs. His wider research interests are data and decision analytics and related performance aspects.

Erik Vynckier is board member of Foresters Friendly Society, partner of InsurTech Venture Partners, and chief investment officer of Eli Global, following a career in banking, insurance, asset management, and the petrochemical industry. He co-founded EU initiatives on high-performance computing and big data in finance. Erik graduated as MBA at London Business School and as chemical engineer at Universiteit Gent.


University College London
London, United Kingdom

Oxford, United Kingdom

Omar Andres Carmona Cortes

Cambridge Systems Associates
Cambridge, United Kingdom

Michael B. Giles
Mathematical Institute
University of Oxford
Oxford, United Kingdom

MIT Connection Science and Engineering
Cambridge, Massachusetts

Stefanus C. Maree
Centrum Wiskunde & Informatica
Amsterdam, The Netherlands

Juan Ivan Martin
International Air Transport Association

Douglas McLean
Moody's Analytics
Edinburgh, Scotland, United Kingdom

Elena A. Medova
Centre for Financial Research
University of Cambridge
and
Cambridge Systems Associates
Cambridge, United Kingdom

Oskar Mencer

Andrew Milne
1QBit
Vancouver, Canada

Gautam Mitra
OptiRisk Systems and Department of Computer Science, UCL
London, United Kingdom

Delft University of Technology
Delft, The Netherlands

Cambridge Systems Associates
Cambridge, United Kingdom

Oxford-Man Institute of Quantitative Finance
University of Oxford
United Kingdom

Gary Robinson

Jonathan Rosen
Quantitative Research
FINCAD

Mark Rounds
1QBit
Vancouver, Canada

Xiang Yu
OptiRisk Systems
United Kingdom


As lessons are being learned from the recent financial crisis and unsuccessful stress tests, demand for superior computing power has been manifest in the financial and insurance industries for reliability of quantitative models and methods and for successful risk management and pricing. From a practitioner's viewpoint, the availability of high-performance computing (HPC) resources allows the implementation of computationally challenging advanced financial and insurance models for trading and risk management. Researchers, on the other hand, can develop new models and methods to relax unrealistic assumptions without being limited to achieving analytical tractability to reduce computational burden. Although several topics treated in these pages have been recently covered in specialist monographs (see, e.g., the references), we believe this volume to be the first to provide a comprehensive up-to-date account of the current and near-future state of HPC in finance.

The chapters of this book cover three interrelated parts: (i) Computationally expensive financial problems, (ii) Numerical methods in financial HPC, and (iii) HPC systems, software, and data with financial applications. They consider applications which can be more efficiently solved with HPC, together with topic reviews introducing approaches to reducing computational costs and elaborating how different HPC platforms can be used for different financial problems.

Part I offers perspectives on computationally expensive problems in the financial industry.

In Chapter 1, Jonathan Rosen, Christian Kahl, Russell Goyder, and Mark Gibbs provide a concise overview of computational challenges in derivative pricing, paying special attention to counterparty credit risk management. The incorporation of counterparty risk in pricing generates a huge demand for computing resources, even with vanilla derivative portfolios. They elaborate possibilities with different computing hardware platforms, including graphic processing units (GPU) and field-programmable gate arrays (FPGA). To reduce hardware requirements, they also discuss an algorithmic approach, called algorithmic differentiation (AD), for calculating sensitivities.

algo-In Chapter 2, Gautam Mitra, Christina Erlwein-Sayer, Cristiano ArbexValle, and Xiang Yu describe a method for generating daily trading signals

to construct second-order stochastic dominance (SSD) portfolios of traded securities They provide a solution for a computationally (NP) hardoptimization problem and illustrate it with real-world historical data for theFTSE100 index over a 7-year back-testing period

exchange-xvii


In Chapter 3, Anton Golub, James B. Glattfelder, and Richard B. Olsen introduce an event-based approach for automated trading strategies. In their methodology, in contrast to the usual continuity of physical time, only events (interactions) make the system's clock tick. This approach to designing automated trading models yields an algorithm that possesses many desired features and can be realized with reasonable computational resources.

In Chapter 4, Álvaro Cartea, Ryan Donnelly, and Sebastian Jaimungal consider the optimal liquidation of a position using limit orders. They focus on the question of how the misspecification of a model affects the trading strategy. In some cases, a closed-form expression is available, but additional relevant features in the framework make the model more realistic at the cost of not having closed-form solutions.

In Chapter 5, Douglas McLean discusses challenges associated with economic scenario generation (ESG) within an insurance context. Under Pillar 1 of the Solvency II directive, insurers who use their own models need to produce multi-year scenario sets in their asset and liability modeling systems, which is a computationally hard problem. McLean provides illustrative examples and discusses the aspects of high-performance computing in ESG as well.

Part II focuses on numerical methods in financial high-performance computing (HPC).

First, in Chapter 6, Christoph Reisinger and Rasmus Wissmann consider finite difference methods in derivative pricing. Numerical methods with partial differential equations (PDE) perform well in special cases, but computational problems arise with high-dimensional problems. Reisinger and Wissmann consider different decomposition methods, review error analysis, and provide numerical examples.

In Chapter 7, Mike Giles and Lukasz Szpruch provide a survey of the progress of the multilevel Monte Carlo method, introduced to finance by the first author. Multilevel Monte Carlo has now become a widely applied variance reduction method. Giles and Szpruch introduce the idea of multilevel Monte Carlo simulation and discuss the numerical methods that can be used to reduce computational costs. They consider several financial applications, including Monte Carlo Greeks, jump-diffusion processes, and the multi-dimensional Milstein scheme.

In Chapter 8, Stef Maree, Luis Ortiz-Gracia, and Cornelis Oosterlee discuss Fourier and wavelet methods in option pricing. First, they review different methods and then numerically show that the COS and SWIFT (Shannon wavelet inverse Fourier technique) methods exhibit exponential convergence. HPC can be used in the calibration procedure by parallelizing option pricing with different strikes.

In Chapter 9, Michael Dempster, Elena Medova, Igor Osmolovskiy, and Philipp Ustinov consider a three-factor Gaussian yield curve model that is used for scenario simulation in derivative valuation, investment modeling, and asset-liability management. The authors propose a new approximation of the Black (1995) correction to the model to accommodate nonnegative (or negative lower bounded) interest rates and illustrate it by calibrating yield curves for the four major currencies, EUR, GBP, USD, and JPY, using the unscented Kalman filter and estimating 10-year bond prices both in and out of sample. Calibration times are comparable to those for the original model, and with cloud computing they can be reduced from a few hours to a few minutes.

In Chapter 10, Uwe Naumann, Jonathan Hüser, Jens Deussen, and Jacques du Toit review the concept of algorithmic differentiation (AD) and adjoint algorithmic differentiation (AAD), which has been gaining popularity in computational finance over recent years. With AD, one can efficiently compute the derivatives of the primal function with respect to a specified set of parameters, and it is therefore highly relevant for the calculation of option sensitivities with Monte Carlo. The authors discuss aspects of implementation and provide three case studies.

In Chapter 11, Luca Capriotti and Jacky Lee consider adjoint algorithmic differentiation (AAD) by providing three case studies for real-time risk management: interest rate products, counterparty credit risk management, and volume credit products. AAD is found to be extremely beneficial for the case applications, as it speeds up the computation of price sensitivities by several orders of magnitude, both in the context of Monte Carlo applications and for applications involving faster numerical methods.

In Chapter 12, Omar Andres Carmona Cortes and Andrew Rau-Chaplin use evolutionary algorithms for the optimization of reinsurance contracts. They use population-based incremental learning and differential evolution algorithms for the optimization. With a case study, they demonstrate the parallel computation of an actual contract problem.

In Chapter 13, which ends Part II, Sergey Ivliev, Yulia Mizgireva, and Juan Ivan Martin consider implementation of blockchain technologies for the clearing and settlement procedure of the IATA Clearing House. They develop a simulation model to evaluate the industry-level benefits of the adoption of blockchain-based industry money for clearing and settlement.

Part III considers different computational platforms, software, and data with financial applications.

In Chapter 14, Peter Schober provides a summary of supercomputing, including aspects of hardware platforms, programming languages, and parallelization interfaces. He discusses supercomputers for financial applications and provides case studies on the pricing of basket options and optimizing life cycle investment decisions.

In Chapter 15, Oskar Mencer, Brian Boucher, Gary Robinson, Jon Gregory, and Georgi Gaydadjiev describe the concept of multiscale data-flow computing, which can be used for special-purpose computing on a customized architecture, leading to increased performance. The authors review the data-flow paradigm, describe Maxeler data-flow systems, outline the data-flow-oriented programming model that is used in Maxeler systems, and discuss how to develop data-flow applications in practice and how to improve their performance.


Additionally, they provide a case study estimating correlations between a large number of security returns. Other financial applications, including interest rate swaps, value-at-risk, option pricing, and credit value adjustment capital, are also considered.

In Chapter 16, John Ashley and Mark Joshi provide grounding in the underlying computer science and system architecture considerations needed to take advantage of future computing hardware in computational finance. They argue for a parallelism imperative, driven by the state of computer hardware and current manufacturing trends. The authors provide a concise summary of system architecture, consider parallel computing design, and then provide case studies on the LIBOR market model and the Monte Carlo pricing of early exercisable Bermudan derivatives.

In Chapter 17, Binghuan Lin, Rainer Wehkamp, and Juho Kanniainen review cloud computing for financial applications. The chapter is written for practitioners and researchers who are interested in using cloud computing for various financial applications. The authors elaborate the concept of cloud computing, discuss suitable applications, and consider possibilities and challenges of cloud computing. Special attention is given to an implementation example with Techila middleware and to case studies on portfolio optimization.

In Chapter 18, Alexander Lipton introduces blockchains (BC) and distributed ledgers (DL) and describes their potential applications to money and banking. He presents historical instances of BC and DL. The chapter introduces the modern version of the monetary circuit and shows how it can benefit from the BC and DL framework. Lipton shows how central bank-issued digital currencies can be used to move away from the current fractional reserve banking toward narrow banking.

cur-In Chapter 19, the last chapter in the book which points to the future,Andrew Milne, Mark Rounds and Peter Goddard consider feature selection for

credit scoring and classification using a quantum annealer Quantum

comput-ing has received much attention, but is still in its infancy In this chapter, withthe aid of the 1QBit Quantum-Ready Software Development Kit, the authors

apply quantum computing using the DWave quantum simulated annealing

machine to a well-known financial problem involving massive amounts of data

in practice, credit scoring and classification They report experimental results

with German credit data

For further in-depth background the reader should consult the bibliography.

Acknowledgments

This book arose in part from the four-year EU Marie Curie project High-Performance Computing in Finance (Grant Agreement Number 289032, www.hpcfinance.eu), which was recently completed. We would like to thank all the partners and participants in the project and its several public events, in particular its supported researchers, many of whom are represented in these pages. We owe all our authors a debt of gratitude for their fine contributions and for enduring a more drawn-out path to publication than we had originally envisioned. We would also like to express our gratitude to the referees and to World Scientific and Emerald for permission to reprint Chapters 7 and 18, respectively. Finally, without the expertise and support of the editors and staff at Chapman & Hall/CRC and Taylor & Francis this volume would have been impossible. We extend to them our warmest thanks.

Michael Dempster, Juho Kanniainen, John Keane, and Erik Vynckier

Cambridge, Tampere, Manchester

August 2017

Bibliography

1. Foresight: The Future of Computer Trading in Financial Markets. 2012. www.gov.uk/government/publications/future-of-computer-trading-in-financial-markets-an-international-perspective

2. De Schryver, C., ed. 2015. FPGA Based Accelerators for Financial Applications. Springer.

3. Delong, L. 2013. Backward Stochastic Differential Equations with Jumps and Their Actuarial and Financial Applications. London: Springer.

4. Zopounidis, C. and Galariotis, E. 2015. Quantitative Financial Risk Management: Theory and Practice. New York, NY: John Wiley & Sons.

5. Uryasev, S. and Pardalos, P. M., eds. 2013. Stochastic Optimization: Algorithms and Applications. Vol. 54. Berlin: Springer Science & Business Media.

6. John, K. 2016. WHPCF'14: Proceedings of the 7th Workshop on High Performance Computational Finance. Special issue, Concurrency and Computation: Practice and Experience 28(3).

7. John, K. 2015. WHPCF'15: Proceedings of the 8th Workshop on High Performance Computational Finance, SC'15: The International Conference for High Performance Computing, Networking, Storage and Analysis. New York: ACM.


Chapter 1

Computationally Expensive

Problems in Investment Banking

Jonathan Rosen, Christian Kahl, Russell Goyder, and Mark Gibbs

CONTENTS

1.1 Background 3
1.1.1 Valuation requirements 5
1.1.1.1 Derivatives pricing and risk 5
1.1.1.2 Credit value adjustment/debit value adjustment 7
1.1.1.3 Funding value adjustment 10
1.1.2 Regulatory capital requirements 11
1.1.2.1 Calculation of market risk capital 12
1.1.2.2 Credit risk capital 12
1.1.2.3 Capital value adjustment 13
1.2 Trading and Hedging 14
1.3 Technology 15
1.3.1 Hardware 15
1.3.1.1 Central processing unit/floating point unit 15
1.3.1.2 Graphic processing unit 16
1.3.1.3 Field programmable gate array 16
1.3.1.4 In-memory data aggregation 17
1.3.2 Algorithmic differentiation 17
1.3.2.1 Implementation approaches 19
1.3.2.2 Performance 20
1.3.2.3 Coverage 21
1.4 Conclusion 22

References 23

Financial instruments traded on markets are essentially contractual agreements between two parties that involve the calculation and delivery of quantities of monetary currency or its economic equivalent. This wider definition of financial investments is commonly known as financial derivatives or options, and effectively includes everything from the familiar stocks and bonds to the most complex payment agreements, which also include complicated mathematical logic for determining payment amounts, the so-called payoff of the derivative.

In the early days of derivatives, they were thought of more like traditional investments and treated as such on the balance sheet of a business. Complexity mainly arose in the definition and calculation of the option payoff, and in applying theoretical considerations to price them. It was quickly discovered that probabilistic models, for economic factors on which option payoffs were calculated, had to be quite restrictive in order to produce computationally straightforward problems in pricing the balance sheet fair mark-to-market value. The development of log-normal models from Black and Scholes was quite successful in demonstrating not only a prescient framework for derivative pricing, but also the importance of tractable models in the practical application of risk-neutral pricing theory, at a time when computational facilities were primitive by modern standards. Developments since then in quantitative finance have been accompanied by simultaneous advancement in computing power, and this has opened the door to alternative computational methods such as Monte Carlo, PDE discretization, and Fourier methods, which have greatly increased the ability to price derivatives with complex payoffs and optionality.

Nevertheless, the recent crisis of 2008 revealed that the above complexities are only part of the problem. The state of creditworthiness was eventually revealed as a major influence on the business balance sheet in the event that a contractual counterparty in a derivatives contract fails to meet the terms for payment. It was recognized that market events could create such a scenario due to clustering and tail events. In response to the explosion of credit derivatives and the subsequent global financial crisis, bilateral credit value adjustments (CVAs) and funding cost adjustments were used to represent the impact of credit events according to their likelihood in the accounting balance sheet. Credit value adjustments and funding adjustments introduced additional complexity into the business accounting for market participants. While previously simple trades only required simple models and textbook formulas to value, the CVA is a portfolio derivative problem requiring joint modeling of many state variables and is often beyond the realm of simple closed-form computation.

Meanwhile, the controversial decision to use taxpayer money to bail out the financial institutions in the 2008 crisis ignited a strong political interest in introducing regulation that requires the largest investors to maintain capital holdings that meet appropriate thresholds commensurate with the financial risk present in their balance sheet. In recent years, capital requirements have been adopted globally, with regional laws determining methods and criteria for the calculation of regulatory capital holdings. The demand placed on large market participants to apply additional value adjustments for tax and capital funding costs requires modeling these effects over the lifetime of the investment, which has introduced a large systemic level of complexity in accounting for the financial investment portfolio, even when it is composed of very simple investments.

1.1.1.1 Derivatives pricing and risk

In the early days of option trading, accounting principles were completely decoupled from financial risk measures commonly known today as market risk. Investors holding derivatives on their balance sheets were mainly concerned with the ability to determine the fair market value. In the early derivative markets, the role of central exchanges was quite limited, and most derivative contracts directly involved two counterparties; these trades were known as over-the-counter (OTC). As derivative payoffs became more closely tailored to individual investors, the theoretical pricing of exotic trades was increasingly complex, and there was no possibility of liquid quotes for investor holdings, meaning theoretical risk-neutral considerations became paramount to the accounting problem of mark-to-market balance sheets.

The innovation in this area began with the seminal work of Black and Scholes, leading to tractable and computationally efficient theoretical derivative pricing formulas, which were widely adopted and incorporated into trading and accounting technology for derivative investors. An example is the formula for the (at expiry) value V_t of a call option struck at K on a stock whose expected value at expiry t is F_t:

\[
V_t = F_t\,\Phi(d_+) - K\,\Phi(d_-), \qquad
d_\pm = \frac{\ln(F_t/K) \pm \frac{\sigma^2}{2}\,t}{\sigma\sqrt{t}},
\]

where σ is the volatility parameter in the log-normal distribution of F_t assumed by the model and Φ is the standard normal cumulative distribution function. The number of analytic formulas that could be deduced was restricted by the complexity of the payoff and the sophistication of the dynamic model being used. Payoffs that require many looks at underlying factors and allow periodic or continuous exercise can greatly complicate or even prohibit the ability to derive a useful analytic formula or approximation for pricing and risk.
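As a concrete illustration, here is a minimal C++ sketch of this formula; the function and parameter names are ours, not those of any particular pricing library.

```cpp
// Minimal sketch (our own naming, not a library API) of the call value
// V_t = F_t * Phi(d+) - K * Phi(d-) described above.
#include <cmath>
#include <iostream>

// Standard normal cumulative distribution function.
double norm_cdf(double x) { return 0.5 * std::erfc(-x / std::sqrt(2.0)); }

double black_call(double F, double K, double sigma, double t) {
    const double sd = sigma * std::sqrt(t);
    const double d_plus = (std::log(F / K) + 0.5 * sigma * sigma * t) / sd;
    const double d_minus = d_plus - sd;
    return F * norm_cdf(d_plus) - K * norm_cdf(d_minus);
}

int main() {
    // Example: expected value F = 100, strike K = 95, 20% vol, 1 year.
    std::cout << black_call(100.0, 95.0, 0.2, 1.0) << "\n";
}
```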

Meanwhile, the state of the art in computation was becoming more advanced, leading to modern numerical techniques being applied to derivative pricing problems. Considering the fundamental theorem of asset pricing in conjunction with the theorem of Feynman-Kac, one arrives at an equivalence of the expectation of a functional of a stochastic process to a partial integro-differential equation, which can be further transformed analytically using Fourier method-based approaches, leading to a system of ordinary differential equations. All three might allow for further simplifications leading to closed-form solutions or need to be solved numerically.

Partial (integro-) differential equations: Partial differential equations form the backbone of continuous-time pricing theory, and a numerical approximation to the solution can be achieved with finite difference methods as well as many other techniques. However, this approach suffers from the so-called curse of dimensionality, very similar to the case of multivariate quadrature, in that the stability of this approach breaks down for models with many underlying factors.

Fourier methods: The mathematical formulation of a conjugate variable to time can be used to greatly simplify convolution integrals that appear in derivative pricing problems by virtue of the Plancherel equality. The use of a known characteristic function of a dynamic process allows fast numerical derivative pricing for options with many looks at the underlying factor.

Monte Carlo simulation: Monte Carlo methods offer a very generic tool to approximate the functional of a stochastic process, also allowing one to deal effectively with path-dependent payoff structures. The most general approach to derivative pricing is based on pathwise simulation of time-discretized stochastic dynamical equations for each underlying factor. The advantage is in being able to handle any option payoff and exercise style, as well as enabling models with many correlated factors. The disadvantage is the overall time-consuming nature and very high level of complexity in performing a full simulation (see the sketch after this list).
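For orientation only, the following minimal Monte Carlo sketch assumes a single log-normal factor and the same illustrative parameters as before; it is not the chapter's implementation.

```cpp
// Minimal pathwise Monte Carlo sketch (illustrative assumptions):
// one log-normal factor, terminal-value payoff max(F_T - K, 0).
#include <algorithm>
#include <cmath>
#include <iostream>
#include <random>

int main() {
    const double F = 100.0, K = 95.0, sigma = 0.2, t = 1.0;
    const std::size_t n_paths = 1000000;

    std::mt19937_64 rng(42);
    std::normal_distribution<double> z(0.0, 1.0);

    double sum = 0.0;
    for (std::size_t i = 0; i < n_paths; ++i) {
        // Exact one-step draw of the terminal value; a path-dependent
        // payoff would instead step through a time grid here.
        const double Ft = F * std::exp(-0.5 * sigma * sigma * t
                                       + sigma * std::sqrt(t) * z(rng));
        sum += std::max(Ft - K, 0.0);
    }
    std::cout << "Monte Carlo estimate: " << sum / n_paths << "\n";
}
```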

Besides dealing with the complexity of the option payoff, the Black-Scholes formula made use of a single risk-free interest rate and demonstrated that in the theoretical economy, this rate had central importance for the time value of money and the expected future growth of risk-neutral investment strategies analogous to single currency derivatives. This means a single discount curve per currency was all that was needed, which by modern standards led to a fairly simple approach to the pricing of derivatives. For example, a spot-starting interest rate swap, which has future cash flows that are calculated using a floating rate that must be discounted to present value, would use a single curve for the term structure of interest rates to both calculate the risk-neutral implied floating rates and the discount rates for future cash flows. However, the turmoil of 2008 revealed that collateral agreements were of central importance in determining the relevant time value of money to use for discounting future cash flows, and it quickly became important to separate discounting from forward rate projection for collateralized derivatives. The subsequent computational landscape required building multiple curves in a single currency to account for institutional credit risk in lending at different tenors. A small sketch of this separation of projection and discounting follows.
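The sketch below illustrates the multi-curve idea under simplifying assumptions of our own (flat, continuously compounded curves); real curve construction is considerably more involved.

```cpp
// Sketch of multi-curve swap valuation (illustrative, not a library API):
// forward rates are projected off one curve and cash flows discounted off
// another, as became standard for collateralized trades after 2008.
#include <cmath>
#include <iostream>

// Hypothetical flat-curve discount factor: exp(-r * t).
double df(double rate, double t) { return std::exp(-rate * t); }

int main() {
    const double proj_rate = 0.03;   // projection (floating rate) curve level
    const double disc_rate = 0.025;  // discount (collateral) curve level
    const double tau = 0.5;          // semiannual accrual fraction
    double pv_float = 0.0;

    // Floating leg over 5 years: forward rate implied by the projection
    // curve on each period, discounted on the collateral curve.
    for (double t = tau; t <= 5.0 + 1e-9; t += tau) {
        const double fwd =
            (df(proj_rate, t - tau) / df(proj_rate, t) - 1.0) / tau;
        pv_float += fwd * tau * df(disc_rate, t);
    }
    std::cout << "Floating leg PV per unit notional: " << pv_float << "\n";
}
```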


On top of this problem is the risk calculation, which requires the sensitivity of the derivative price to various inputs and calculated quantities inside the numerical calculation. A universal approach to this is to add a small amount to each quantity of interest and approximate the risk with a finite difference calculation, known as bumping. While this can be applied for all numerical and closed-form techniques, it does require additional calculations which themselves can be time-consuming and computationally expensive. The modern view is that by incorporating analytic risk at the software library level, known as algorithmic differentiation (AD), some libraries such as our own produce analytic risk for all numerical pricing calculations, to be described in Section 1.3.2. A bump-and-revalue sketch follows.
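To make the cost of bumping concrete, here is a sketch of a central-difference delta, reusing the black_call function from the earlier sketch; with N risk factors this approach needs on the order of 2N extra valuations, which is exactly the expense AD avoids.

```cpp
// Bump-and-revalue sketch: central finite difference for delta = dV/dF.
// Assumes the black_call function defined in the earlier sketch.
#include <iostream>

double black_call(double F, double K, double sigma, double t);  // see above

int main() {
    const double F = 100.0, K = 95.0, sigma = 0.2, t = 1.0;
    const double h = 1e-4 * F;  // bump size
    const double delta =
        (black_call(F + h, K, sigma, t) - black_call(F - h, K, sigma, t))
        / (2.0 * h);
    std::cout << "Bumped delta: " << delta << "\n";
}
```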

1.1.1.2 Credit value adjustment/debit value adjustment

Since the credit crisis in 2008, counterparty credit risk of derivative positions has become an increasingly important subject. The IFRS 9 standard required a fair value option for derivative investment accounting (Ramirez, 2015) and originally proposed to include CVA for application to hedge accounting, to represent the effect of an entity's risk management activities that use financial instruments to manage market risk exposures that could affect overall profit or loss. IFRS 13 set out requirements intended to account for the risk that the counterparty of the financial derivative or the entity will default before the maturity/expiration of the transaction and will be unable to meet all contractual payments, thereby resulting in a loss for the entity or the counterparty, which required accounting for CVA and a debit value adjustment (DVA) in the balance sheet as nonperformance risk. However, IFRS 9 does not provide direct guidance on how CVA or DVA is to be calculated, beyond requiring that the resulting fair value must reflect the credit quality of the instrument.

There are a lot of choices to be made when computing CVA, including the choice of model used and the type of CVA to be computed. In addition, there are further decisions on whether it is unilateral or bilateral (Gregory, 2009), what type of closeout assumptions to make, how to account for collateral, and considerations for including first-to-default, which will be discussed later. The variety of possible definitions and modeling approaches used in CVA calculation has led to discrepancies in valuation across different financial institutions (Watt, 2011). There are a variety of ways to determine CVA; however, it is often a computationally expensive exercise due to the substantial number of modeling factors, assumptions involved, and the interaction among these assumptions. More specifically, CVA is a complex derivative pricing problem on the entire portfolio, which has led to substantially increasing the complexity of derivative pricing.

CVA is a measure of the exposure of a portfolio of products to counterparty default: if such an event occurs within some time interval, the expected loss is the positive part of the value of the remainder of the portfolio after the event. This is still in the realm of the familiar risk-neutral assumption of derivative pricing in the form of an adjustment to the fair value, whereby conditioning on the credit event explicitly and accumulating the contribution from each time interval over the life of the portfolio, an amount is obtained by which the value of the portfolio can be modified in order to account for the exposure to counterparty default. This can be expressed formally as

\[
\mathrm{CVA} = \mathbb{E}\left[ \int_0^T \big(1 - R(t)\big)\,\big(\hat{V}(t)\big)^{+}\, D(t)\, I_{\tau \in (t, t+dt)} \right], \tag{1.3}
\]

where R(t) is the recovery rate at time t, V̂(t) the value at time t of the remainder of the underlying portfolio whose value is V(t), D(t) the discount factor at time t, τ the time of default, and I_{τ∈(t,t+dt)} is a default indicator, evaluating to 1 if τ lies between time t and t + dt and 0 otherwise. Note that Equation 1.3 does not account for default of the issuer, an important distinction which is particularly relevant for regulatory capital calculation purposes (Albanese and Andersen, 2014). Including issuer default is commonly referred to as first-to-default CVA (FTDCVA).

The first step in proceeding with computation of CVA is to define a time grid over periods in which the portfolio constituents expose either party to credit risk. Next, the present value of the exposure upon counterparty default is calculated at each point in time. Often this involves simplifying assumptions, such that the risk-neutral drift and volatility are sufficient to model the evolution of the underlying market factors of the portfolio, and similarly that counterparty credit, for example, is modeled as a jump-to-default process calibrated to observable market CDS quotes to obtain corresponding survival probabilities. A discretized sketch of this calculation follows.
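A minimal discretization of Equation 1.3 might look as follows; the exposure profile, hazard rate, and discount rate are hypothetical placeholders for quantities a real system would simulate and calibrate.

```cpp
// Sketch of Equation 1.3 on a discrete yearly grid (illustrative only):
// CVA ~= sum_i (1 - R) * EPE(t_i) * D(t_i) * [S(t_i) - S(t_{i+1})],
// where EPE is an expected positive exposure profile and S(t) = exp(-lambda*t)
// is a survival curve from a flat CDS-implied hazard rate lambda.
#include <cmath>
#include <iostream>
#include <vector>

int main() {
    const double R = 0.4;        // recovery rate
    const double lambda = 0.02;  // hazard rate calibrated to CDS quotes
    const double r = 0.02;       // flat discount rate
    // Hypothetical expected positive exposure profile on a yearly grid.
    const std::vector<double> epe = {1.2, 1.5, 1.4, 1.1, 0.6};

    double cva = 0.0;
    for (std::size_t i = 0; i < epe.size(); ++i) {
        const double t0 = static_cast<double>(i), t1 = t0 + 1.0;
        const double p_default =
            std::exp(-lambda * t0) - std::exp(-lambda * t1);
        cva += (1.0 - R) * epe[i] * std::exp(-r * t1) * p_default;
    }
    std::cout << "CVA estimate: " << cva << "\n";
}
```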

However, in practically all realistic situations, the investment portfolio of OTC derivatives will contain multiple trades with any given counterparty. In these situations, the entities typically execute netting agreements, such as the ISDA master agreement, which aims to consider the overall collection of trades as a whole, such that gains and losses on individual positions are offset against one another. In the case that either party defaults, the settlement agreement considers the single net amount rather than the potentially many individual losses and gains. One major consequence of this is the need to consider CVA on a large portfolio composed of arbitrary groups of trades entered into with a given counterparty. As Equation 1.3 suggests, the CVA and DVA on such a portfolio cannot be simply decomposed into CVA on individual investments; thus it is a highly nonlinear problem which requires significant computational facilities in order to proceed.

Quantifying the extent of the risk mitigation benefit of collateral is important, and this usually requires modeling assumptions to be made. Collateral modeling often incorporates several mechanisms which in reality do not lead to perfect removal of credit default risk, and which can be summarized as follows (Gregory, 2015):


Granularity of collateral posting: The existence of thresholds and minimum transfer amounts, for example, can lead to over- and under-collateralization, which can provide deficits and surpluses for funding under certain collateral agreements or Credit Support Annexes (CSAs).

Delay in collateral posting: The operational complexities of modern CSA management involve some delay between requesting and receiving collateral, including the possibility of collateral dispute resolution.

Cash equivalence of collateral holdings: In situations where collateral includes assets or securities that are not simply cash in the currency in which exposure is assessed, the potential variation of the value of the collateral holdings must be modeled over the life of the investment.

Given the arbitrary nature of the investment portfolio's composition and the nonlinear nature of CVA, the computation of CVA typically requires numerical methods beyond the scope of closed-form computation, and Monte Carlo simulation approaches are common. There is a framework for Monte Carlo copula simulation described by Gibbs and Goyder (2013a). While simulation approaches are highly general and well suited to the CVA problem, they are known to be significantly more complex to define than closed-form methods, often increasing calculation times by orders of magnitude, not to mention the difficulty in setting up simulation technology in the most efficient and robust ways possible for business accounting operations.

DVA, as a balance sheet value adjustment required under IFRS 9, is defined analogously to CVA except that it represents the complementary adjustment for risk exposure in the financial portfolio to the credit of the entity. DVA reduces the value of a liability, recognizing reduced losses when an entity's own creditworthiness deteriorates. Some criticism of this balance sheet adjustment points to the difficulty in realizing profits resulting from an entity's default when closing out financial positions, and the possibility of significant volatility in profit or loss in periods of credit market turmoil.

CVA and DVA introduce significant model dependence and risk into the derivative pricing problem, as they are impacted heavily by dynamic features such as volatilities and correlations of the underlying portfolio variables and counterparty credit spreads. Some open-ended problems suggest that the modeling complexity can increase significantly in times of market stress. Clustering and tail risk suggest non-Gaussian simulation processes and copulae of many state variables are necessary to accurately reflect credit risk in times of market turmoil. Wrong-way risk can also be an important effect, which is additional risk introduced by correlation of the investment portfolio and default of the counterparty. Additionally, the incorporation of first-to-default considerations can have significant effects (Brigo et al., 2012). The sum of these complexities introduced by IFRS 9 led to a continuous need for computationally expensive credit risk modeling in derivative pricing.


1.1.1.3 Funding value adjustment

A more recently debated addition to the family of value adjustments is the funding value adjustment (FVA). Until recently, the conceptual foundations of FVA were less well understood relative to counterparty risk (Burgard and Kjaer, 2011), though despite the initial debate, there is consensus that funding is relevant to theoretical replication and reflects a dimension similar to the CVA, specific to funding a given investment strategy. This is simply a way of saying that funding a trade is analogous to funding its hedging strategy, and this consideration leads to the practical need for FVA in certain situations. Some important realities of modern financial markets that involve funding costs related to derivative transactions are as follows (Green, 2016):

1. Uncollateralized derivative transactions can lead to balance sheet investments with effective loan or deposit profiles. Loan transfer pricing within banks incorporates these exposure profiles in determining funding costs and benefits for trading units.

2. The details of the CSA between counterparties affect the availability of posted funds for use in collateral obligations outside the netting set. The presence or absence of the ability to rehypothecate collateral is critical to properly account for funding costs of market risk hedges.

3. The majority of the time, derivatives cannot be used as collateral for a secured loan as in the case of a bond. This reflects the lack of a repo market for derivatives, which means that funding is decided at the level of the individual investment business rather than in the market, requiring detailed accounting for FVA.

Central to FVA is the idea that funding strategies are implicit in the derivatives portfolio (essentially by replication arguments); however, the funding policy is typically specific to each business, and so incorporating this in a value adjustment removes the symmetry of one price for each transaction. This leads to a general picture of a balance sheet value which differs from the mutually agreed and traded price. This lack of symmetry in the business accounting problem is one of the reasons that FVA has yet to be formally required on the balance sheet through regulation, though IFRS standards could change in the near future to account for the increasing importance of FVA (Albanese and Andersen, 2014; Albanese et al., 2014).

The details of funding costs are also linked to collateral modeling, which also has a strong impact on CVA/DVA; thus in many cases it becomes impossible to cleanly separate the FVA and CVA/DVA calculations in an additive manner. Modern FVA models are essentially exposure models which derive closely from CVA, and fall into two broad classes, namely expectation approaches using Monte Carlo simulation and replication approaches using PDEs (Burgard and Kjaer, 2011; Brigo et al., 2013).


1.1.2 Regulatory capital requirements

Legal frameworks for ensuring that financial institutions hold sufficient surplus capital, commensurate with their financial investment risk, have become increasingly important on a global scale. Historically speaking, regulatory capital requirements (RCR) were not a major consideration for the front-office trading desk, because the capital holdings were not included in derivative pricing. Indeed, the original capitalization rules of the 1988 Basel Capital Accord lacked sensitivity to counterparty credit risk, which was acknowledged as enabling the reduction of RCR without actually reducing financial risk-taking.

A major revision to the Basel framework came in 2006 (Basel II) to address the lack of risk sensitivity; it also allowed for supervisory approval for banks to use internal models to calculate counterparty default risk, based on calibrated counterparty default probability and loss-given-default. The new capital rules led to a sizable increase in RCR directly associated with the front-office derivatives trading business, led to a subsequent need to focus on the risks associated with derivatives trading, and called for the careful assessment and modeling of RCR on the trading book. The inflated importance of regulatory capital for derivatives trading has led to the creation of many new front-office capital management functions, whose task is to manage the capital of the bank deployed in support of trading activities.

One serious shortcoming of Basel II was that it did not take into account the possibility of severe mark-to-market losses due to CVA alone. This was acknowledged in the aftermath of the 2008 financial crisis, where drastic widening of counterparty credit spreads resulted in substantial balance sheet losses due to CVA, even in the absence of default. The revised Basel III framework proposed in 2010 significantly increases counterparty risk RCR, primarily by introducing an additional component for CVA, which aims to capitalize the substantial risk apparent in CVA accounting volatility. The need to accurately model capital holdings for the derivatives trading book, which in turn requires accurate modeling of risk-neutral valuation including counterparty credit risk value adjustments, potentially introduces a much greater level of complexity into the front-office calculation infrastructure. This has led to a dichotomy of RCR calculation types with varying degrees of complexity, applicable to different institutions with respective degrees of computational sophistication.

To address the widening gap between accurate but expensive computation and less sophisticated investors, a family of simpler calculation methods has been made available. These approaches take as input only properties of the trades themselves and thus lack the risk sensitivity that is included in the advanced calculation methods of RCR for counterparty default and CVA risks. There is a hurdle for model calculations, involving backtesting on PnL attribution, which is put in place so that special approval is required to make use of internal model calculations. The benefit of advanced calculation methods is improved accuracy compared with standard approaches, and greater risk sensitivity in RCR, allowing more favorable aggregation and better recognition of hedges. These factors typically allow advanced methods to provide capital relief, resulting in a competitive advantage in some cases.

1.1.2.1 Calculation of market risk capital

Under Basel II, a standardized method for market risk capital was introduced, which is based on a series of formulas to generate the trading book capital requirement, with different approaches to be used for various categories of assets. The formulas do not depend on market conditions, only on properties of the trades themselves, with overall calibration put in place by the Basel Committee to achieve adequate capital levels.

For banks approved to use internal risk models under the internal model method (IMM) for determining the trading book capital requirement, this is typically based on Value-at-Risk (VaR) using a one-sided 99% confidence level based on 10-day interval price shock scenarios with at least 1 year of time-series data. In Basel III, it is required to include a period of significant market stress relevant for the bank to avoid historical bias in the VaR calculation. While VaR calculation can be presented as a straightforward percentile problem of an observed distribution, it is also nonlinear and generally requires recalculating for each portfolio or change to an existing portfolio. In addition, the extensive history of returns that must be calculated and stored, as well as the definition of suitable macroeconomic scenarios at the level of pricing model inputs, both represent significant challenges around implementing and backtesting of VaR relating to computational performance, storage overhead, and data management.

There has been criticism of VaR as an appropriate measure of exposure since it is not a coherent risk measure, due to its lack of subadditivity. This diminished recognition of diversification has been addressed in a recent proposal titled The Fundamental Review of the Trading Book (FRTB), part of which is the replacement of VaR with expected shortfall (ES), which is less intuitive than VaR but does not ignore the severity of large losses. However, theoretical challenges exist since ES is not elicitable, meaning that backtesting for regulatory approval is a computationally intensive endeavor which may depend on the risk model being used by the bank (Acerbi and Székely, 2014). A percentile-style sketch of both measures follows.
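As a toy illustration of the percentile view of VaR and the tail average defining ES, consider the following sketch over a hypothetical P&L sample; production systems add scenario weighting, scaling, and backtesting machinery.

```cpp
// Sketch (illustrative): historical VaR as a percentile of observed P&L,
// and expected shortfall as the mean loss beyond that percentile.
#include <algorithm>
#include <iostream>
#include <numeric>
#include <vector>

int main() {
    // Hypothetical 10-day P&L scenarios (negative = loss); tiny sample
    // for illustration only.
    std::vector<double> pnl = {-5.2, 1.3, -0.7, 2.1, -3.9, 0.4, -8.1,
                               1.9, -1.2, 0.8, -2.5, 3.0, -6.6, 0.1};
    std::sort(pnl.begin(), pnl.end());  // worst losses first

    const double alpha = 0.99;
    const std::size_t k =
        static_cast<std::size_t>((1.0 - alpha) * pnl.size());
    const double var = -pnl[k];  // VaR: loss at the (1 - alpha) quantile

    // ES: average of losses at least as severe as VaR.
    const double es =
        -std::accumulate(pnl.begin(), pnl.begin() + k + 1, 0.0)
        / static_cast<double>(k + 1);
    std::cout << "VaR(99%): " << var << ", ES(99%): " << es << "\n";
}
```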

1.1.2.2 Credit risk capital

The Basel II framework developed a choice of methods for exposure-at-default (EAD) for each counterparty netting set: both a standardized method and an advanced method. A serious challenge was posed in defining EAD, due to the inherently uncertain nature of credit exposure driven by market risk factors, correlations, and legal terms related to netting and collateral. Consistently with previous methods, SA-CCR treats the exposure at default as a combination of the mark-to-market and the potential future exposure. If a bank receives supervisory authorization, it can use advanced models to determine capital requirements under the IMM, which is the most risk-sensitive approach for EAD calculation under the Basel framework. EAD is calculated at the netting set level and allows full netting and collateral modeling. However, gaining IMM approval takes significant time and requires a history of backtesting which can be very costly to implement.

During the financial crisis, roughly two-thirds of losses attributed to counterparty credit were due to CVA losses and only about one-third were due to actual defaults. Under Basel II, the risk of counterparty default and credit migration risk was addressed, for example, in credit VaR, but mark-to-market losses due to CVA were not. This has led to a Basel III proposal that considers fairly severe capital charges against CVA, with relatively large multipliers on risk factors used in the CVA RCR calculation, compared with similar formulas used in the trading book RCR for market risk.

The added severity of CVA capital requirements demonstrates a lack of alignment between the trading book and CVA in terms of RCR, which is intentional to compensate for structural differences in the calculations. As CVA is a nonlinear portfolio problem, the computational expense is much greater, often requiring complex simulations, compared with the simpler valuations for instruments in the trading book. The standard calculations of risk involve revaluation after bumping each risk factor. The relative expense of CVA calculations means that in practice fewer risk factors are available, and thus capital requirements must have larger multipliers to make up for the lack of risk sensitivity.

However, one technology in particular has emerged to level the playing field and facilitate better alignment between market risk and CVA capital requirements. AD, described in Section 1.3.2, reduces the cost of risk factor calculations by eliminating the computational expense of adding additional risk factors. With AD applied to CVA pricing, the full risk sensitivity of CVA capital requirements can be achieved without significantly increasing the computational effort, although the practical use of AD can involve significant challenges in the software implementation of CVA pricing.

1.1.2.3 Capital value adjustment

The ability to account for future capital requirements is a necessity for a derivatives trading business which aims to stay capitalized over a certain time horizon. The complexity of the funding and capitalization of the derivatives business presents a further need to incorporate funding benefits associated with regulatory capital, as well as capital requirements generated from hedging trades, into the derivative pricing problem as a capital value adjustment (KVA). As in the case of CVA and FVA, the interconnected nature of value adjustments means that it is not always possible to fully separate KVA from FVA and CVA in an additive manner. Recent work (Green, 2016) has developed several extensions to FVA replication models to incorporate the effects associated with KVA in single asset models.


The current and future landscape of RCR appears to be driven by calculations that are risk sensitive, and thus highly connected to the valuation models and market risk measures. This is a complex dynamic to include in the valuation adjustment since it links mark-to-market derivatives accounting with exogenous costs associated with maintaining sufficient capital reserves that, in turn, depend on market conditions, analogously to counterparty default and operational funding costs. Challenges arise due to the use of real-world risk measures to calculate exposures, which are at odds with arbitrage-free derivative pricing theory. This requires sophisticated approximation of the relationship between the real-world risk measures used in regulatory EAD calculation and the market-calibrated exposure profile used in risk-neutral CVA pricing (Elouerkhaoui, 2016).

Another significant computational challenge for KVA is related to structural misalignment of the capital requirement formulas with the hedge amounts determined by internal models. Under the current regulatory framework, there is still a possibility of reducing capital requirements by increasing risk as determined by internally approved models. Optimal hedge amounts must be carefully determined so as to maximize the profitability of the derivatives business within a certain level of market and counterparty risk, while simultaneously reducing the capital requirements associated with hedging activity. The overall picture is thus pointing towards a growing need for accuracy in KVA as a part of derivatives hedge accounting, despite the current lack of any formal requirement for KVA in the IFRS standards.

1.2 Trading and Hedging

Historically, counterparty credit risk and capital adjustments played only a minor role within financial institutions, often delegated away from front-office activity and thus covered by departments such as risk and finance. Given the relatively low capital requirements prior to the financial crisis, the profit centers were simply charged, on a determined frequency ranging from monthly to yearly, against their relevant capital usage, but the impact was negligible for almost all trading activity decisions. This also meant that front-office model development followed a rapid path of innovation whilst counterparty credit risk and capital modeling was only considered a needed by-product of operation. Realistically, only CVA mattered for trades against counterparties where no collateral agreement was in place, such as corporates, for which the bank could end up in the situation of lending a significant amount of money, which is already the case for a vanilla interest rate swap. Given credit spreads of almost zero for all major financial institutions, neither DVA nor FVA was worth considering.

Given the strong emphasis of both accounting and regulation on xVArelated charges ranging from CVA/DVA to FVA and even KVA, it is of

Trang 38

fundamental importance to have accurate pre- and post-trade informationavailable Both are a matter of computational resources albeit with a slightlydifferent emphasis Whilst calculation speed is critical for decision making inpretrade situations, post-trade hedging and reporting is predominantly a mat-ter of overall computational demand only limited by the frequency of reportingrequirements, which in case of market risk require a daily availability of fairvalues, xVA and their respective risk sensitivities A similar situation holdsfor hedging purposes where portfolio figures don’t always need to be avail-able on demand but calculations on a much coarser frequency are sufficient.

It is important to point out in particular that in periods of stress, the ability to generate accurate post-trade analysis becomes more important, as volatile markets will also make risk and hedging figures fluctuate more and thus potentially require them to be run at a higher frequency in order to support decisions ranging from individual hedges all the way to unwinding specific business activity, closing out desks or raising more regulatory capital. Since it is not economical to simply warehouse idle computational resources for the event of a crisis, one would either have to readjust current usage where possible or rely on other scaling capabilities, a point we investigate in more detail in Section 1.3.1.

1.3 Technology

There is a multitude of technology elements supporting derivative pricing, ranging from hardware (see Section 1.3.1) supporting the calculation to software technology such as algorithmic adjoint differentiation (see Section 1.3.2).

In the discussion below, we focus predominantly on the computational demand related to derivative pricing and general accounting and regulatory capital calculations, and less on the needs related to high-frequency applications, including the colocation of hardware on exchanges.

1.3.1 Hardware

Choice of hardware is a critical element in supporting the pricing, risk management, accounting and regulatory reporting requirements of financial institutions. In the years prior to the financial crisis, pricing and risk management were typically facilitated by in-house developed pricing libraries written in an object-oriented programming language (C++ being a popular choice) running on regular CPU hardware and the respective distributed calculation grids.

1.3.1.1 Central processing unit/floating point unit

Utilizing an object-oriented programming language allowed for quick integration of product, model, and pricing infrastructure innovation into production, which is a critical element for a structured product business to stay ahead of the competition. Even with product and model innovation being of less importance post-crisis, being able to reutilize library components aids the harmonization of pricing across different asset classes and products, which is particularly important in light of regulatory scrutiny. C++ has been the programming language of choice as one can take full advantage of hardware developments in the floating point unit, writing performance-critical code very close to the execution unit. This is particularly important as CPU manufacturers utilize the increase in transistor density to allow for greater vectorization of parallel instructions (SSEx, AVX, AVX2, ...), going hand in hand with the respective compiler support.
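To make this concrete, the following minimal sketch (our illustration, not code from any production library; all names are hypothetical) evaluates the call payoff max(S − K, 0) over an array of simulated spot values using AVX intrinsics, processing eight single-precision numbers per instruction, with a scalar loop for the remainder:

    #include <immintrin.h>   // AVX intrinsics; compile with -mavx (or /arch:AVX)
    #include <cstddef>

    // Payoff kernel: payoffs[i] = max(spots[i] - strike, 0), 8 floats at a time.
    void call_payoff_avx(const float* spots, float* payoffs,
                         std::size_t n, float strike)
    {
        const __m256 k    = _mm256_set1_ps(strike);   // broadcast the strike
        const __m256 zero = _mm256_setzero_ps();

        std::size_t i = 0;
        for (; i + 8 <= n; i += 8) {
            __m256 s   = _mm256_loadu_ps(spots + i);               // load 8 spots
            __m256 pay = _mm256_max_ps(_mm256_sub_ps(s, k), zero); // max(S - K, 0)
            _mm256_storeu_ps(payoffs + i, pay);                    // store 8 payoffs
        }
        for (; i < n; ++i)                                         // scalar tail
            payoffs[i] = spots[i] > strike ? spots[i] - strike : 0.0f;
    }

In practice a modern optimizing compiler will often auto-vectorize the equivalent scalar loop; explicit intrinsics are mainly warranted in performance-critical kernels where the generated instructions must be controlled.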

1.3.1.2 Graphics processing unit

Graphics processing units (GPUs) lend themselves very naturally to the computationally expensive tasks in the area of computational finance. The design of GPUs takes the well-established paradigm of multithreading to the next level, offering advantages in scale and, importantly, in energy usage. In particular, Monte Carlo simulations lend themselves very naturally to multithreading due to their inherently parallel nature, where sets of paths can be processed efficiently in parallel. Thus, GPUs offer a very powerful solution for well-encapsulated high-performance tasks in computational finance. One of the key difficulties in utilizing the full capabilities of GPUs is the integration into the analytics framework, as most of the object management, serialization and interface functionality needs to be handled by a higher-level language such as C++, C#, Java, or even Python. All of these offer seamless integration with GPUs, where offloading a task only requires some simple data transfer and dedicated code running on the GPU. However, this integration task does make the usage of GPUs a deliberate choice, as one typically has to support more than one programming language and hardware technology. Additionally, upgrades to GPUs make a reconfiguration or even a code rewrite not unlikely in order to fully utilize the increased calculation power, which is no different from the greater levels of vectorization on CPUs.
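As an illustration of the offloading pattern (our sketch, written in the CUDA dialect of C++; all names are hypothetical, and host-side allocation and kernel launch are assumed), each GPU thread below simulates one geometric Brownian motion path and stores the discounted call payoff:

    #include <curand_kernel.h>   // device-side random number generation (cuRAND)

    // One thread per Monte Carlo path: simulate a GBM terminal value and
    // store the discounted call payoff.
    __global__ void mc_call_payoffs(float S0, float K, float r, float sigma,
                                    float T, int nSteps,
                                    unsigned long long seed,
                                    float* payoffs, int nPaths)
    {
        int tid = blockIdx.x * blockDim.x + threadIdx.x;
        if (tid >= nPaths) return;

        curandState rng;
        curand_init(seed, tid, 0, &rng);   // independent RNG stream per path

        float dt    = T / nSteps;
        float drift = (r - 0.5f * sigma * sigma) * dt;
        float vol   = sigma * sqrtf(dt);

        float logS = logf(S0);
        for (int i = 0; i < nSteps; ++i)
            logS += drift + vol * curand_normal(&rng);   // one normal draw per step

        payoffs[tid] = expf(-r * T) * fmaxf(expf(logS) - K, 0.0f);
    }

    // A typical host-side launch, one thread per path:
    //   mc_call_payoffs<<<(nPaths + 255) / 256, 256>>>(S0, K, r, sigma, T,
    //                                                  nSteps, seed, dPayoffs, nPaths);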

1.3.1.3 Field programmable gate array

Field programmable gate arrays (FPGAs) offer high-performance solutions close to the actual hardware. At the core of FPGAs are programmable logic blocks which can be "wired" together using hardware description languages. At the very core, the developer is dealing with simple logic gates such as AND and XOR. Whilst both CPUs and GPUs have dedicated floating point units (FPUs) specifically designed to sequentially execute individual tasks at high speed, FPGAs allow systems to take advantage not only of a high level of parallelism similar to multithreading, but in particular of creating a pipeline of calculation tasks. At any clock cycle (configurable by the FPGA designer), the FPGA will execute not only a single task: every instruction along the calculation chain is executed. A Monte Carlo simulation on an FPGA can simultaneously execute the tasks of uniform random number generation, inverse cumulative function evaluation, path construction, and payoff evaluation, and every sub-step in this process will also have full parallelization and pipelining capacity. Therefore, even at a far lower clock speed, one can execute many tasks along a "pipe", with further horizontal acceleration possible. This allows a far higher utilization of transistors compared to CPUs/GPUs, where most parts of the FPU are idle as only a single instruction is dealt with at a time. One of the core difficulties of FPGAs is that one not only needs to communicate data between the CPU and the FPGA, but the developer also needs to switch programming paradigm. Both CPUs and GPUs can handle a new set of tasks with relative ease by a mere change of the underlying assembly/machine code, whilst the FPGA requires much more dedicated reconfiguration of the executing instructions. Whilst FPGAs would offer even greater calculation speedups and in particular a significant reduction in energy usage, it is yet to be seen whether they provide a true competitor to traditional hardware and software solutions. That being said, FPGAs do have widespread usage in the context of high-frequency trading, in particular colocated on exchange premises, as they offer the ultimate hardware acceleration for a very dedicated task.

1.3.1.4 In-memory data aggregation

Significant increases in in-memory data storage capacity have opened new capabilities for the calculation and aggregation of risk, counterparty and regulatory data. Calculation-intensive computations such as Monte Carlo simulation for counterparty credit risk used to require data aggregation as an in-built feature for the efficiency of the algorithm, where one had to discard a significant amount of intermediate calculation results, such as the realization of an individual index at any time-discretization point. Thus, any change in the structure of the product (or counterparty, or collateral agreement) required a full rerun of the Monte Carlo simulation, recreating the same intermediate values. Whilst storage capacity made this the only viable solution a few years ago, memory storage on 64-bit systems does allow for consideration of more efficient alternatives. In-memory data aggregation in the context of computational finance provides the distinctive advantage that more business decisions can be made in real time, or further high-level optimization can be deployed, such as choosing the best counterparty to trade with considering collateral cost, counterparty credit risk and regulatory capital requirements, even keeping the product features the same.
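A minimal sketch of the idea (our own, with hypothetical types and names): rather than collapsing the simulation into exposure profiles on the fly, the raw per-trade, per-time-step, per-path values are kept in memory, so that exposures for any regrouping of trades, for example a changed netting set or collateral agreement, can be aggregated on demand without rerunning the Monte Carlo:

    #include <vector>
    #include <algorithm>
    #include <cstddef>

    // Simulated values retained in memory: cube[trade][timeStep][path].
    using SimCube = std::vector<std::vector<std::vector<double>>>;

    // Expected positive exposure profile for an arbitrary netting set,
    // aggregated from the stored path-level values -- no re-simulation needed.
    std::vector<double> epe_profile(const SimCube& cube,
                                    const std::vector<std::size_t>& nettingSet,
                                    std::size_t nSteps, std::size_t nPaths)
    {
        std::vector<double> epe(nSteps, 0.0);
        for (std::size_t t = 0; t < nSteps; ++t) {
            for (std::size_t p = 0; p < nPaths; ++p) {
                double netted = 0.0;
                for (std::size_t trade : nettingSet)   // net across the set
                    netted += cube[trade][t][p];
                epe[t] += std::max(netted, 0.0);       // positive part only
            }
            epe[t] /= static_cast<double>(nPaths);     // average over paths
        }
        return epe;
    }

The trade-off is memory: a cube of 10,000 trades, 100 time steps and 10,000 paths in double precision occupies roughly 80 GB, which is exactly the scale that 64-bit in-memory systems have made practical.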

1.3.2 Software

While the impact of hardware choices on computational performance is critical, the impact of software choices can be overlooked. Within software, optimization efforts tend to fall into at least two categories: first, the application
