# Quantitative Finance Collector

Topics: Options, Mathematical finance, Option | Pages: 49 (9371 words) | Published: January 11, 2013
Author: abiao

Published: 2010
Categories: Non-Fiction, Business & economics, Finance
Tag(s): "quantitative finance" "financial engineering" "mathematical finance" quant "quantitative trading"


Quantitative Finance Collector is simply a record of my financial engineering learning journey as a master's student in quantitative finance, a PhD candidate in finance, and a quantitative researcher. It mainly covers quantitative finance code and methods in mathematical finance, focusing on derivative pricing, quantitative trading, and quantitative risk management, with most of the entries written at university.


www.feedbooks.com
Food for the mind



Updated: 8-10


## Handling Large CSV Files in R
A follow-up to my previous post Excellent Free CSV Splitter. I asked a question on LinkedIn about how to handle large CSV files in R / Matlab. Specifically:

> Suppose I have a large CSV file with over 30 million rows; both Matlab and R run out of memory when importing the data. Could you share your way of handling this issue? What I am thinking is:
> a) split the file into several pieces (free and straightforward, but hard to maintain);
> b) use MS SQL / MySQL (I would have to learn it; MS SQL isn't free, and it is not straightforward).

A useful summary of the suggested solutions:

1. Import the large file via `scan` in R; convert the result to a `data.frame` to keep the data formats; then use `cast` to group the data into a format as "square" as possible. This last step involves the reshape package, a very good one.
2. Use the bigmemory package to load the data, so in my case using `read.big.matrix()` instead of `read.table()`. There are several other interesting functions in this package, such as `mwhich()` replacing `which()` for memory reasons, `foreach()` instead of `for()`, etc. How large a file can this package handle? I don't know, but the authors successfully loaded a CSV as large as 11GB.

3. Switch to a 64-bit version of R with enough memory, preferably on Linux. I can't test this solution at my office due to administrative constraints, although it is doable, as mentioned in the R help documentation:

   > 64-bit versions of Windows run 32-bit executables under the WOW (Windows on Windows) subsystem: they run in almost exactly the same way as on a 32-bit version of Windows, except that the address limit for the R process is 4GB (rather than 2GB or perhaps 3GB). ... The disadvantages are that all the pointers are 8 rather than 4 bytes and so small objects are larger and more data has to be moved around, and that far less external software is available for 64-bit versions of the OS.
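The chunked-reading idea behind solution 1 can be sketched with base R's `scan()` on an open connection, so the whole file never sits in memory at once. The function name, column count, chunk size, and the per-chunk aggregation (a plain sum) below are illustrative assumptions for a numeric CSV, not the original workflow.

```r
# Sketch: read a large numeric CSV in fixed-size chunks via scan() on a
# connection; each call to scan() continues where the previous one stopped.
process_in_chunks <- function(path, n_cols, chunk_rows = 1e6) {
  con <- file(path, open = "r")
  on.exit(close(con))
  total <- 0
  repeat {
    vals <- scan(con, what = numeric(), sep = ",",
                 nmax = chunk_rows * n_cols, quiet = TRUE)
    if (length(vals) == 0) break
    chunk <- matrix(vals, ncol = n_cols, byrow = TRUE)
    total <- total + sum(chunk)  # aggregate per chunk, then discard it
  }
  total
}

# Tiny demo file standing in for the 30-million-row CSV
tmp <- tempfile(fileext = ".csv")
write.table(matrix(1:20, ncol = 2, byrow = TRUE), tmp,
            sep = ",", row.names = FALSE, col.names = FALSE)
result <- process_in_chunks(tmp, n_cols = 2, chunk_rows = 3)
```

With the bigmemory package (solution 2), the same file would instead be loaded with `read.big.matrix(tmp, sep = ",", type = "double")`, keeping the data file-backed rather than forcing it all into RAM.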

Search & trial.
Tags - r, csv


## Excellent Free CSV Splitter
Let me share an excellent free CSV splitter I found recently. My CSV file is too large to be opened in Matlab or R, so I have to split it into several smaller files. As far as I have tried, Matlab and R complain of being short of memory when reading a CSV file with more than 10,000,000 rows (this may vary across computers), while my tick-by-tick corporate bond data has nearly 30,000,000 rows.

This CSV splitter allows you to split your large file into several smaller files, either by number of lines or by maximum number of pieces.

The amazing part is that the smaller files keep the original header of the big CSV file, very cool. Download the free CSV splitter here.
Tags - csv, tool
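For readers who prefer to stay inside R, the header-preserving split described above can be sketched with base connections. The function `split_csv()`, the output prefix, and the demo sizes are all illustrative, not the tool's actual behavior.

```r
# Sketch: split a CSV into smaller files, each keeping a copy of the original
# header line, mirroring what the splitter tool does.
split_csv <- function(path, lines_per_file, out_prefix) {
  con <- file(path, open = "r")
  on.exit(close(con))
  header <- readLines(con, n = 1)       # read the header once
  n_pieces <- 0
  repeat {
    chunk <- readLines(con, n = lines_per_file)
    if (length(chunk) == 0) break
    n_pieces <- n_pieces + 1
    writeLines(c(header, chunk),        # prepend the header to every piece
               sprintf("%s%02d.csv", out_prefix, n_pieces))
  }
  n_pieces
}

# Demo: 5 data rows split into pieces of at most 3 rows each -> 2 files
tmp <- tempfile(fileext = ".csv")
writeLines(c("id,price", paste(1:5, 10 * (1:5), sep = ",")), tmp)
prefix <- file.path(tempdir(), "piece_")
n <- split_csv(tmp, lines_per_file = 3, out_prefix = prefix)
```

Because the file is streamed line by line through a connection, this approach works even when the full CSV would not fit in memory.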


## ISIN CUSIP Conversion
Long time no blog. Just to let you know I am still alive, busy with my own PhD research, collecting & cleaning data,...