Object Oriented Programming with R: An example with a Cournot duopoly

I started reading *Applied Computational Economics & Finance* by Mario J. Miranda and Paul L. Fackler. It is a very interesting book that I recommend to every one of my colleagues. The only issue I have with it is that the programming language the authors use is Matlab, which is proprietary. While there is a free (as in freedom) implementation of the Matlab language, namely Octave, I still prefer using R. In this post, I will reproduce one of the examples the authors present in the book with R, using the package `rootSolve`. `rootSolve` implements Newton-type algorithms to find the roots of functions; to specify the functions whose roots I want, I use R's Object-Oriented Programming (OOP) capabilities to build a model object that returns two functions. This is optional, but I found it a good example to illustrate OOP, even though simpler solutions exist, one of which was proposed by reddit user TheDrownedKraken (whom I thank) and is presented at the end of the article.
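Before getting to the model, here is what a call to `rootSolve::multiroot()` looks like on a toy function (this example is mine, not from the book):

```r
library(rootSolve)

# Toy example: find the root of f(x) = x^2 - 4 starting from x = 1.
# multiroot() uses a Newton-Raphson type algorithm and returns the root,
# the function value at the root, the number of iterations, and the precision.
f <- function(x) x^2 - 4

res <- multiroot(f = f, start = 1)
res$root  # close to 2
```

The optional `jacfunc` argument, used below, lets you supply the Jacobian analytically instead of having it approximated numerically.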

The example is taken from Miranda and Fackler's book, on page 35. The authors present a Cournot duopoly model, in which two firms compete against each other in quantities: both produce a certain quantity of a homogeneous good, and each takes the quantity produced by its rival as given.

The inverse demand for the good is:

$$P(q) = q^{-\dfrac{1}{\eta}}$$

the cost function for firm \(i\) is:

$$C_i(q_i) = \dfrac{1}{2}c_iq_i^2$$

and the profit for firm i:

$$\pi_i(q_1,q_2) = P(q_1+q_2)q_i - C_i(q_i)$$

The optimality condition for firm i is thus:

$$\dfrac{\partial \pi_i}{\partial q_i} = (q_1+q_2)^{-\dfrac{1}{\eta}} - \dfrac{1}{\eta} (q_1+q_2)^{-\dfrac{1}{\eta}-1}q_i - c_iq_i=0.$$
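This condition follows from differentiating the profit function with respect to \(q_i\), using the cost function \(C_i(q_i) = \frac{1}{2}c_iq_i^2\) and the product rule on the revenue term:

```latex
% pi_i(q_1, q_2) = (q_1 + q_2)^{-1/eta} q_i - (1/2) c_i q_i^2
\dfrac{\partial \pi_i}{\partial q_i}
  = \underbrace{(q_1+q_2)^{-\frac{1}{\eta}}}_{P(q_1+q_2)}
  + q_i \underbrace{\left(-\dfrac{1}{\eta}\right)(q_1+q_2)^{-\frac{1}{\eta}-1}}_{P'(q_1+q_2)}
  - \underbrace{c_i q_i}_{C_i'(q_i)} = 0
```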

If we want to find the optimal quantities \(q_1\) and \(q_2\), we need to program the optimality condition; we can also supply the Jacobian of the optimality condition, which is generally useful to speed up root-finding routines. This is where OOP is useful. First, let's create a new class called *Model*:

```
setClass(Class = "Model",
         slots = list(OptimCond = "function", JacobiOptimCond = "function"))
```

This new class has two *slots* (in general, *slots* are the attributes of a class), which here are functions: we need the model to return the optimality condition and the Jacobian of the optimality condition.
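As a side illustration of the mechanism (with toy functions, not the model's), slots are filled with `new()` and accessed with the `@` operator:

```r
# Toy S4 class with function-valued slots, mirroring the structure above
setClass(Class = "Toy",
         slots = list(f = "function", g = "function"))

obj <- new("Toy",
           f = function(x) x + 1,
           g = function(x) 2 * x)

obj@f(1)  # 2
obj@g(3)  # 6
```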

Now we can create a function that returns these two functions for given values of the parameters *c* and \(\eta\) of the model:

```
my_mod <- function(eta, c) {
  # note: the parameter `c` shadows base::c, but calls like c(2, 2)
  # still find the function, since R skips non-function bindings there
  e <- -1/eta
  # Optimality condition: a vector of length 2, one equation per firm
  OptimCond <- function(q) {
    sum(q)^e + e * sum(q)^(e - 1) * q - diag(c) %*% q
  }
  # Jacobian of the optimality condition: a 2x2 matrix
  JacobiOptimCond <- function(q) {
    (e * sum(q)^e) * array(1, c(2, 2)) +
      (e * sum(q)^(e - 1)) * diag(1, 2) +
      (e - 1) * e * sum(q)^(e - 2) * q * c(1, 1) -
      diag(c)
  }
  new("Model", OptimCond = OptimCond, JacobiOptimCond = JacobiOptimCond)
}
```

The function `my_mod` takes two parameters, `eta` and `c`, and returns an object containing two functions: the optimality condition and the Jacobian of the optimality condition. Both are accessible via `my_mod(eta = 1.6, c = c(0.6, 0.8))@OptimCond` and `my_mod(eta = 1.6, c = c(0.6, 0.8))@JacobiOptimCond` respectively (here with the values `1.6` for `eta` and `c(0.6, 0.8)` for `c`).

Now we can use the `rootSolve` package to get the optimal values \(q_1\) and \(q_2\):

```
library("rootSolve")
multiroot(f = my_mod(eta = 1.6, c = c(0.6, 0.8))@OptimCond,
          start = c(1, 1), maxiter = 100,
          jacfunc = my_mod(eta = 1.6, c = c(0.6, 0.8))@JacobiOptimCond)
```

```
## $root
## [1] 0.8396 0.6888
##
## $f.root
## [,1]
## [1,] -2.220e-09
## [2,] 9.928e-09
##
## $iter
## [1] 4
##
## $estim.precis
## [1] 6.074e-09
```

After 4 iterations, we find that \(q_1\) and \(q_2\) are equal to 0.84 and 0.69 respectively, which are the same values as in the book!

I posted this blog post on the rstats subreddit, where I got a very useful comment from reddit member TheDrownedKraken suggesting the following approach, which doesn't need a new class to be built. I thank him for it. Here is his suggestion:

```
generator <- function(eta, a) {
  e <- -1/eta
  OptimCond <- function(q) {
    sum(q)^e + e * sum(q)^(e - 1) * q - diag(a) %*% q
  }
  JacobiOptimCond <- function(q) {
    (e * sum(q)^e) * array(1, c(2, 2)) +
      (e * sum(q)^(e - 1)) * diag(1, 2) +
      (e - 1) * e * sum(q)^(e - 2) * q * c(1, 1) -
      diag(a)
  }
  list(OptimCond = OptimCond, JacobiOptimCond = JacobiOptimCond)
}

f.s <- generator(eta = 1.6, a = c(0.6, 0.8))
multiroot(f = f.s$OptimCond, start = c(1, 1), maxiter = 100,
          jacfunc = f.s$JacobiOptimCond)
```

```
## $root
## [1] 0.8396 0.6888
##
## $f.root
## [,1]
## [1,] -2.220e-09
## [2,] 9.928e-09
##
## $iter
## [1] 4
##
## $estim.precis
## [1] 6.074e-09
```