The downside is that R is really slow... I have coded Monte Carlo simulations in C++ that take seconds but take minutes in R.
This could not be more true. I've tried experimenting with the parallel processing packages for R, but with no success.
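For reference, a minimal sketch of the usual approach with base R's parallel package; simulate_once() is a hypothetical stand-in for a single Monte Carlo draw, not anything from this thread:

    library(parallel)

    # Hypothetical placeholder for one simulation run.
    simulate_once <- function(i) {
      mean(rnorm(1e5))
    }

    # mclapply forks worker processes on Unix-like systems;
    # on Windows, makeCluster() plus parLapply() is the equivalent.
    results <- mclapply(seq_len(1000), simulate_once,
                        mc.cores = max(1, detectCores() - 1))
    estimate <- mean(unlist(results))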
This is definitely a problem. There are some packages trying to solve this, but the degree of success varies.
To me the downside of R is the plotting - not interactive, so it's much harder to explore the data naturally. What I might end up doing is feeding R through Python and matplotlib via RPy.
I've tried some of those packages and overall it's still pretty painful to get what you want, although there are some good parts.
I have used RPy and RPy2.
Which of the two would you recommend more, and what are the essential differences?
As far as speed, vectorize the code.
I really don't know what you are doing, but in my experience, R is never the problem.
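To make that concrete, a small illustrative comparison (the cumulative-sum task is made up for the example, not taken from anyone's code here):

    # Explicit loop: every iteration goes through the R interpreter.
    slow_cumsum <- function(x) {
      out <- numeric(length(x))
      total <- 0
      for (i in seq_along(x)) {
        total <- total + x[i]
        out[i] <- total
      }
      out
    }

    x <- rnorm(1e6)
    system.time(slow_cumsum(x))  # noticeably slower
    system.time(cumsum(x))       # vectorized built-in, near-instant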
I've run the exact same time series functions on the same data in R and Matlab. It took <1 second in Matlab and 30 seconds in R.
I'm working on a project where I have to recode / impute categorical variables. On a 2046 x 11 test dataset, R took somewhere between 15 and 30 seconds on average. I recoded that part in Bash and it gets the same job done in about a second. That said, R does have its strong points: easy EDA (exploratory data analysis) and plotting, very mature stats and machine learning packages, and it integrates with other major languages...
If you can code something in Bash that was written in R, it means R was used for the wrong reasons. R's strengths are far away from what Bash can do.
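For comparison, a vectorized way to do that kind of recode/impute step in R - a sketch only, since the actual columns and rules aren't given here; the status column and the mode-imputation rule are assumptions:

    # Toy data: one categorical column with missing values.
    df <- data.frame(status = sample(c("single", "married", NA),
                                     2046, replace = TRUE))

    # Recode labels with a named lookup vector (vectorized, no loop).
    recode_map <- c(single = "S", married = "M")
    df$status <- unname(recode_map[df$status])

    # Impute missing values with the most frequent level.
    mode_level <- names(which.max(table(df$status)))
    df$status[is.na(df$status)] <- mode_level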