[R] Performance (speed) of ggplot

ONKELINX, Thierry Thierry.ONKELINX at inbo.be
Fri Sep 26 12:09:27 CEST 2014

You are using ggplot2 very inefficiently. Many of your geoms plot only one data point. You can combine several of them into a single geom. Have a look at the gridExtra package, which has some useful functions like grid.arrange and tableGrob.
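A minimal sketch of the idea (the data values here are hypothetical, just for illustration): instead of adding one geom per point, put all the points in a single data frame and draw them with one geom, then use gridExtra to lay out plots and tables side by side.

```r
library(ggplot2)
library(gridExtra)  # provides grid.arrange() and tableGrob()

# Inefficient pattern: one geom layer per data point, e.g.
#   ggplot() + geom_point(aes(1, 2)) + geom_point(aes(3, 4)) + ...

# Efficient pattern: all points in one data frame, a single geom layer
df <- data.frame(x = c(1, 3, 5), y = c(2, 4, 6))
p  <- ggplot(df, aes(x, y)) + geom_point()

# Combine the plot with a table of summary statistics in one layout
stats <- data.frame(metric = c("mean", "sd"),
                    value  = c(mean(df$y), sd(df$y)))
grid.arrange(p, tableGrob(stats), ncol = 2)
```

Each geom layer carries its own build and rendering overhead, so collapsing many single-point layers into one layer over a data frame removes most of the per-layer cost.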

Best regards,

ir. Thierry Onkelinx
Instituut voor natuur- en bosonderzoek / Research Institute for Nature and Forest
team Biometrie & Kwaliteitszorg / team Biometrics & Quality Assurance
Kliniekstraat 25
1070 Anderlecht
+ 32 2 525 02 51
+ 32 54 43 61 85
Thierry.Onkelinx at inbo.be

To call in the statistician after the experiment is done may be no more than asking him to perform a post-mortem examination: he may be able to say what the experiment died of.
~ Sir Ronald Aylmer Fisher

The plural of anecdote is not data.
~ Roger Brinner

The combination of some data and an aching desire for an answer does not ensure that a reasonable answer can be extracted from a given body of data.
~ John Tukey

-----Original message-----
From: r-help-bounces at r-project.org [mailto:r-help-bounces at r-project.org] On behalf of Christopher Battles
Sent: Thursday, 25 September 2014 20:30
To: r-help at r-project.org
Subject: [R] Performance (speed) of ggplot

Hello list,

I have been working on learning ggplot for its extraordinary flexibility compared to base plotting and have been developing a function to create a "Minitab-like" process capability chart.

*sigh* some of the people I interface with can only understand the data when it is presented in Minitab format

The function creates a ggplot container to hold 10 ggplot items, which are the main process capability chart, a Q-Q plot, and the text boxes with all the capability data.  When I run the function, the elapsed time is on the order of 3 seconds, the vast majority of which is user time.  sys time is very small.  A bit of hacking shows that the calls to

gt1 <- ggplot_gtable(ggplot_build(p)),

etc., each take on the order of 1/3 of a second. These times are on a 3.2 GHz Xeon workstation.  I'd like to see the entire function complete in less than a second.  My questions are:
1) Am I misusing ggplot, hence the performance hit?
2) Is there any way to increase the speed of this portion of the code?
3) Am I simply asking ggplot to crunch so much that it is inevitable that it will take a while to process?
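For anyone wanting to reproduce this kind of measurement, a sketch of how the per-call timing can be taken with system.time (the plot here is a stand-in built from the bundled mtcars data, not the original capability chart):

```r
library(ggplot2)

# A stand-in plot; the original function builds ten such objects
p <- ggplot(mtcars, aes(wt, mpg)) + geom_point()

# Time the build/gtable step that dominates the runtime;
# the "user" component is what accumulates across the ten calls
timing <- system.time({
  gt1 <- ggplot_gtable(ggplot_build(p))
})
print(timing)
```

Because each ggplot_build/ggplot_gtable pair is independent, ten such calls at roughly a third of a second each accounts for the ~3 second total the function shows.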

To that end, the function, vectis.cap(), can be downloaded from http://pastebin.com/05s5RKYw .  It runs to 962 lines of code, so I won't paste it here.  The offending ggplot_gtable calls are at lines 909 - 918.

vectis.cap(chickwts$weight, target = 300, USL = 400, LSL = 100)

Thank you,

Christopher Battles

R-help at r-project.org mailing list
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
* * * * * * * * * * * * * D I S C L A I M E R * * * * * * * * * * * * *
The views expressed in this message and any annex are purely those of the writer and may not be regarded as stating an official position of INBO, as long as the message is not confirmed by a duly signed document.
