R Shiny web app: Extreme events
I’ve made a new app using Shiny, an R package for publishing interactive web applications. This app was produced for the Marine Downscaling project. It lets users examine frequencies (days per month) of extreme temperature and wind events projected into the future at various locations, primarily around coastal Alaska.
The modeled outputs come from three global climate models (GCMs), which have been regridded to a common resolution and quantile-mapped with respect to the observation-based European Reanalysis (ERA-40) dataset. Frequencies of events are displayed using a time series barplot. Bars can be drawn for any subset of months in each year. Additionally, with multiple months displayed per year, individual months can be highlighted with differently colored line plot overlays.
The key feature of the app is that the barplots are conditional barplots: a set of multiple time series barplots is displayed vertically for visual comparison, conditional on levels or classes of a given categorical variable. Users can view time series of extreme events for various specific climate variables, geographic locations, climate models, RCPs, and climate variable value thresholds. Users can select any one of these variables as a conditioning variable to generate a comparative visual display of extreme-events barplots by factor levels of the chosen variable. There is also the ability to switch between defining extreme events as the number of days per month above or below a given threshold.
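The app's actual plotting code isn't shown here, but the conditional-barplot idea can be sketched with ggplot2 facets. In this hypothetical example (the data, model names, and column names are all placeholders), one barplot panel is drawn per level of the conditioning variable and the panels are stacked vertically:

```r
library(ggplot2)

# Placeholder data: monthly counts of extreme-event days for two models.
set.seed(1)
d <- expand.grid(
  year  = 1960:2099,
  month = c("Jun", "Jul", "Aug"),
  model = c("GCM-A", "GCM-B")
)
d$days <- rpois(nrow(d), lambda = 5)

# One time series barplot per level of the conditioning variable (model),
# stacked vertically for comparison; months dodged within each year.
ggplot(d, aes(x = year, y = days, fill = month)) +
  geom_col(position = "dodge") +
  facet_grid(model ~ .) +
  labs(y = "Days per month above threshold")
```

Swapping `model` in `facet_grid()` for any other factor (location, RCP, threshold) gives the same comparative layout conditioned on that variable instead.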
For those more interested in R and Shiny, I’ll mention a bit more about the features. I’ve included a download button which allows the user to obtain a PDF of the currently displayed graphic. Adjacent to this in the sidebar panel, there is the option to toggle on/off a simple geographic map showing the grid cells around Alaska that are available in the app, as well as which are currently selected and included in the time series plot. The map graphic is not interactive, in the sense that you cannot click on it and make something happen. But it is reactive, in that it updates when inputs which affect the map are changed.
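A PDF download button of this kind is typically wired up with Shiny's `downloadHandler`. A minimal sketch, assuming a hypothetical reactive `plotInput()` that returns the currently displayed ggplot object:

```r
# server side: write the current plot to a PDF when the button is clicked
output$dl_plot <- downloadHandler(
  filename = function() paste0("extreme_events_", Sys.Date(), ".pdf"),
  content = function(file) {
    pdf(file, width = 10, height = 7)  # open a PDF graphics device
    print(plotInput())                 # render the current plot into it
    dev.off()
  }
)
```

On the UI side this pairs with `downloadButton("dl_plot", "Download PDF")`.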
I made use of wellPanel to segment out sections of related inputs in the sidebar panel, making it a bit more organized. I also used some HTML in the ui.R script to place inputs side by side. By default, all inputs stack vertically, so with as many inputs as this app has, they can run well off the screen and force a lot of annoying scrolling on the user. This alleviated that issue substantially.
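One way to get inputs side by side is Shiny's Bootstrap grid helpers rather than raw HTML; a sketch of a ui.R fragment (the input IDs and choices here are placeholders, not the app's real ones):

```r
# Two inputs sharing one row instead of stacking vertically.
fluidRow(
  column(6, selectInput("model", "Climate model",
                        choices = c("GCM-A", "GCM-B", "GCM-C"))),
  column(6, selectInput("rcp", "RCP",
                        choices = c("RCP 4.5", "RCP 6.0", "RCP 8.5")))
)
```

Raw HTML via `tags$div()` with inline CSS (closer to what the post describes) achieves the same effect with finer control.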
I included an About tab, which you can refer to for any information about the app perhaps not mentioned here. You may also notice the custom header, with title aligned left and organizational logo aligned right, kept on the same line to avoid too much vertical white space at the top of the app pushing down the sidebar and main panel.
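A header with the title aligned left and a logo aligned right on the same line can be built with floated elements; a rough sketch, in which the file path, sizes, and styling are all assumptions:

```r
# Compact one-line header: title floats left, logo floats right.
tags$div(
  tags$h2("Extreme Events", style = "float: left; margin: 5px;"),
  tags$img(src = "logo.png", height = 50,
           style = "float: right; margin: 5px;"),
  style = "overflow: hidden;"  # contain the floats so content below isn't pushed aside
)
```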
One thing that is not apparent from using the app is how I chose to load the data. At first, I sourced a single workspace file (.RData) containing the data for all locations in one truly massive dataframe. This turned out to be very inefficient: the app took a while to load, and once loaded, it also took a while to subset such a large dataframe for plotting based on user inputs. The first step was to break the dataframe out into a separate, smaller dataframe for each location, resulting in 39 dataframes. It still took just as long to load the workspace, and thus the app, but once loaded, the reshape2 package had to work on only a single, much smaller dataframe at any given time, greatly increasing the efficiency of the app. Only when the user conditions on location, with multiple locations selected, is it necessary to append several small dataframes together, which proved to be a good tradeoff since no one will plot more than three or four of these at a time.
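The split-then-recombine pattern can be sketched in base R (the object and column names here are assumptions, not the app's actual ones):

```r
# One-time preprocessing: break the master dataframe into one small
# dataframe per location (a named list with 39 elements in the app's case).
d.list <- split(d.all, d.all$location)

# Plotting normally touches a single element; only when conditioning on
# location are a few elements recombined for the comparative display:
selected <- c("LocationA", "LocationB")  # hypothetical user selection
d.sub <- do.call(rbind, d.list[selected])
```

Since users rarely compare more than a handful of locations, the occasional `rbind` of a few small dataframes is far cheaper than repeatedly subsetting one massive one.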
That still left the issue of loading time for the app. I decided it was wasteful to load all 39 locations from one workspace file before calling shinyServer. I instead shrunk that workspace file way down, to almost nothing, by excluding the 39 dataframes from it. These I wrote out as 39 separate workspace files, each of which is loaded into the global environment from within the shinyServer call only when specifically requested by the user. This took a little rethinking on my part, as I’ve not done much like this before. And I had to sprinkle a handful of envir=.GlobalEnv arguments throughout some functions inside the shinyServer call. Mainly, I added references to checkData() for this.
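The lazy-loading idea can be sketched roughly as follows; the file layout, naming scheme, and body of checkData() are assumptions, since the post doesn't show the actual implementation:

```r
# Load a location's dataframe into the global environment on first
# request only; subsequent calls find it already loaded.
checkData <- function(loc) {
  obj <- paste0("d.", loc)  # assumed naming convention, e.g. "d.LocationA"
  if (!exists(obj, envir = .GlobalEnv)) {
    # each location lives in its own small .RData file
    load(file.path("data", paste0(obj, ".RData")), envir = .GlobalEnv)
  }
  get(obj, envir = .GlobalEnv)
}
```

Loading into `.GlobalEnv` rather than the function's local environment is what makes the data persist across reactive calls, which is why those `envir=.GlobalEnv` arguments were needed throughout.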