geoknife

Methods for geo-web processing of gridded data

Introduction

The geoknife package was created to support web-based geoprocessing of large gridded datasets according to their overlap with landscape (or aquatic/ocean) features that are often irregularly shaped. geoknife creates data access and subsequent geoprocessing requests for the USGS’s Geo Data Portal to carry out on a web server. The results of these requests are available for download after the processes have been completed. This type of workflow has three main advantages: 1) it allows the user to avoid downloading large datasets, 2) it avoids reinventing the wheel for the creation and optimization of complex geoprocessing algorithms, and 3) computing resources are dedicated elsewhere, so geoknife operations do not have much of an impact on a local computer.

Because communication with web resources is central to geoknife operations, users must have an active internet connection. geoknife interacts with a remote server to discover processing capabilities, find already-available geospatial areas of interest (these are normally user-uploaded shapefiles), get gridded dataset characteristics, execute geoprocessing requests, and get geoprocessing results.

The main elements of setting up and carrying out a geoknife ‘job’ (geojob) include defining the feature of interest (the stencil argument in the geoknife function), the gridded web dataset to be processed (the fabric argument), and the processing algorithm parameters (the knife argument). The status of the geojob can be checked with check, and output can be loaded into a data.frame with loadOutput. See below for more details.
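
As a compact illustration of that workflow, here is a sketch using functions introduced later in this document; the point location and dataset are arbitrary examples:

library(geoknife)

# stencil (feature of interest), fabric (gridded dataset); the default knife is used
job <- geoknife(stencil = simplegeom(c(-89, 46.23)), fabric = webdata('prism'), wait = TRUE)
check(job)              # confirm the geojob completed
data <- loadOutput(job) # load the results into a data.frame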

Installation

To install the stable version of geoknife package with dependencies:

install.packages("geoknife", 
    repos = c("http://owi.usgs.gov/R"),
    dependencies = TRUE)

Or to install the current development version of the package:

install.packages("devtools")
devtools::install_github('USGS-R/geoknife')

getting started

The geoknife package was created to support web-based geoprocessing of large gridded datasets according to their overlap with landscape (or aquatic/ocean) features that are often irregularly shaped. geoknife creates data access and subsequent geoprocessing requests for the USGS’s Geo Data Portal to carry out on a web server.

geoknife concepts

geoknife has abstractions for web-available gridded data, geospatial features, and geoprocessing details. These abstractions are the basic geoknife arguments of fabric, stencil and knife.
* fabric defines the web data that will be accessed, subset, and processed (see the fabric section for more details). These data are limited to gridded datasets that are web-accessible through the definitions presented in the OPeNDAP section. Metadata for fabric include time, the URL for the data, and variables.
* stencil is the geospatial feature (or set of features) that will be used to delineate specific regions of interest on the fabric (see the stencil section for more details). stencil can include point or polygon groupings of various forms (including classes from the sp R package).
* knife defines the way the analysis will be performed, including the algorithm and version used, the URL that receives the processing request, the statistics returned, and the format of the results (see the knife section for more details).
* The geoknife() function takes the fabric, stencil, and knife, and returns a geojob, which is a live geoprocessing request that will be carried out on a remote web server (see the geojob section for more details). The geojob can be checked by users, and results can be parsed and loaded into the R environment for analyses.

remote processing basics

Because geoknife executes geospatial computations on a remote webserver, the workflow for executing geoprocessing operations may feel a bit foreign to users who usually perform their analyses on a local computer. To find available datasets and their details (variables, time range, etc.), geoknife must query remote servers, because data for use with geoknife are typically hosted on open access servers near the processing service. These operations are covered in detail below, but this section is designed to provide a quick overview.

Interactions with web resources may take on the following forms, each involving separate requests to various webservers (a short sketch of these interactions follows the list):

  1. Using the query function to figure out what data exist for fabric. This function will request data from a CSW (catalog service for the web) resource and return results, or, if a dataset is already specified, it can be used to query for the variables or time dimension.
  2. Using the query function to find web-available geometries for stencil, including US States, Level III Ecoregions, and many others.
  3. Submitting a geojob to be processed externally
  4. Checking the status of a geojob
  5. Loading the results from a successful geojob
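
The sketch below walks through interactions 2 through 5 using functions documented in the rest of this section; the dataset and geometry choices are just examples (item 1, dataset discovery via CSW, is described under the geoknife web resources section):

library(geoknife)

# 2. query a web resource for the geometry of the stencil (HUC8s here)
stencil <- webgeom('HUC8::09020306')
head(query(stencil, 'values'))

# 3. submit a geojob to be processed externally
fabric <- webdata('prism')
job <- geoknife(stencil, fabric)

# 4. check the status of the geojob
check(job)

# 5. load the results once the geojob has finished successfully
if (successful(job)) data <- loadOutput(job)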

quick start guide

There are various ways to get up and running quickly with geoknife. See the sections below for additional details on any of the following operations. As mentioned above, geoknife has the basic arguments of fabric, stencil and knife. knife is an optional argument; if it is not supplied, a default knife specifies the processing details.

define a stencil that represents the geographic region to slice out of the data

There are many different ways to specify geometry (stencil) for geoknife. The two basic functions that support building stencil objects are simplegeom and webgeom:

library(geoknife)

Use a single longitude latitude pair as the geometry with the simplegeom function:

stencil <- simplegeom(c(-89, 46.23))

Or specify a collection of named points in a data.frame (note that naming is important for multi-features because it specifies how the results are filtered):

stencil <- simplegeom(data.frame(
              'point1' = c(-89, 46), 
              'point2' = c(-88.6, 45.2)))

Use a web-available geometry dataset with the webgeom function to specify state boundaries:

stencil <- webgeom('state::New Hampshire')
stencil <- webgeom('state::New Hampshire,Wisconsin,Alabama')

or HUC8s (hydrologic unit code):

stencil <- webgeom('HUC8::09020306,14060009')
# display stencil:
stencil
## An object of class "webgeom":
## url: http://cida.usgs.gov/gdp/geoserver/wfs 
## geom: derivative:wbdhu8_alb_simp 
## attribute: HUC_8 
## values: 09020306 14060009 
## version: 1.1.0

see what other HUCs could be used via the query function:

HUCs <- query(stencil, 'values')
# there are thousands of results, but head() will only display a few of them
head(HUCs) 
## [1] "11060006" "11060005" "11060001" "11060004" "11060003"
## [6] "11060002"

define a fabric that represents the underlying data

The fabric is specified using the webdata function, and can be done explicitly or with a quick-start dataset name (such as ‘prism’):

fabric <- webdata('prism')
# display fabric:
fabric
## An object of class "webdata":
## times: 1895-01-01 1899-01-01 
## url: http://cida.usgs.gov/thredds/dodsC/prism 
## variables: ppt

The same can be done explicitly by passing a list to webdata:

fabric <- webdata(list(
            times = as.POSIXct(c('1895-01-01','1899-01-01')),
            url = 'http://cida.usgs.gov/thredds/dodsC/prism',
            variables = 'ppt'))

To modify the times in fabric, use times():

times(fabric) <- as.POSIXct(c('1990-01-01','2005-01-01'))

Similar to webgeom, the query method can be used on webdata objects:

query(fabric, 'times')
query(fabric, 'variables')

create the processing job that will carry out the subsetting/summarization task

job <- geoknife(stencil, fabric)
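
A knife can also be supplied explicitly; when it is omitted, a default webprocess object (described in the knife object section below) is used:

knife <- webprocess()
job <- geoknife(stencil, fabric, knife = knife)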

use convenience functions to check on the job:

check(job)
## $status
## [1] "Process Started"
## 
## $URL
## NULL
## 
## $statusType
## [1] "ProcessStarted"
running(job)
## [1] TRUE
error(job)
## [1] FALSE
successful(job)
## [1] FALSE

Cancel a running job:

job <- cancel(job)

Run the job again, but have R wait until the process is finished:

job <- geoknife(stencil, fabric, wait = TRUE)
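
If you prefer to keep the default wait = FALSE and poll manually, a simple loop around the running function shown above works as well (a sketch, assuming running re-queries the job status on each call):

job <- geoknife(stencil, fabric)
while (running(job)) {
  Sys.sleep(10) # pause between status checks so the server is not polled too often
}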

Load up the output and plot it

data <- loadOutput(job)
plot(data[,1:2], ylab = variables(fabric))

For long running processes, it often makes sense to use an email listener:

job <- geoknife(webgeom('state::Wisconsin'), fabric = 'prism', email = 'fake.email@gmail.com')

spatial features (stencil)

The stencil concept in geoknife represents the area(s) of interest for geoprocessing. stencil can be represented by two classes in geoknife: simplegeom and webgeom. Any other class that can be coerced into one of these two classes (such as a data.frame) can also be used.
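
For example, based on the coercion behavior described above (a sketch; the exact set of coercible classes may differ), a data.frame of named longitude/latitude points can be passed directly as the stencil:

points <- data.frame('point1' = c(-89, 46), 'point2' = c(-88.6, 45.2))
job <- geoknife(stencil = points, fabric = webdata('prism'), wait = TRUE)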

simplegeom object

The simplegeom class is designed to hold spatial information from the R environment and make it available to the processing engine. simplegeom is effectively a wrapper for the sp package’s SpatialPolygons class, but it also coerces a number of other types into this class. For example:

Points can be specified as longitude latitude pairs:

stencil <- simplegeom(c(-89, 45.43))

or as a data.frame:

stencil <- simplegeom(data.frame(
              'point1' = c(-89, 46), 
              'point2' = c(-88.6, 45.2)))

A SpatialPolygons object can also be used (example adapted from the sp package):

library(sp)
Sr1 = Polygon(cbind(c(2,4,4,1,2),c(2,3,5,4,2)))
Sr2 = Polygon(cbind(c(5,4,2,5),c(2,3,2,2)))
Sr3 = Polygon(cbind(c(4,4,5,10,4),c(5,3,2,5,5)))
Sr4 = Polygon(cbind(c(5,6,6,5,5),c(4,4,3,3,4)), hole = TRUE)

Srs1 = Polygons(list(Sr1), "s1")
Srs2 = Polygons(list(Sr2), "s2")
Srs3 = Polygons(list(Sr3, Sr4), "s3/4")
stencil <- simplegeom(Srl = list(Srs1,Srs2,Srs3), proj4string = CRS("+proj=longlat +datum=WGS84"))

webgeom object

The webgeom class is designed to hold references to web feature service (WFS) details and make it available to the processing engine.

Similar to webdata (see below), the webgeom class has public fields that can be set and accessed using simple methods. Public fields in webgeom:
* url: the WFS endpoint that hosts the geometries
* geom: the feature collection (layer) name
* attribute: the attribute of the feature collection used to select features
* values: the values of the chosen attribute that define the features of interest
* version: the WFS version used

To create a default webgeom object:

stencil <- webgeom()

The user-level information in webgeom is all available with the webgeom “show” method (or print).

stencil
## An object of class "webgeom":
## url: http://cida.usgs.gov/gdp/geoserver/wfs 
## geom: NA 
## attribute: NA 
## values: NA 
## version: 1.1.0

The public fields can be accessed by using the field name:

geom(stencil) <- "derivative:CONUS_States"
version(stencil)
## [1] "1.1.0"
attribute(stencil) <- "STATE"
values(stencil) <- c("Wisconsin","Maine")

quick access to web available data for webgeoms

There are some built-in webgeom templates that can be used to figure out the pattern, or simply to use these datasets for analysis. Currently, the package only supports US States, Level III Ecoregions, and HUC8s:

stencil <- webgeom('state::Wisconsin')
stencil
## An object of class "webgeom":
## url: http://cida.usgs.gov/gdp/geoserver/wfs 
## geom: derivative:CONUS_States 
## attribute: STATE 
## values: Wisconsin 
## version: 1.1.0
query(stencil, 'values')
##  [1] "Alabama"              "Arizona"             
##  [3] "Arkansas"             "California"          
##  [5] "Colorado"             "Connecticut"         
##  [7] "Delaware"             "District of Columbia"
##  [9] "Florida"              "Georgia"             
## [11] "Idaho"                "Illinois"            
## [13] "Indiana"              "Iowa"                
## [15] "Kansas"               "Kentucky"            
## [17] "Louisiana"            "Maine"               
## [19] "Maryland"             "Massachusetts"       
## [21] "Michigan"             "Minnesota"           
## [23] "Mississippi"          "Missouri"            
## [25] "Montana"              "Nebraska"            
## [27] "Nevada"               "New Hampshire"       
## [29] "New Jersey"           "New Mexico"          
## [31] "New York"             "North Carolina"      
## [33] "North Dakota"         "Ohio"                
## [35] "Oklahoma"             "Oregon"              
## [37] "Pennsylvania"         "Rhode Island"        
## [39] "South Carolina"       "South Dakota"        
## [41] "Tennessee"            "Texas"               
## [43] "Utah"                 "Vermont"             
## [45] "Virginia"             "Washington"          
## [47] "West Virginia"        "Wisconsin"           
## [49] "Wyoming"
webgeom('state::Wisconsin,Maine')
## An object of class "webgeom":
## url: http://cida.usgs.gov/gdp/geoserver/wfs 
## geom: derivative:CONUS_States 
## attribute: STATE 
## values: Wisconsin Maine 
## version: 1.1.0
webgeom('HUC8::09020306,14060009')
## An object of class "webgeom":
## url: http://cida.usgs.gov/gdp/geoserver/wfs 
## geom: derivative:wbdhu8_alb_simp 
## attribute: HUC_8 
## values: 09020306 14060009 
## version: 1.1.0
webgeom('ecoregion::Colorado Plateaus,Driftless Area')
## An object of class "webgeom":
## url: http://cida.usgs.gov/gdp/geoserver/wfs 
## geom: derivative:Level_III_Ecoregions 
## attribute: LEVEL3_NAM 
## values: Colorado Plateaus Driftless Area 
## version: 1.1.0
head(query(webgeom('ecoregion::Colorado Plateaus,Driftless Area'), 'values'), 10)
##  [1] "Ahklun And Kilbuck Mountains" 
##  [2] "Alaska Peninsula Mountains"   
##  [3] "Alaska Range"                 
##  [4] "Aleutian Islands"             
##  [5] "Arctic Coastal Plain"         
##  [6] "Arctic Foothills"             
##  [7] "Arizona/New Mexico Mountains" 
##  [8] "Arizona/New Mexico Plateau"   
##  [9] "Arkansas Valley"              
## [10] "Atlantic Coastal Pine Barrens"

query function for webgeom

The query function on webgeom can be used to find possible inputs for each public field (currently, all fields other than version and url):

query(stencil, 'geoms')
##  [1] "sample:Alaska"                  
##  [2] "sample:CONUS_Climate_Divisions" 
##  [3] "derivative:CONUS_States"        
##  [4] "sample:CONUS_states"            
##  [5] "sample:CSC_Boundaries"          
##  [6] "derivative:FWS_LCC"             
##  [7] "sample:FWS_LCC"                 
##  [8] "upload:GIS"                     
##  [9] "upload:GIS_trout"               
## [10] "derivative:wbdhu8_alb_simp"     
## [11] "derivative:Level_III_Ecoregions"
## [12] "derivative:NCA_Regions"         
## [13] "draw:Philadelphia_Airport"      
## [14] "draw:Philly_airport"            
## [15] "derivative:US_Counties"         
## [16] "upload:delin"                   
## [17] "draw:junk"                      
## [18] "sample:nps_boundary_2013"       
## [19] "sample:simplified_HUC8s"        
## [20] "draw:sq"                        
## [21] "upload:tl_f"
query(stencil, 'attributes')
## [1] "STATE"
query(stencil, 'values')
##  [1] "Alabama"              "Arizona"             
##  [3] "Arkansas"             "California"          
##  [5] "Colorado"             "Connecticut"         
##  [7] "Delaware"             "District of Columbia"
##  [9] "Florida"              "Georgia"             
## [11] "Idaho"                "Illinois"            
## [13] "Indiana"              "Iowa"                
## [15] "Kansas"               "Kentucky"            
## [17] "Louisiana"            "Maine"               
## [19] "Maryland"             "Massachusetts"       
## [21] "Michigan"             "Minnesota"           
## [23] "Mississippi"          "Missouri"            
## [25] "Montana"              "Nebraska"            
## [27] "Nevada"               "New Hampshire"       
## [29] "New Jersey"           "New Mexico"          
## [31] "New York"             "North Carolina"      
## [33] "North Dakota"         "Ohio"                
## [35] "Oklahoma"             "Oregon"              
## [37] "Pennsylvania"         "Rhode Island"        
## [39] "South Carolina"       "South Dakota"        
## [41] "Tennessee"            "Texas"               
## [43] "Utah"                 "Vermont"             
## [45] "Virginia"             "Washington"          
## [47] "West Virginia"        "Wisconsin"           
## [49] "Wyoming"

gridded data (fabric)

The fabric concept in geoknife represents the gridded dataset that will be operated on by the tool. fabric can be a time-varying dataset (such as PRISM) or a spatial snapshot coverage dataset (such as the NLCD). At present, fabric is limited to datasets that can be accessed using the OPeNDAP protocol or WMS (web map service). Most helper functions in geoknife, including query(fabric, 'variables'), tend to work better for OPeNDAP datasets.

webdata object

The webdata class holds all the important information for web datasets in order to make them available for processing by geoknife’s outsourced geoprocessing engine, the Geo Data Portal. Public fields in webdata:
* times: the start and end of the time range to process
* url: the OPeNDAP (or WMS) endpoint of the dataset
* variables: the dataset variables to process

To create a default webdata object:

fabric <- webdata()

The user-level information in webdata is all available with the webdata “show” method (or print).

fabric
## An object of class "webdata":
## times: NA NA 
## url: NA 
## variables: NA

The public fields can be accessed by using the field name:

times(fabric)
## [1] NA NA
url(fabric) <- 'http://cida.usgs.gov/thredds/dodsC/prism'
variables(fabric) <- 'ppt'

times(fabric)[1] <- as.POSIXct('1990-01-01')

quick access to web available data

The Geo Data Portal’s web data catalog is quite extensive, and includes many datasets that can all be processed with geoknife. Check it out at cida.usgs.gov/gdp. Even that catalog is not a complete list of all relevant datasets that can be accessed and processed. The geoknife package has a number of quick-access datasets built in (similar to the quick start webgeom objects).

An example of a quick start dataset:

fabric <- webdata('prism')
fabric
## An object of class "webdata":
## times: 1895-01-01 1899-01-01 
## url: http://cida.usgs.gov/thredds/dodsC/prism 
## variables: ppt

which can be a starting point for the PRISM dataset, as the fields can be modified:

times(fabric) <- c('1990-01-01','2010-01-01')
variables(fabric) <- c('ppt','tmx', 'tmn')
fabric
## An object of class "webdata":
## times: 1990-01-01 2010-01-01 
## url: http://cida.usgs.gov/thredds/dodsC/prism 
## variables: ppt tmx tmn

query function for webdata

The query function works on webdata, similar to how it works for webgeom objects. For the PRISM dataset specified above, the time range of the dataset can come from query with times:

variables(fabric) <- 'ppt'
query(fabric, 'times')
## [1] "1895-01-01 UTC" "2013-02-01 UTC"

likewise, variables with variables:

query(fabric, 'variables')
## [1] "ppt" "tmx" "tmn"

Note that a variable has to be specified to use the times query:

variables(fabric) <- NA

This will fail:

query(fabric, 'times')
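
If you want the query to fail gracefully in scripts, one option (a sketch, assuming the failure surfaces as an R error) is to wrap the call in tryCatch:

time_range <- tryCatch(query(fabric, 'times'),
                       error = function(e) {
                         message("times query failed; is a variable set on fabric?")
                         c(NA, NA)
                       })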

At present, the geoknife package does not have a query method for dataset urls.

knife object

The webprocess class holds all the important information for geoknife processing details for the outsourced geoprocessing engine, the Geo Data Portal. Public fields in webprocess:
* url: the web processing service endpoint that receives the processing request
* algorithm: the processing algorithm used
* version: the web processing service version used
* processInputs: the inputs to the algorithm, such as the statistics returned and output format
* wait: whether R should block until the process finishes
* email: an address to notify when the process is complete

query function for webprocess

The query function works on webprocess, similar to how it works for webgeom and webdata objects. For a default webprocess object, the available algorithms can be queried by:

knife <- webprocess()
query(knife, 'algorithms')
## $`WCS Subset`
## [1] "gov.usgs.cida.gdp.wps.algorithm.FeatureCoverageIntersectionAlgorithm"
## 
## $`OPeNDAP Subset`
## [1] "gov.usgs.cida.gdp.wps.algorithm.FeatureCoverageOPeNDAPIntersectionAlgorithm"
## 
## $`Categorical Coverage Fraction`
## [1] "gov.usgs.cida.gdp.wps.algorithm.FeatureCategoricalGridCoverageAlgorithm"
## 
## $`Area Grid Statistics (weighted)`
## [1] "gov.usgs.cida.gdp.wps.algorithm.FeatureWeightedGridStatisticsAlgorithm"
## 
## $`Area Grid Statistics (unweighted)`
## [1] "gov.usgs.cida.gdp.wps.algorithm.FeatureGridStatisticsAlgorithm"
## 
## $`PRMS Parameter Generator`
## [1] "gov.usgs.cida.gdp.wps.algorithm.PRMSParameterGeneratorAlgorithm"
## 
## $`A generalized daily climate statistics algorithm`
## [1] "org.n52.wps.server.r.gridded_daily"
## 
## $`A generalized bioclim algorithm`
## [1] "org.n52.wps.server.r.gridded_bioclim"

Changing the webprocess url will modify the endpoint for the query, and different algorithms may be available:

url(knife) <- 'http://cida-test.er.usgs.gov/gdp/process/WebProcessingService'
query(knife, 'algorithms')
## $`WCS Subset`
## [1] "gov.usgs.cida.gdp.wps.algorithm.FeatureCoverageIntersectionAlgorithm"
## 
## $`OPeNDAP Subset`
## [1] "gov.usgs.cida.gdp.wps.algorithm.FeatureCoverageOPeNDAPIntersectionAlgorithm"
## 
## $`Categorical Coverage Fraction`
## [1] "gov.usgs.cida.gdp.wps.algorithm.FeatureCategoricalGridCoverageAlgorithm"
## 
## $`Area Grid Statistics (weighted)`
## [1] "gov.usgs.cida.gdp.wps.algorithm.FeatureWeightedGridStatisticsAlgorithm"
## 
## $`Area Grid Statistics (unweighted)`
## [1] "gov.usgs.cida.gdp.wps.algorithm.FeatureGridStatisticsAlgorithm"
## 
## $`PRMS Parameter Generator`
## [1] "gov.usgs.cida.gdp.wps.algorithm.PRMSParameterGeneratorAlgorithm"
## 
## $`A generalized daily climate statistics algorithm`
## [1] "org.n52.wps.server.r.gridded_daily"
## 
## $`A generalized bioclim algorithm`
## [1] "org.n52.wps.server.r.gridded_bioclim"

algorithm

As noted above, the algorithm field in webprocess is a one-element named list, where the name is the human-readable algorithm title and the value is the algorithm identifier used by the server. To access or change the algorithm:

knife <- webprocess()
algorithm(knife)
## $`Area Grid Statistics (weighted)`
## [1] "gov.usgs.cida.gdp.wps.algorithm.FeatureWeightedGridStatisticsAlgorithm"
algorithm(knife) <- query(knife, 'algorithms')[1]
algorithm(knife)
## $`WCS Subset`
## [1] "gov.usgs.cida.gdp.wps.algorithm.FeatureCoverageIntersectionAlgorithm"
# -- or --
algorithm(knife) <- list('Area Grid Statistics (weighted)' = 
                           "gov.usgs.cida.gdp.wps.algorithm.FeatureWeightedGridStatisticsAlgorithm")

inputs

Getting and setting processInputs for geoknife is currently in development. Check back later.

url

The url field in webprocess can be accessed and set as expected:

url(knife)
## [1] "http://cida.usgs.gov/gdp/process/WebProcessingService"
url(knife) <- 'http://cida-test.er.usgs.gov/gdp/process/WebProcessingService'

wait

The wait boolean in webprocess can be set during creation:

knife <- webprocess(wait = TRUE)
knife
## An object of class "webprocess":
## url: http://cida.usgs.gov/gdp/process/WebProcessingService 
## algorithm: Area Grid Statistics (weighted) 
## version: 1.0.0 
## process inputs: 
##    TIME_START: NA
##    TIME_END: NA
##    SUMMARIZE_TIMESTEP: false
##    SUMMARIZE_FEATURE_ATTRIBUTE: false
##    FEATURE_ATTRIBUTE_NAME: 
##    DATASET_URI: 
##    DATASET_ID: 
##    REQUIRE_FULL_COVERAGE: true
##    STATISTICS: 
##    DELIMITER: COMMA
##    GROUP_BY: 
## wait: TRUE 
## email: NA

email

The email field in webprocess can be accessed and set as expected:

knife <- webprocess(email = 'fake.email@gmail.com')
knife
## An object of class "webprocess":
## url: http://cida.usgs.gov/gdp/process/WebProcessingService 
## algorithm: Area Grid Statistics (weighted) 
## version: 1.0.0 
## process inputs: 
##    TIME_START: NA
##    TIME_END: NA
##    SUMMARIZE_TIMESTEP: false
##    SUMMARIZE_FEATURE_ATTRIBUTE: false
##    FEATURE_ATTRIBUTE_NAME: 
##    DATASET_URI: 
##    DATASET_ID: 
##    REQUIRE_FULL_COVERAGE: true
##    STATISTICS: 
##    DELIMITER: COMMA
##    GROUP_BY: 
## wait: FALSE 
## email: fake.email@gmail.com

geojob details

The geojob in the geoknife package contains all of the processing configuration details required to execute a processing request to the Geo Data Portal and check up on the state of that request. A geojob object is created using the high-level function geoknife() with the stencil, fabric and optional knife arguments as described above.

geojob class and details

The geojob public fields include the job id (the URL used to check status and retrieve results, accessible with id()), along with the details of the processing request that created the job.

cancel geojob

The geoknife package currently limits users to a single running process at a time, so as to avoid creating thousands of requests in error, which could overwhelm the processing resources. If there is a reason to support additional simultaneous jobs, please email the package maintainers.
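
Given that limit, one practical pattern (a sketch using the running and cancel functions shown earlier) is to check whether the current geojob is still active before submitting another:

if (running(job)) {
  job <- cancel(job)  # or wait for the job to finish before submitting a new one
}
job <- geoknife(stencil, fabric)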

To cancel an existing job but retain the geojob details:

id(job)
## [1] "http://cida.usgs.gov:80/gdp/process/RetrieveResultServlet?id=6cf2e928-8a4a-4767-abc6-1ae3f70d207b"
job <- cancel(job)
id(job)
## [1] "<no active job>"

To cancel any running job without specifying the geojob reference:

cancel()

geoknife web resources

geoknife outsources all major geospatial processing tasks to a remote server. Because of this, users must have an active internet connection. Problems with connections to datasets or the processing resources are rare, but they do happen. When experiencing a connectivity problem, the best approach is often to try again later or email gdp@usgs.gov with any questions. The various web dependencies are described below.

Geo Data Portal

The U.S. Geological Survey’s “Geo Data Portal” (GDP) provides the data access and processing services that are leveraged by the geoknife package. See cida.usgs.gov/gdp for the GDP user interface.

CSW (catalog service for the web)

The Catalog Service for the Web (CSW) is an OGC standard for dataset metadata storage and access. The GDP maintains a catalog of datasets known to work with geoknife and the GDP, and that catalog can be queried directly, for example with the httr R package:

# 'request' is an XML request body for the CSW endpoint (an example is sketched below)
response <- httr::POST(url = 'http://cida.usgs.gov/gdp/geonetwork/srv/en/csw', 
                       body = request, httr::content_type_xml())
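
A minimal GetRecords body for that request could look like the following; this is a generic CSW 2.0.2 example for illustration, not something defined by the geoknife package:

request <- '<?xml version="1.0" encoding="UTF-8"?>
<csw:GetRecords xmlns:csw="http://www.opengis.net/cat/csw/2.0.2"
    service="CSW" version="2.0.2" resultType="results">
  <csw:Query typeNames="csw:Record">
    <csw:ElementSetName>summary</csw:ElementSetName>
  </csw:Query>
</csw:GetRecords>'

The parsed response (for example, httr::content(response)) then contains metadata records for the cataloged datasets.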

a more user-friendly dataset query function, which will return these datasets and their associated metadata, is currently in development.

Authors and Contributors

Jordan Read @jread-usgs