
mapedit - interactively edit spatial data in R

Mon, 01/30/2017 - 01:00

[view raw Rmd]

The R ecosystem offers a powerful set of packages for geospatial analysis. For a comprehensive list see the CRAN Task View: Analysis of Spatial Data. Yet, many geospatial workflows require interactivity for smooth uninterrupted completion. With new tools, such as htmlwidgets, shiny, and crosstalk, we can now inject this useful interactivity without leaving the R environment. In the first phase of the mapedit project, we have focused on experimenting and creating proof of concepts for the following three objectives:

  1. drawing, editing, and deleting features,

  2. selecting and querying of features and map regions,

  3. editing attributes.

Install mapedit

To run the code in the following discussion, please install with devtools::install_github. Please be aware that the current functionality is strictly a proof of concept, and the API will change rapidly and dramatically.

devtools::install_github("bhaskarvk/leaflet.extras")
devtools::install_github("r-spatial/mapedit")

Drawing, Editing, Deleting Features

We would like to set up an easy process for CRUD (create, read, update, and delete) of map features. The function edit_map demonstrates a first step toward this goal.

Proof of Concept 1 | Draw on Blank Map

To see how we might add some features, let’s start with a blank map, and then draw, edit, and delete with the Leaflet.Draw toolbar on the map. Once finished drawing, simply press “Done”.

library(leaflet)
library(mapedit)
what_we_created <- leaflet() %>%
  addTiles() %>%
  edit_map()

edit_map returns a list with drawn, edited, deleted, and finished features as GeoJSON. In this case, if we would like to see our finished creation we can focus on what_we_created$finished. Since this is GeoJSON, the easiest way to see what we just created is to use the addGeoJSON function from leaflet. This works well with polylines, polygons, rectangles, and points, but circles will be treated as points without some additional code. In future versions of the API it is likely that mapedit will return simple feature geometries rather than GeoJSON by default.
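
For a quick, purely illustrative look at which of those components are populated after a drawing session:

# drawn, edited, deleted and finished are the list elements described above
str(what_we_created, max.level = 1)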

leaflet() %>%
  addTiles() %>%
  addGeoJSON(what_we_created$finished)

Proof of Concept 2 | Edit and Delete Existing Features

As an extension of the first proof of concept, we might like to edit and/or delete existing features. Let’s play Donald Trump for this exercise and use the border between Mexico and the United States along California and Arizona. For the sake of the example, let’s use a simplified polyline as our border. As promised, we want to build a wall, but if we could just move the border a little in some places, we might be able to ease construction.

library(sf)
# simplified border for purpose of exercise
border <- st_as_sfc(
  "LINESTRING(-109.050197582692 31.3535554844322, -109.050197582692 31.3535554844322, -111.071681957692 31.3723176640684, -111.071681957692 31.3723176640684, -114.807033520192 32.509681296831, -114.807033520192 32.509681296831, -114.741115551442 32.750242384668, -114.741115551442 32.750242384668, -117.158107738942 32.5652527715121, -117.158107738942 32.5652527715121)"
) %>%
  st_set_crs(4326)
# plot quickly for visual inspection
plot(border)

Since we are Trump, we can do what we want, so let’s edit the line to our liking. We will use mapview for our interactive map since it by default gives us an OpenTopoMap layer and the develop branch includes preliminary simple features support. With our new border and fence, we will avoid the difficult mountains and get a little extra beachfront.

# use develop branch of mapview with simple features support
# devtools::install_github("environmentalinformatics-marburg/mapview@develop")
library(mapview)
new_borders <- mapview(border)@map %>%
  edit_map("border")

Now, we can quickly inspect our new borders and then send the coordinates to the wall construction company.

leaflet() %>%
  addTiles() %>%
  fitBounds(-120, 35, -104, 25) %>%
  addGeoJSON(new_borders$drawn)

Disclaimers

If you played enough with the border example, you might notice a couple of glitches and missing functionality. This is a good time for a reminder that this is alpha and intended as a proof of concept. Please provide feedback, so that we can ensure a quality final product. In this case, the older version of Leaflet.Draw in the RStudio Viewer has some bugs, so clicking an existing point creates a new one rather than allowing editing of that point. Also, the returned list from edit_map has no knowledge of the provided features.

Selecting Regions

The newest version of leaflet provides crosstalk support, but support is currently limited to addCircleMarkers. This functionality is enhanced by sf’s use of list columns and its integration with dplyr verbs. Here is a quick example with the breweries91 data from mapview.

library(crosstalk)
library(mapview)
library(sf)
library(shiny)
library(dplyr)

# convert breweries91 from mapview into simple features
# and add a century column that we will use for selection
brew_sf <- st_as_sf(breweries91) %>%
  mutate(century = floor(founded/100)*100) %>%
  filter(!is.na(century)) %>%
  mutate(id = 1:n())

pts <- SharedData$new(brew_sf, key = ~id, group = "grp1")

ui <- fluidPage(
  fluidRow(
    column(4, filter_slider(id = "filterselect", label = "Century Founded",
                            sharedData = pts, column = ~century, step = 50)),
    column(6, leafletOutput("leaflet1"))
  ),
  h4("Selected points"),
  verbatimTextOutput("selectedpoints")
)

server <- function(input, output, session) {
  # unfortunately, create SharedData again for scope
  pts <- SharedData$new(brew_sf, key = ~id, group = "grp1")

  lf <- leaflet(pts) %>%
    addTiles() %>%
    addMarkers()

  not_rendered <- TRUE
  # hack to only draw leaflet once
  output$leaflet1 <- renderLeaflet({
    if(req(not_rendered, cancelOutput = TRUE)) {
      not_rendered <- FALSE
      lf
    }
  })

  output$selectedpoints <- renderPrint({
    df <- pts$data(withSelection = TRUE)
    cat(nrow(df), "observation(s) selected\n\n")
    str(dplyr::glimpse(df))
  })
}

shinyApp(ui, server)

With mapedit, we would like to enhance the geospatial crosstalk integration to extend beyond leaflet::addCircleMarkers. In addition, we would like to provide an interactive interface to the geometric operations of sf, such as st_intersects(), st_difference(), and st_contains().

Proof of Concept 3

As a select/query proof of concept, assume we want to interactively select some US states for additional analysis. We will build off Bhaskar Karambelkar’s leaflet projection example using Bob Rudis’ albersusa package.

# use @bhaskarvk USA Albers with leaflet code
# https://bhaskarvk.github.io/leaflet/examples/proj4Leaflet.html
#devtools::install_github("hrbrmstr/albersusa")
library(albersusa)
library(sf)
library(leaflet)
library(mapedit)

spdf <- usa_composite() %>% st_as_sf()

pal <- colorNumeric(
  palette = "Blues",
  domain = spdf$pop_2014
)

bounds <- c(-125, 24, -75, 45)

(lf <- leaflet(
  options = leafletOptions(
    worldCopyJump = FALSE,
    crs = leafletCRS(
      crsClass = "L.Proj.CRS",
      code = 'EPSG:2163',
      proj4def = '+proj=laea +lat_0=45 +lon_0=-100 +x_0=0 +y_0=0 +a=6370997 +b=6370997 +units=m +no_defs',
      resolutions = c(65536, 32768, 16384, 8192, 4096, 2048, 1024, 512, 256, 128)
    ))) %>%
  fitBounds(bounds[1], bounds[2], bounds[3], bounds[4]) %>%
  setMaxBounds(bounds[1], bounds[2], bounds[3], bounds[4]) %>%
  addPolygons(
    data = spdf,
    weight = 1,
    color = "#000000",
    # adding group necessary for identification
    group = ~iso_3166_2,
    fillColor = ~pal(pop_2014),
    fillOpacity = 0.7,
    label = ~stringr::str_c(name, ' ', format(pop_2014, big.mark = ",")),
    labelOptions = labelOptions(direction = 'auto')#,
    #highlightOptions = highlightOptions(
    #  color = '#00ff00', bringToFront = TRUE, sendToBack = TRUE)
  )
)

# test out select_map with albers example
select_map(
  lf,
  style_false = list(weight = 1),
  style_true = list(weight = 4)
)

The select_map() function will return a data.frame with an id/group column and a selected column. select_map() will work with nearly all leaflet overlays and offers the ability to customize the styling of selected and unselected features.
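
A hedged sketch of how that return value might be used to subset the original polygons; the column names (group, selected) are taken from the description above and may differ in the actual return value:

sel <- select_map(lf, style_false = list(weight = 1), style_true = list(weight = 4))
# keep only the states that were toggled on the map (column names assumed, see above)
picked <- spdf[spdf$iso_3166_2 %in% sel$group[sel$selected], ]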

Editing Attributes

A common task in geospatial analysis involves editing or adding feature attributes. While much of this can be accomplished in the R console, an interactive UI on a reference map can often help perform this task. Mapbox’s geojson.io provides a good reference point for some of the features we would like to provide in mapedit.

Proof of Concept 4

As a proof of concept, we made a Shiny app that thinly wraps a slightly modified geojson.io. Currently, we will have to pretend that there is a mechanism to load R feature data onto the map, since this functionality does not yet exist.

library(shiny)
edited_features <- runGitHub(
  "geojson.io", "timelyportfolio", ref = "shiny"
)

Conclusion

mapedit hopes to add useful interactivity to your geospatial workflows by leveraging powerful new functionality in R with the interactivity of HTML, JavaScript, and CSS. mapedit will be better with your feedback, requests, bug reports, use cases, and participation. We will report on progress periodically with blog posts on this site, and we will develop openly on the mapedit Github repo.

sf - plot, graticule, transform, units, cast, is

Thu, 01/12/2017 - 01:00

[view raw Rmd]

This year began with the R Consortium blog on simple features:

#rstats A new post by Edzer Pebesma reviews the status of the R Consortium’s Simple Features project: https://t.co/W8YqH3WQVJ

— Joseph Rickert (@RStudioJoe) January 3, 2017

This blog post describes changes of sf 0.2-8 and upcoming 0.2-9, compared to 0.2-7, in more detail.

Direct linking to Proj.4

Since 0.2-8, sf links directly to the Proj.4 library:

library(sf)
## Linking to GEOS 3.5.0, GDAL 2.1.0, proj.4 4.9.2

before that, it would use the projection interface of GDAL, which uses Proj.4, but exposes only parts of it. The main reason for switching to Proj.4 is the ability to do stronger error checking. For instance, GDAL would interpret any unrecognized +datum field as WGS84:

# sf 0.2-7:
> st_crs("+proj=longlat +datum=NAD26")
$epsg
[1] NA

$proj4string
[1] "+proj=longlat +ellps=WGS84 +no_defs"

attr(,"class")
[1] "crs"

Now, with sf 0.2-8 we get a proper error in case of an unrecognized +datum field:

t = try(st_crs("+proj=longlat +datum=NAD26"))
attr(t, "condition")
## <simpleError in make_crs(x): invalid crs: +proj=longlat +datum=NAD26, reason: unknown elliptical parameter name>

plotting

The default plot method for sf objects (simple features with attributes, or data.frames with a simple feature geometry list-column) now plots the set of maps, one for each attribute, with automatic color scales:

nc = st_read(system.file("gpkg/nc.gpkg", package="sf"), quiet = TRUE) plot(nc)

well, that is all there is, basically. For plotting a single map, select the appropriate attribute

plot(nc["SID79"])

or only the geometry:

plot(st_geometry(nc))

graticules

Package sf gained a function st_graticule to generate graticules, grids formed by lines with constant longitude or latitude. Suppose we want to project nc to the state plane, and plot it with a longitude latitude graticule in NAD27 (the original datum of nc):

nc_sp = st_transform(nc["SID79"], 32119) # NC state plane, m plot(nc_sp, graticule = st_crs(nc), axes = TRUE)

The underlying function, st_graticule, can be used directly to generate a simple object with graticules, but is rather meant to be used by plotting functions that benefit from a graticule in the background, such as plot or ggplot. The function provides the end points of graticules and the angle at which they end; an example using Lambert equal area over the USA is found in the help page of st_graticule.
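
A minimal sketch of such a direct call (assuming st_graticule() accepts an sf object and returns the graticule as a simple feature object; the overlay plot is purely illustrative):

g = st_graticule(nc_sp)                 # graticule lines over the extent of nc_sp
plot(nc_sp, axes = TRUE)
plot(st_geometry(g), add = TRUE, col = 'grey')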

The default plotting method for simple features with longitude/latitude coordinates is the equirectangular projection (also called geographic projection, or equidistant cylindrical (eqc) projection), which linearly maps longitude and latitude into \(x\) and \(y\), transforming \(y\) such that in the center of the map 1 km easting equals 1 km northing. This is also the default for sp::plot, sp::spplot and ggplot2::coord_quickmap. The official Proj.4 transformation for this is found here.

We can obtain e.g. a plate carrée projection (with one degree latitude equaling one degree longitude) with

caree = st_crs("+proj=eqc") plot(st_transform(nc[1], caree), graticule = st_crs(nc), axes=TRUE, lon = -84:-76)

and we see indeed that the lon/lat grid is formed of squares.

The usual R plot for nc obtained by

plot(nc[1], graticule = st_crs(nc), axes = TRUE)

corrects for latitude. The equivalent, officially projected map is obtained by using the eqc projection with the correct latitude:

mean(st_bbox(nc)[c(2,4)])
## [1] 35.23582
eqc = st_crs("+proj=eqc +lat_ts=35.24")
plot(st_transform(nc[1], eqc), graticule = st_crs(nc), axes = TRUE)

so that in the center of these (identical) maps, 1 km east equals 1 km north.

geosphere and units support

sf now uses functions in package geosphere to compute distances or areas on the sphere. This is only possible for points and not for arbitrary feature geometries:

centr = st_centroid(nc)
## Warning in st_centroid.sfc(st_geometry(x)): st_centroid does not give
## correct centroids for longitude/latitude data
st_distance(centr[c(1,10)])[1,2]
## 34093.21 m

As a comparison, we can compute distances in two similar projections, each having a different measurement unit:

centr.sp = st_transform(centr, 32119) # NC state plane, m
(m <- st_distance(centr.sp[c(1,10)])[1,2])
## 34097.54 m
centr.ft = st_transform(centr, 2264) # NC state plane, US feet
(ft <- st_distance(centr.ft[c(1,10)])[1,2])
## 111868.3 US_survey_foot

and we see that the units are reported, by using package units. To verify that the distances are equivalent, we can compute

ft/m ## 1 1

which does automatic unit conversion before computing the ratio. (Here, 1 1 should be read as one, unitless (with unit 1)).
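
The same conversion can also be made explicit; a small sketch reusing the m and ft objects created above, and relying on the units()<- replacement method for conversion (the same idiom used further down in this post):

m_in_ft <- m
units(m_in_ft) <- units(ft)  # convert the metre-valued distance to US survey feet
m_in_ft / ft                 # ratio of two distances with identical units: ~1, unitless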

For spherical distances, sf uses geosphere::distGeo. It passes on the parameters of the datum, as can be seen from

st_distance(centr[c(1,10)])[1,2] # NAD27
## 34093.21 m
st_distance(st_transform(centr, 4326)[c(1,10)])[1,2] # WGS84
## 34094.28 m

Other measures come with units too, e.g. st_area

st_area(nc[1:5,])
## Units: m^2
## [1] 1137388604  611077263 1423489919  694546292 1520740530

units vectors can be coerced to numeric by

as.numeric(st_area(nc[1:5,]))
## [1] 1137388604  611077263 1423489919  694546292 1520740530

type casting

With help from Mike Sumner and Etienne Racine, we managed to get a working st_cast, which helps converting one geometry type into another.

casting individual geometries (sfg)

Casting individual geometries will close polygons when needed:

st_point(c(0,1)) %>% st_cast("MULTIPOINT")
## MULTIPOINT(0 1)
st_linestring(rbind(c(0,1), c(5,6))) %>% st_cast("MULTILINESTRING")
## MULTILINESTRING((0 1, 5 6))
st_linestring(rbind(c(0,0), c(1,0), c(1,1))) %>% st_cast("POLYGON")
## POLYGON((0 0, 1 0, 1 1, 0 0))

and will warn on loss of information:

st_linestring(rbind(c(0,1), c(5,6))) %>% st_cast("POINT")
## Warning in st_cast.LINESTRING(., "POINT"): point from first coordinate only
## POINT(0 1)
st_multilinestring(list(matrix(1:4,2), matrix(1:6,,2))) %>% st_cast("LINESTRING")
## Warning in st_cast.MULTILINESTRING(., "LINESTRING"): keeping first
## linestring only
## LINESTRING(1 3, 2 4)

casting sets of geometries (sfc)

Casting sfc objects can group or ungroup geometries:

# group: st_sfc(st_point(0:1), st_point(2:3), st_point(4:5)) %>% st_cast("MULTIPOINT", ids = c(1,1,2)) ## Geometry set for 2 features ## geometry type: MULTIPOINT ## dimension: XY ## bbox: xmin: 0 ymin: 1 xmax: 4 ymax: 5 ## epsg (SRID): NA ## proj4string: NA ## MULTIPOINT(0 1, 2 3) ## MULTIPOINT(4 5) # ungroup: st_sfc(st_multipoint(matrix(1:4,,2))) %>% st_cast("POINT") ## Geometry set for 2 features ## geometry type: POINT ## dimension: XY ## bbox: xmin: 1 ymin: 3 xmax: 2 ymax: 4 ## epsg (SRID): NA ## proj4string: NA ## POINT(1 3) ## POINT(2 4)

st_cast with no to argument will convert mixes of GEOM and MULTIGEOM to MULTIGEOM, where GEOM is POINT, LINESTRING or POLYGON, e.g.

st_sfc( st_multilinestring(list(matrix(5:8,,2))), st_linestring(matrix(1:4,2)) ) %>% st_cast() ## Geometry set for 2 features ## geometry type: MULTILINESTRING ## dimension: XY ## bbox: xmin: 1 ymin: 3 xmax: 6 ymax: 8 ## epsg (SRID): NA ## proj4string: NA ## MULTILINESTRING((5 7, 6 8)) ## MULTILINESTRING((1 3, 2 4))

or unpack geometry collections:

x <- st_sfc(
  st_multilinestring(list(matrix(5:8,,2))),
  st_point(c(2,3))
) %>% st_cast("GEOMETRYCOLLECTION")
x
## Geometry set for 2 features
## geometry type: GEOMETRYCOLLECTION
## dimension: XY
## bbox: xmin: 2 ymin: 3 xmax: 6 ymax: 8
## epsg (SRID): NA
## proj4string: NA
## GEOMETRYCOLLECTION(MULTILINESTRING((5 7, 6 8)))
## GEOMETRYCOLLECTION(POINT(2 3))
x %>% st_cast()
## Geometry set for 2 features
## geometry type: GEOMETRY
## dimension: XY
## bbox: xmin: 2 ymin: 3 xmax: 6 ymax: 8
## epsg (SRID): NA
## proj4string: NA
## MULTILINESTRING((5 7, 6 8))
## POINT(2 3)

casting on sf objects

The casting of sf objects works in principle identical, except that for ungrouping, attributes are repeated (and might give rise to warning messages),

# ungroup: st_sf(a = 1, geom = st_sfc(st_multipoint(matrix(1:4,,2)))) %>% st_cast("POINT") ## Warning in st_cast.sf(., "POINT"): repeating attributes for all sub- ## geometries for which they may not be constant ## Simple feature collection with 2 features and 1 field ## geometry type: POINT ## dimension: XY ## bbox: xmin: 1 ymin: 3 xmax: 2 ymax: 4 ## epsg (SRID): NA ## proj4string: NA ## c.1..1. geom ## 1 1 POINT(1 3) ## 2 1 POINT(2 4)

and for grouping, attributes are aggregated, which requires an aggregation function

# group:
st_sf(a = 1:3, geom = st_sfc(st_point(0:1), st_point(2:3), st_point(4:5))) %>%
  st_cast("MULTIPOINT", ids = c(1,1,2), FUN = mean)
## Simple feature collection with 2 features and 2 fields
## geometry type: MULTIPOINT
## dimension: XY
## bbox: xmin: 0 ymin: 1 xmax: 4 ymax: 5
## epsg (SRID): NA
## proj4string: NA
##   ids.group   a                 geom
## 1         1 1.5 MULTIPOINT(0 1, 2 3)
## 2         2   3      MULTIPOINT(4 5)

type selection

In case we have a mix of geometry types, we can select those of a particular geometry type by the new helper function st_is. As an example we create a mix of polygons, lines and points:

g = st_makegrid(n=c(2,2), offset = c(0,0), cellsize = c(2,2)) s = st_sfc(st_polygon(list(rbind(c(1,1), c(2,1),c(2,2),c(1,2),c(1,1))))) i = st_intersection(st_sf(a=1:4, geom = g), st_sf(b = 2, geom = s)) ## Warning in st_intersection(st_sf(a = 1:4, geom = g), st_sf(b = 2, geom = ## s)): attribute variables are assumed to be spatially constant throughout ## all geometries i ## Simple feature collection with 4 features and 2 fields ## geometry type: GEOMETRY ## dimension: XY ## bbox: xmin: 1 ymin: 1 xmax: 2 ymax: 2 ## epsg (SRID): NA ## proj4string: NA ## a b geometry ## 1 1 2 POLYGON((2 2, 2 1, 1 1, 1 2... ## 2 2 2 LINESTRING(2 2, 2 1) ## 3 3 2 LINESTRING(1 2, 2 2) ## 4 4 2 POINT(2 2)

and can select using dplyr::filter, or directly using st_is:

filter(i, st_is(geometry, c("POINT"))) ## Simple feature collection with 1 feature and 2 fields ## geometry type: GEOMETRY ## dimension: XY ## bbox: xmin: 1 ymin: 1 xmax: 2 ymax: 2 ## epsg (SRID): NA ## proj4string: NA ## a b geometry ## 1 4 2 POINT(2 2) filter(i, st_is(geometry, c("POINT", "LINESTRING"))) ## Simple feature collection with 3 features and 2 fields ## geometry type: GEOMETRY ## dimension: XY ## bbox: xmin: 1 ymin: 1 xmax: 2 ymax: 2 ## epsg (SRID): NA ## proj4string: NA ## a b geometry ## 1 2 2 LINESTRING(2 2, 2 1) ## 2 3 2 LINESTRING(1 2, 2 2) ## 3 4 2 POINT(2 2) st_is(i, c("POINT", "LINESTRING")) ## [1] FALSE TRUE TRUE TRUE

OpenEO: a GDAL for Earth Observation Analytics

Tue, 11/29/2016 - 01:00

Earth observation data, or satellite imagery, is one of the richest sources for finding out how our Earth is changing. The amount of Earth observation data we collect today has become too large to analyze on a single computer. Although most Earth observation data is available for free, the practical difficulties we currently face when we try to analyze it seriously constrain the potential benefits for citizens, industry, scientists, or society. How did we get here?

GIS: the 80’s

To understand the current difficulty of analyzing big Earth observation data, let us look at how geographic information systems (GIS) developed over the past decades. In the early days, they would be isolated structures:

where one would get things done in isolation, without any chance of verifying or comparing it with another system: these were expensive systems, hard to set up and maintain, and (with the exception of GRASS) closed databases and closed source software.

File formats: the 90’s

In the 90’s, file formats came up: several systems started supporting various file formats, and dedicated programs that would do certain file format conversions became available. This made many new things possible, such as the integration of S-Plus with Arc-Info, but to fully realize this, each software would have to implement drivers for every file format. Developing new applications or introducing a new file format are both difficult in this model.

GDAL: the 00’s

Then came GDAL! This Geospatial Data Abstraction Layer is a software library that reads and writes raster and vector data. Instead of having to write drivers for each file format, application developers needed to only write a GDAL client driver. When proposing a new file format, instead of having to convince many application developers to support it, only a GDAL driver for the new format was required to realize quick adoption. Instead of many-to-many links, only many-to-one links were needed:

R and python suddenly became strong GIS and spatial modelling tools. ArcGIS users could suddenly deal with the weird data formats from hydrologists, meteorologists, and climate scientists. Developing innovative applications and introducing new file formats became attractive.
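
From R, the practical effect of this many-to-one model is that one reading function serves every format GDAL has a driver for; a small illustration (the file and layer names below are hypothetical):

library(rgdal)
ogrDrivers()$name[1:5]                       # a few of the vector drivers GDAL provides
shp  <- readOGR("parcels_dir", "parcels")    # ESRI Shapefile driver
gpkg <- readOGR("parcels.gpkg", "parcels")   # GeoPackage driver, same call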

For analyzing big Earth observation data, GDAL has its limitations, including:

  • it has weak data semantics: the data model does not include observation time or band wavelength (color), but addresses bands by dataset name and band number (see the sketch after this list),
  • raster datasets cannot tell whether pixels refer to points, cells with constant value, or cells with an aggregated value; most regridding methods silently assume the least likely option (points),
  • the library cannot do much processing, meaning that clients that do the processing and use GDAL for reading and writing need to be close to where the data is, which is far away from the user’s computer.
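
As a small sketch of the first limitation above: a GDAL-based reader addresses a band only by its number, and any meaning (time, wavelength) has to be supplied by the user. The file name and band interpretation below are hypothetical:

library(rgdal)
nir <- readGDAL("landsat_scene.tif", band = 4)  # nothing in the call says band 4 is near-infrared
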
Big Earth Observation data: the 10’s

We currently see a plethora of cloud environments that all try to solve the problem of how to effectively analyze Earth observation data that are too large to download and process locally. This is very reminiscent of the isolated GIS of the 80’s, in the sense that strongly differing systems have been built with large effort, and having solved a certain problem in one system, it is practically impossible to solve it in another system too. Several systems work with data processing back-ends that are not open source. The following figure shows only a few systems, for illustration:

Open Earth Observation Science

For open science, open data is a necessary but not sufficient condition. In order to fight the reproducibility crisis, analysis procedures need to be fully transparent and reproducible. To get there, it needs to be simple to execute a given analysis on two different data centers to verify that they yield identical results. Today, this sounds rather utopian. On the other hand, we believe that one of the reasons data science has taken off is that the software making it possible (e.g. python, R, spark) was able to hide irrelevant details and had reached a stage where it was transparent, stable, and robust.

Google Earth Engine is an excellent example of an interface that is simple, in the sense that users can directly address sensors and observation times and work with composites rather than having to comb through the raw data consisting of large collections of files (“scenes”). The computational back-end, however, is not transparent: it lets the user execute functions as far as they are provided by the API, but not inspect the source code of these functions, or modify them.

How then can we make progress out of a world of currently incompatible, isolated Earth Observation data center efforts? Learning from the past, the solution might be a central interface between users who are allowed to think in high-level objects (“Landsat 7 over Western Europe, 2005-2015”) and a set of computational Earth Observation data centers that agree on supporting this interface. A GDAL for Earth Observation Analytics, so to speak. We’ll call it OpenEO.

Open EO Interface: the 20’s

The following figure shows schematically what this could look like: users write scripts in a neutral language that addresses the data and operations, but ignores the computer architecture of the back-end. The back-end carries out the computations and returns results. It needs to be able to identify users (e.g. verify their permission to use it), but also to tell which data it provides and which computational operations it offers.

In more detail, and following the GDAL model closely, the architecture consists of

  • a client and back-end neutral set of API’s for both sides, defining which functions users can call, and which functions back-ends have to obey to,
  • a driver for each back-end that translates the neutral requirements into the platform specific offerings, or information that a certain function is not available (e.g. creating new datasets in a write-only back-end)
  • a driver for each client that binds the interfaces to the particular front-end such as R or python

With an architecture like this, it would not only become easy and attractive for users to switch from one back-end to another, it would also become much easier to choose between back-ends, because the value propositions of the back-end providers would now be comparable, on paper as well as in practice.

A rough idea of the architecture is shown in this figure:

One of the drivers will interface collections of scenes in a back-end, or on the local hard drive. GDAL will continue to play an important role in back-ends, if the back-end uses files in a format supported by GDAL, and in the front-end, when the (small) results of big computations are fetched.

The way forward

We, as the authors of this piece, have started to work on these ideas in current activities, projects, and proposals. We are planning to submit a proposal to the EC H2020 call EO-2-2017: EO Big Data Shift. We hope that the call writers (and the reviewers) have the same problem in mind as what we explain above.

We are publishing this article because we believe this work has to be done, and only a concerted effort can realize it. When you agree and would like to participate, please get in touch with us. When successful, it will in the longer run benefit science, industry, and all of us.


Simple features now on CRAN

Wed, 11/02/2016 - 01:00

Submitting a package to CRAN is always an adventure, and submitting a package with lots of external dependencies even more so. A week ago I submitted the simple features for R package to CRAN, and indeed, hell broke loose! Luckily, the people behind CRAN are extremely patient, friendly and helpful, but they test your code on a big server farm with machines in 13 different flavors.

Of course we test code on linux and windows after every code push to github, but that feels like talking to a machine, like remotely compiling and testing. CRAN feels different: you first need to manually confirm that you did your utmost best to solve problems, and then there is a person telling you everything still remaining! Of course, this is of incredible help, and a big factor in the R community’s sustainability.

Package sf is somewhat special in that it links to GEOS and GDAL, and in particular GDAL links, depending on how it is installed, to many (77 in my case) other libraries, each with their own versions. After first submission of sf 0.2-0, I ran into the following issues with my code.

sf 0.2-0
  • I had to change all links starting with http://cran.r-project.org/web/packages/pkg into https://cran.r-project.org/packages=pkg. A direct link to a units vignette on CRAN had to be removed.
  • some of the tests gave very different output, because my default testing platforms (laptop, travis) have PostGIS and the CRAN machines don’t; I changed this so that testing without PostGIS (as on CRAN) is now mostly silent
  • the tests still output differences in GDAL and GEOS versions, but that was considered OK.

That was it! The good message

Thanks, on CRAN now. Best -k

arrived! Party time! Too early. In the evening (CRAN never sleeps) an email arrived, mentioning:

This worked for my incoming checks, but just failed for the regular checks, with

Error in loadNamespace(name) : there is no package called 'roxygen2'
Calls: :: ... tryCatch -> tryCatchList -> tryCatchOne -> <Anonymous>
Execution halted
For some reason roxygen2 is not working. Is it installed?
ERROR: configuration failed for package ‘sf’

with lots of helpful hints. Indeed, my package generated Rcpp files and manual pages dynamically during install; this requires Rcpp and roxygen2 to be available unconditionally and they aren’t.

So I sat down and worked on 0.2-1, to address this. Before I could do that, an email from Oxford (Brian Ripley) arrived, telling me that sf had caused some excitement in the multi-flavor server farm:

Here, it should be noted (again) that the only two NOTEs were due to the excellent work of Jeroen Ooms who compiled GDAL and many other libraries for rwinlib, and prepared sf for downloading and using them. The rest was my contribution.

In addition, an issue was raised by Dirk Eddelbuettel, telling me that his Rcpp reverse-dependency check farm had shown that sf required GDAL 2.0 or later, but that it signalled this not by properly checking the version, but by generating plain compile errors. The horror, the horror.

sf 0.2-1: roxygen, Rcpp, SQLITE on Solaris

sf 0.2-1 tried to address the Rcpp and roxygen2 problems: I took their generation out of the configure and configure.win scripts. I added all automatically derived files to the github repo, to get everything in sync. Worked:

Thanks, on CRAN now. [Tonight I'll know for sure ...] Best -k

… and no emails in the evening.

Also, the errors on the Solaris platforms were caused by the SQLITE library not being present, hence GeoPackage not being available as a GDAL driver. As a consequence, I had to revert the examples reading a GeoPackage polygons file to ones reading a shapefile. Bummer.
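
The kind of example that stays portable across all CRAN platforms is reading the nc shapefile that ships with sf, since it needs no SQLITE/GeoPackage driver; a minimal sketch:

library(sf)
nc <- st_read(system.file("shape/nc.shp", package = "sf"))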

Another issue was related to relying on GDAL 2.1 features without testing for it; this was rather easily solved by conditional compiling.

This gave:

meaning SOME improvement, but where do these UBSAN reports suddenly come from!?

sf 0.2-2: byte swapping, memory alignment

Over-optimistic as I am, I had commented that life is too short to do byte swapping when reading WKB. Welcome to the CRAN server farm: Solaris-Sparc is big-endian. Although the R code reading WKB does read non-native endianness, using it would have required rewriting all tests, so I added byte swapping to the C++ code, using a helpful SO post.

The UBSAN issues all listed something like:

UBSAN: don't assume that pointers can point anywhere to, in valid memory;

wkb.cpp:185:39: runtime error: load of misaligned address 0x6150001d0e29 for type 'uint32_t', which requires 4 byte alignment
0x6150001d0e29: note: pointer points here
00 00 00 01 06 00 00 00 01 00 00 00 01 03 00 00 00 01 00 00 00 1b 00 00 00 00 00 00 a0 41 5e 54

Sounds scary? What I had done was, for every coordinate, point a double pointer to the right place in the WKB byte stream, copy its value, and move it 8 bytes. I love this *d++ expression. But you can’t do this anymore! Although the code worked on my machines, you can’t point a double pointer at an arbitrary location and assume it’ll work everywhere. The solution was to memcpy the relevant bytes to a double value on the stack, and copy that into the Rcpp::NumericVector.

To be done:

All these changes have brought me here:

where you see that linux and Windows compile (all NOTEs indicate that the library is too large, which is unavoidable with GDAL) and that errors are Mac related:

r-devel-macos-x86_64-clang

** testing if installed package can be loaded
Error in dyn.load(file, DLLpath = DLLpath, ...) :
  unable to load shared object '/Users/ripley/R/packages/tests-devel/sf.Rcheck/sf/libs/sf.so':
  dlopen(/Users/ripley/R/packages/tests-devel/sf.Rcheck/sf/libs/sf.so, 6): Symbol not found: _H5T_NATIVE_DOUBLE_g
  Referenced from: /Users/ripley/R/packages/tests-devel/sf.Rcheck/sf/libs/sf.so
  Expected in: flat namespace
  in /Users/ripley/R/packages/tests-devel/sf.Rcheck/sf/libs/sf.so

which indicates a problem with GDAL not linking to the HDF5 library (unsolved).

The second:

r-release-osx-x86_64-mavericks

checking gdal-config usability... yes
configure: GDAL: 1.11.4
checking GDAL version >= 2.0.0... yes

indicates that

  • this platform still runs GDAL 1.x, so needs to be upgraded and that
  • my check for GDAL version present on the system still does not work!
CRAN flavors

The CRAN flavors are a great asset that teach you about problems of all kinds at an early stage. Without them, users would at some stage have run into problems that are now caught up front. Thanks to the tremendous effort of the CRAN team!!

UPDATE, Dec 21, 2016
  • Thanks to Roger Bivand, Simon Urbanek and Brian Ripley for constructive help, the MacOSX-mavericks binary build is now on CRAN, the issue is described here

Automatic units in axis labels

Thu, 09/29/2016 - 02:00

This blog post concerns the development version of units, installed by

devtools::install_github("edzer/units")

[view raw Rmd]

Have you ever tried to properly add measurement units to R plots? It might go like this:

xlab = parse(text = "temperature ~~ group('[', degree * C, ']')")
ylab = parse(text = "speed ~~ group('[', m * ~~ s^-1, ']')")
par(mar = par("mar") + c(0, .3, 0, 0)) # avoids cutting off the superscript
plot(3 + 1:10 + 2 * rnorm(10), xlab = xlab, ylab = ylab)

The main observation is, of course, that it can be done. However,

  • it looks geeky, and not quite intuitive
  • you would typically postpone this work to just before submitting the paper, or during review
  • you need this so infrequently that you tend to forget how it works.

Although well-written help is found in ?plotmath, all three observations cause frustration.

The original paper describing plotmath is by Paul Murrell and Ross Ihaka. R core member Paul Murrell also wrote package grid, part of base R. Few people use it directly, but without it ggplot2 or lattice could not exist.

Automatic unit handling

The new units CRAN package now makes working with units

  • easier
  • automatic, and
  • less error-prone

Here is an example using mtcars. First, we define the imperial units, using those known in the udunits2 database:

library(units)
gallon = make_unit("gallon")
consumption = mtcars$mpg * with(ud_units, mi/gallon)
displacement = mtcars$disp * ud_units[["in"]]^3

For displacement, we cannot use the normal lookup in the database

displacement = mtcars$disp * with(ud_units, in)

because in (inch) is also a reserved word in R.

We convert these values to SI units by

units(displacement) = with(ud_units, cm^3)
units(consumption) = with(ud_units, km/l)
consumption[1:5]
## Units: km/l
## [1] 8.928017 8.928017 9.693276 9.098075 7.950187

Automatic measurement units in axis labels

We can plot these numeric variables of type units by

par(mar = par("mar") + c(0, .1, 0, 0)) # avoids cutting off brackets at the lhs
plot(displacement, consumption)

The units automatically appear in axis labels! If we want to have negative power instead of division bars, we can set a global option

units_options(negative_power = TRUE) # division becomes ^-1

Expressions such as

1/displacement[1:10]
## Units: cm^-3
##  [1] 0.0003813984 0.0003813984 0.0005650347 0.0002365261 0.0001695104
##  [6] 0.0002712166 0.0001695104 0.0004159764 0.0004334073 0.0003641035

automatically convert units, which also happens in plots (note the converted units symbols):

par(mar = par("mar") + c(0, .3, 0, 0))
plot(1/displacement, 1/consumption)

How to do this with ggplot?

We can of course plot these data by dropping units:

library(ggplot2) ggplot() + geom_point(aes(x = as.numeric(displacement), y = as.numeric(consumption)))

but that doesn’t show us units. Giving the units as variables gives an error:

ggplot() + geom_point(aes(x = displacement, y = consumption)) ## Don't know how to automatically pick scale for object of type units. Defaulting to continuous. ## Don't know how to automatically pick scale for object of type units. Defaulting to continuous. ## Error in Ops.units(x, range[1]): both operands of the expression should be "units" objects

(I could make that error go away by letting units drop the requirement that in a comparison both sides should have compatible units, which of course would be wrong.)

We can then go all the way with

ggplot() + geom_point(aes(x = as.numeric(displacement), y = as.numeric(consumption))) + xlab(make_unit_label("displacement", displacement)) + ylab(make_unit_label("consumption", consumption))

which at least doesn’t cut off the left label, but feels too convoluted and error-prone.

Oh ggplot gurus, who can help us out, here? How can we obtain that last plot by

ggplot() + geom_point(aes(x = displacement, y = consumption))

?

Update of Dec 2, 2016

Thanks to ggguru Thomas Lin Pedersen, automatic units in axis labels of ggplots are now provided by CRAN package ggforce:

library(ggforce) ggplot() + geom_point(aes(x = displacement, y = consumption))

and see this vignette for more examples. In addition to printing units in default axes labels, it allows for on-the-fly unit conversion in ggplot expressions:

dm = with(ud_units, dm)
gallon = with(ud_units, gallon)
mi = with(ud_units, mi)
ggplot() + geom_point(aes(x = displacement, y = consumption)) +
  scale_x_unit(unit = dm^3) + scale_y_unit(unit = mi/gallon)

Related posts/articles

The future of R spatial

Mon, 09/26/2016 - 10:00

Last week’s geostat summer school in Albacete was a lot of fun, with about 60 participants and 10 lecturers. Various courses were given on handling, analyzing and modelling spatial and spatiotemporal data, using open source software. Participants came from all kinds of directions, not only geosciences but also anthropology, epidemiology and, surprisingly, many from biology and ecology. Tom Hengl invited us to discuss the future of spatial and spatiotemporal analysis on day 2:

@edzerpebesma talking about the future of spatial and spatiotemporal analysis at #geostat2016 @uclm_inter pic.twitter.com/yfQL2vb5ii

— Rubén G. Mateo (@RubenGMateo) September 20, 2016

In the background of the screen, you see the first appveyor (= windows) build of sf, the simple features for R package. It means that thanks to Jeroen Ooms and rwinlib, windows users can now build binary packages that link to GDAL 2.1, GEOS and Proj.4:

Windows users with Rtools installed can now build and install sfr. Opens the way for others to directly Rcpp into gdal2. Ta2 @opencpu !

— Edzer Pebesma (@edzerpebesma) September 21, 2016

Thanks to the efficient well-known-binary interface of sf, and thanks to using C++ and Rcpp, compared to sp the sf package now reads large feature sets much (18 x) faster into much (4 x) smaller objects (benchmark shapefile provided by Robin Lovelace):

> system.time(r <- rgdal::readOGR(".", "gis.osm_buildings_v06"))
OGR data source with driver: ESRI Shapefile
Source: ".", layer: "gis.osm_buildings_v06"
with 487576 features
It has 6 fields
   user  system elapsed
 90.312   0.744  91.053
> object.size(r)
1556312104 bytes
> system.time(s <- sf::st_read(".", "gis.osm_buildings_v06"))
Reading layer gis.osm_buildings_v06 from data source . using driver "ESRI Shapefile"
features: 487576
fields: 6
converted into: MULTIPOLYGON
proj4string: +proj=longlat +datum=WGS84 +no_defs
   user  system elapsed
  5.100   0.092   5.191
> object.size(s)
410306448 bytes

Raster data

Currently, R package raster is gradually being ported to C++ for efficiency reasons. For reading and writing data through GDAL, it uses rgdal, so when going through a big (cached) raster in C++ it has to go through C++ \(\rightarrow\) R \(\rightarrow\) rgdal \(\rightarrow\) R \(\rightarrow\) C++ for every chunk of data. The current set of raster classes

library(raster)
Loading required package: sp
> showClass("Raster")
Virtual Class "Raster" [package "raster"]

Slots:

Name:      title    extent   rotated  rotation     ncols     nrows       crs
Class: character    Extent   logical .Rotation   integer   integer       CRS

Name:    history         z
Class:      list      list

Extends: "BasicRaster"

Known Subclasses:
Class "RasterLayer", directly
Class "RasterBrick", directly
Class "RasterStack", directly
Class ".RasterQuad", directly
Class "RasterLayerSparse", by class "RasterLayer", distance 2
Class ".RasterBrickSparse", by class "RasterBrick", distance 2

has grown somewhat ad hoc, and should be replaced by a single class that supports

  • one or more layers (bands, attributes)
  • time as a dimension
  • altitude or depth as a dimension (possibly expressed as pressure level)
The future

So, what does the future of R spatial look like?

  1. vector data use simple features, now in package sf
  2. raster data get a single, flexible class that generalizes all Raster* classes now in raster and integrates with simple features
  3. vector and raster data share a clear and consistent interface, no more conflicting function names
  4. raster computing directly links to GDAL, but supports distributed computing back ends provided e.g. by SciDB, Google Earth Engine or rasdaman
  5. spatiotemporal classes in spacetime and trajectories build on simple features or raster
  6. support for measurement units
  7. support for strong typing that encourages meaningful computation.

Exciting times are ahead of us. We need your help!

Reading well-known-binary into R

Thu, 09/01/2016 - 11:00

This blog post describes ways to read binary simple feature data into R, and compares them.

WKB (well-known binary) is the (ISO) standard binary serialization for simple features. You often see it printed in hexadecimal notation, e.g. in spatially extended databases such as PostGIS:

postgis=# SELECT 'POINT(1 2)'::geometry;
                  geometry
--------------------------------------------
 0101000000000000000000F03F0000000000000040
(1 row)

where the alternative form is the human-readable text (Well-known text) form:

postgis=# SELECT ST_AsText('POINT(1 2)'::geometry);
 st_astext
------------
 POINT(1 2)
(1 row)

In fact, the WKB is the way databases store features in BLOBs (binary large objects). This means that, unlike well-known text, reading well-known binary involves

  • no loss of precision caused by text <–> binary conversion,
  • no conversion of data needed at all (provided the endianness is native)

As a consequence, it should be possible to do this blazingly fast. Also with R? And large data sets?
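
As a small illustration, the hexadecimal PostGIS output above can be decoded with a few lines of base R; the byte offsets assume the little-endian POINT layout shown there:

wkb_hex <- "0101000000000000000000F03F0000000000000040"
bytes <- as.raw(strtoi(substring(wkb_hex, seq(1, nchar(wkb_hex), 2),
                                 seq(2, nchar(wkb_hex), 2)), 16L))
# byte 1: endianness flag, bytes 2-5: geometry type, bytes 6-21: the two coordinate doubles
readBin(bytes[6:21], "double", n = 2, size = 8, endian = "little")
## [1] 1 2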

Three software scenarios

I compared three software implementations:

  1. sf::st_as_sfc (of package sf) using C++ to read WKB
  2. sf::st_as_sfc (of package sf) using pure R to read WKB (but C++ to compute bounding box)
  3. wkb::readWKB (of package wkb) using pure R to read features into sp-compatible objects

Note that the results below were obtained after profiling, and implementing expensive parts in C++.

Three geometries

I created three different (sets of) simple features to compare read performance: one large and simple line, one data set with many small lines, and one multi-part line containing many sub-lines:

  1. single LINESTRING with many points: a single LINESTRING with one million nodes (points) is read into a single simple feature
  2. many LINESTRINGs with few points: half a million simple features of type LINESTRING are read, each having two nodes (points)
  3. single MULTILINESTRING with many short lines: a single simple feature of type MULTILINESTRING is read, consisting of half a million line segments, each line segment consisting of two points.

A reproducible demo-script is found in the sf package here, and can be run by

devtools::install_github("edzer/sfr")
demo(bm_wkb)

Reported run times are in seconds, and were obtained by system.time().

single LINESTRING with many points

expression                         user  system  elapsed
sf::st_as_sfc(.)                  0.032   0.000    0.031
sf::st_as_sfc(., pureR = TRUE)    0.096   0.012    0.110
wkb::readWKB(.)                   8.276   0.000    8.275

We see that for this case both sf implementations are comparable; this is due to the fact that the whole line of 16 Mb is read into R with a single readBin call: C++ can’t do this much faster.

I suspect wkb::readWKB is slower here because instead of reading a complete matrix in one step it makes a million calls to readPoint, and then merges the points read in R. This adds a few million function calls. Since only a single Line is created, not much overhead from sp can take place here.

Function calls, as John Chambers explains in Extending R, have a constant overhead of about 1000 instructions. Having lots of them may become expensive, if each of them does relatively little.
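
A rough, purely illustrative way to feel that overhead: a trivial function called a million times spends its time almost entirely on the calls themselves.

f <- function(x) x
system.time(for (i in seq_len(1e6)) f(i))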

many LINESTRINGs with few points

expression                         user  system  elapsed
sf::st_as_sfc(.)                  1.244   0.000    1.243
sf::st_as_sfc(., pureR = TRUE)   55.004   0.056   55.063
wkb::readWKB(.)                 257.092   0.192  257.291

Here we see a strong performance gain of the C++ implementation: all the object creation is done in C++, without R function calls. wkb::readWKB slowness may be largely due to overhead caused by sp: creating Line and Lines objects, object validation, computing bounding box.

I made the C++ and “pureR” implementations considerably faster by moving the bounding box calculation to C++. The C++ implementation was further optimized by moving the type check to C++: if a mix of types is read from a set of WKB objects, sfc will coerce them to a single type (e.g., a set of LINESTRING and MULTILINESTRING geometries will be coerced to all MULTILINESTRING).

single MULTILINESTRING with many short lines

expression                         user  system  elapsed
sf::st_as_sfc(.)                  0.348   0.000    0.348
sf::st_as_sfc(., pureR = TRUE)   24.088   0.008   24.100
wkb::readWKB(.)                  87.072   0.004   87.074

Here we see again the cost of function calls: both “pureR” in sf and wkb::readWKB are much slower due to the many function calls; the latter also due to object management and validation in sp.

Discussion

Reading well-known binary spatial data into R can be done pretty elegantly in pure R, but in many scenarios it is much faster using C++. We observe speed gains of up to a factor of 250.

Book review: Extending R

Wed, 08/17/2016 - 02:00

“Extending R”, by John M. Chambers; Paperback $69.95, May 24, 2016 by Chapman and Hall/CRC; 364 Pages - 7 B/W Illustrations;

R is a free software environment for statistical computing and graphics. It started as a free implementation of the S language, which was back then commercially available as S-Plus, and has over the past decade or so become the lingua franca of statistics, the main language people use to communicate statistical computation. R’s popularity stems partly from the fact that it is free and open source, and partly from the fact that it is easily extensible: through add-on packages that follow a clearly defined structure, new statistical ideas can be implemented, shared, and used by others. Using R, the computational aspects of research can be communicated in a reproducible way, understood by a large audience.

John Chambers is (co-)author of the four leading – “brown”, “blue”, “white”, “green” – books, written between 1984 and 1998, that describe the S language as it evolved and as it is now. He has designed it, implemented it, and improved it in all its phases. Being part of the R core team, he is author of the methods package, part of every R installation, providing the S4 approach to object orientation.

This book, Extending R, appeared as a volume in “The R Series”. The book is organized in four parts:

  1. Understanding R,
  2. Programming with R,
  3. Object-oriented programming, and
  4. Interfaces.

The first part starts with explaining three principles underlying R:

  • Everything that exists in R is an object
  • Everything that happens in R is a function call
  • Interfaces to other software are part of R.

These principles form the basis for parts II, III and IV. The first chapter introduces them. Chapter two, “Evolution”, describes the history of the S language, from its earliest days to today: the coming and going of S-Plus, the arrival of R and its dominance today. It also describes the evolution of functional S, and the evolution of object-oriented programming in S. Chapter 3, “R in action”, explains a number of basics of R, such as how function calls work, how objects are implemented, and how the R evaluator works.

Part II, “Programming with R”, discusses functions in depth, explains what objects are and how they are managed, and explains what extension packages do to the R environment. It discusses small, medium and large programming exercises, and what they demand.

Part III, “Object-oriented programming”, largely focuses on the difference between functional object oriented programming (as implemented in S4) and encapsulated object oriented programming as implemented in reference classes (similar to C++ and java), and shows examples for which purpose each paradigm is most useful.

Part IV, “Interfaces”, explains the potential and challenges of interfacing R with other programming languages. It discusses several of such interfaces, and describes a general framework for creating such interfaces. As instances of this framework it provides interfaces to the Python and Julia languages, and discusses the existing Rcpp framework.

For who was this book written? It is clearly not an introductory text, nor a how-to or hands-on book for learning how to program R or write R packages, and it refers to the two volumes Advanced R and R packages, both written by Hadley Wickham. For those with a bit of experience with R programming and a general interest in the language, this book may give a number of new insights and a deeper, often evolutionary motivated understanding.

Not surprisingly, the book also gives clear advice on how software development should take place: object-oriented with formally defined classes (S4 or reference classes), and it argues why this is a good idea. One of these arguments is the ability to do method dispatch based on more than one argument. This needs all arguments to be evaluated, and does not work well with non-standard evaluation. Many R packages currently promoted by Hadley Wickham and many others (“tidyverse”) often favor non-standard evaluation, and restrict themselves to S3. I think that both arguments have some merit, and would look forward to a good user study that compares the usability of the two approaches.

Measurement units in R now simplify

Tue, 08/16/2016 - 02:00

[view raw Rmd]

I wrote earlier about the units R package in this blog post. Last weekend I was happily surprised by two large pull requests (1, 2), from Thomas Mailund. He discusses his contribution in this blog.

Essentially, the pull requests enable

  • the handling and definition of user-defined units in R, and
  • automatic simplification of units
How it works

Units now have to be created explicitly, e.g. by

library(units)
m = make_unit("m")
s = make_unit("s")
(a = 1:10 * m/s)
## Units: m/s
##  [1]  1  2  3  4  5  6  7  8  9 10

The units of the udunits2 package are no longer loaded automatically; they are in a database (list) called ud_units, which is lazy-loaded, so after

rm("m", "s")

two clean solutions to use them are either

(a = 1:10 * ud_units$m / ud_units$s) ## Units: m/s ## [1] 1 2 3 4 5 6 7 8 9 10

or

(with(ud_units, a <- 1:10 * m / s)) ## Units: m/s ## [1] 1 2 3 4 5 6 7 8 9 10

and one much less clean solution is to first attach the whole database:

attach(ud_units)
## The following object is masked _by_ .GlobalEnv:
##
##     a
## The following object is masked from package:datasets:
##
##     npk
## The following objects are masked from package:base:
##
##     F, T
(a = 1:10 * m / s)
## Units: m/s
##  [1]  1  2  3  4  5  6  7  8  9 10

Simplification

Simplification not only works when identical units appear in both numerator and denominator:

a = 1:10 * m / s a * (10 * s) ## Units: m ## [1] 10 20 30 40 50 60 70 80 90 100

but also when a unit in the numerator and denominator are convertible:

a = 1:10 * m / s
a * (10 * min)
## Units: m
##  [1]  600 1200 1800 2400 3000 3600 4200 4800 5400 6000
a / (0.1 * km)
## Units: 1/s
##  [1] 0.01 0.02 0.03 0.04 0.05 0.06 0.07 0.08 0.09 0.10

New units

New units can be created on the fly, and are simplified:

apple = make_unit("apple")
euro = make_unit("euro")
(nr = c(5, 10, 15) * apple)
## Units: apple
## [1]  5 10 15
(cost_per_piece = 0.57 * euro / apple)
## 0.57 euro/apple
(cost = nr * cost_per_piece)
## Units: euro
## [1] 2.85 5.70 8.55

Limitations

Two limitations of the current implementation are

  1. automatic conversion of user-implemented units into other user-defined units or to and from units in the ud_units database is not supported,
  2. non-integer powers are no (longer) supported.

Simple features for R, part 2

Mon, 07/18/2016 - 02:00

[view raw Rmd]

What happened so far?
  • in an earlier blog post I introduced the idea of having simple features mapped directly into simple R objects
  • an R Consortium ISC proposal to implement this got granted
  • during UseR! 2016 I presented this proposal (slides), which we followed up with an open discussion on future directions
  • first steps to implement this in the sf package have finished, and are described below

This blog post describes current progress.

Install & test

You can install package sf directly from github:

library(devtools) # maybe install first?
install_github("edzer/sfr", ref = "16e205f54976bee75c72ac1b54f117868b6fafbc")

if you want to try out read.sf, which reads through GDAL 2.0, you also need my fork of the R package rgdal2, installed by

install_github("edzer/rgdal2")

this, obviously, requires that GDAL 2.0 or later is installed, along with development files.

After installing, a vignette contains some basic operations, and is shown by

library(sf)
vignette("basic")

How does it work?

Basic design ideas and constraints have been written in this document.

Simple features are one of the following 17 types: Point, LineString, Polygon, MultiPoint, MultiLineString, MultiPolygon, GeometryCollection, CircularString, CompoundCurve, CurvePolygon, MultiCurve, MultiSurface, Curve, Surface, PolyhedralSurface, TIN, and Triangle. Each type can have 2D points (XY), 3D points (XYZ), 2D points with measure (XYM) and 3D points with measure (XYZM). This leads to 17 x 4 = 68 combinations.

The first seven of these are most common, and have been implemented, allowing for XY, XYZ, XYM and XYZM geometries.

Simple feature instances: sfi

A single simple feature is created by calling the constructor function, along with a modifier in case a three-dimensional geometry has measure “M” as its third dimension:

library(sf)
POINT(c(2,3))
## [1] "POINT(2 3)"
POINT(c(2,3,4))
## [1] "POINT Z(2 3 4)"
POINT(c(2,3,4), "M")
## [1] "POINT M(2 3 4)"
POINT(c(2,3,4,5))
## [1] "POINT ZM(2 3 4 5)"

what is printed is a well-known text representation of the object; the data itself is however stored as a regular R vector or matrix:

str(POINT(c(2,3,4), "M"))
## Classes 'POINT M', 'sfi'  num [1:3] 2 3 4
str(LINESTRING(rbind(c(2,2), c(3,3), c(3,2))))
## LINESTRING [1:3, 1:2] 2 3 3 2 3 2
## - attr(*, "class")= chr [1:2] "LINESTRING" "sfi"

By using the two simple rules that

  1. sets of points are kept in a matrix
  2. other sets are kept in a list

we end up with the following structures, with increasing complexity:

Sets of points (matrix):

str(LINESTRING(rbind(c(2,2), c(3,3), c(3,2))))
## LINESTRING [1:3, 1:2] 2 3 3 2 3 2
## - attr(*, "class")= chr [1:2] "LINESTRING" "sfi"
str(MULTIPOINT(rbind(c(2,2), c(3,3), c(3,2))))
## MULTIPOINT [1:3, 1:2] 2 3 3 2 3 2
## - attr(*, "class")= chr [1:2] "MULTIPOINT" "sfi"

Sets of sets of points:

str(MULTILINESTRING(list(rbind(c(2,2), c(3,3), c(3,2)), rbind(c(2,1),c(0,0)))))
## List of 2
##  $ : num [1:3, 1:2] 2 3 3 2 3 2
##  $ : num [1:2, 1:2] 2 0 1 0
##  - attr(*, "class")= chr [1:2] "MULTILINESTRING" "sfi"
outer = matrix(c(0,0,10,0,10,10,0,10,0,0), ncol=2, byrow=TRUE)
hole1 = matrix(c(1,1,1,2,2,2,2,1,1,1), ncol=2, byrow=TRUE)
hole2 = matrix(c(5,5,5,6,6,6,6,5,5,5), ncol=2, byrow=TRUE)
str(POLYGON(list(outer, hole1, hole2)))
## List of 3
##  $ : num [1:5, 1:2] 0 10 10 0 0 0 0 10 10 0
##  $ : num [1:5, 1:2] 1 1 2 2 1 1 2 2 1 1
##  $ : num [1:5, 1:2] 5 5 6 6 5 5 6 6 5 5
##  - attr(*, "class")= chr [1:2] "POLYGON" "sfi"

Sets of sets of sets of points:

pol1 = list(outer, hole1, hole2)
pol2 = list(outer + 12, hole1 + 12)
pol3 = list(outer + 24)
mp = MULTIPOLYGON(list(pol1, pol2, pol3))
str(mp)
## List of 3
##  $ :List of 3
##   ..$ : num [1:5, 1:2] 0 10 10 0 0 0 0 10 10 0
##   ..$ : num [1:5, 1:2] 1 1 2 2 1 1 2 2 1 1
##   ..$ : num [1:5, 1:2] 5 5 6 6 5 5 6 6 5 5
##  $ :List of 2
##   ..$ : num [1:5, 1:2] 12 22 22 12 12 12 12 22 22 12
##   ..$ : num [1:5, 1:2] 13 13 14 14 13 13 14 14 13 13
##  $ :List of 1
##   ..$ : num [1:5, 1:2] 24 34 34 24 24 24 24 34 34 24
##  - attr(*, "class")= chr [1:2] "MULTIPOLYGON" "sfi"

Sets of sets of sets of sets of points:

str(GEOMETRYCOLLECTION(list(MULTIPOLYGON(list(pol1, pol2, pol3)), POINT(c(2,3)))))
## List of 2
##  $ :List of 3
##   ..$ :List of 3
##   .. ..$ : num [1:5, 1:2] 0 10 10 0 0 0 0 10 10 0
##   .. ..$ : num [1:5, 1:2] 1 1 2 2 1 1 2 2 1 1
##   .. ..$ : num [1:5, 1:2] 5 5 6 6 5 5 6 6 5 5
##   ..$ :List of 2
##   .. ..$ : num [1:5, 1:2] 12 22 22 12 12 12 12 22 22 12
##   .. ..$ : num [1:5, 1:2] 13 13 14 14 13 13 14 14 13 13
##   ..$ :List of 1
##   .. ..$ : num [1:5, 1:2] 24 34 34 24 24 24 24 34 34 24
##   ..- attr(*, "class")= chr [1:2] "MULTIPOLYGON" "sfi"
##  $ :Classes 'POINT', 'sfi'  num [1:2] 2 3
##  - attr(*, "class")= chr [1:2] "GEOMETRYCOLLECTION" "sfi"

This is of course a worst case: GEOMETRYCOLLECTION objects with simpler elements have less nesting.
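For instance, a collection holding only a point and a linestring stays shallow. The following sketch (hypothetical objects, built with the constructors shown above) illustrates this:

# a GEOMETRYCOLLECTION whose members are themselves simple
gc = GEOMETRYCOLLECTION(list(POINT(c(0, 0)),
                             LINESTRING(rbind(c(0, 0), c(1, 1)))))
# str(gc) should show a two-element list: a length-2 numeric vector
# (the point) and a 2 x 2 matrix (the linestring), with no deeper nesting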

Methods for sfi

The following methods have been implemented for sfi objects:

methods(class = "sfi")
## [1] as.WKT format print
## see '?methods' for accessing help and source code

Alternatives to this implementation
  1. Package rgdal2 reads point sets not into a matrix, but into a list with numeric vectors named x and y. This is closer to the GDAL (OGR) data model, and would allow for easier disambiguation of the third dimension (m or z) in case of three-dimensional points. However, it makes selecting a single point more difficult, and requires validating that the vectors have identical lengths. I’m inclined to keep using a matrix for point sets.
  2. Currently, POINT Z is of class c("POINT Z", "sfi"). An alternative would be to have it derive from POINT, i.e. give it class c("POINT Z", "POINT", "sfi"). This would make it easier to write methods for XYZ, XYM and XYZM geometries, and may be worth trying out (see the sketch after this list).
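A minimal sketch of what that alternative class layout would look like (purely hypothetical; this is not how the package currently constructs objects):

# hypothetical alternative: an XYZ point that also inherits from POINT
p = structure(c(2, 3, 4), class = c("POINT Z", "POINT", "sfi"))
inherits(p, "POINT")
## [1] TRUE
# methods written for POINT would then also dispatch on POINT Z objects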
Simple feature list columns: sfc

Collections of simple features can be added together into a list. If all elements of this list

  • are of identical type (have identical class), or are a mix of X and MULTIX (with X being one of POINT, LINESTRING or POLYGON)
  • have an identical coordinate reference system

then they can be combined into an sfc object. This object

  • converts, if needed, X into MULTIX (this is also what PostGIS does; a small sketch follows the code below),
  • registers the coordinate reference system in attributes epsg and proj4string,
  • has the bounding box in attribute bbox, and updates it after subsetting
ls1 = LINESTRING(rbind(c(2,2), c(3,3), c(3,2)))
ls2 = LINESTRING(rbind(c(5,5), c(4,1), c(1,2)))
sfc = sfc(list(ls1, ls2), epsg = 4326)
attributes(sfc)
## $class
## [1] "sfc"
##
## $type
## [1] "LINESTRING"
##
## $epsg
## [1] 4326
##
## $bbox
## xmin xmax ymin ymax
##    1    5    1    5
##
## $proj4string
## [1] "+init=epsg:4326 +proj=longlat +datum=WGS84 +no_defs +ellps=WGS84 +towgs84=0,0,0"
attributes(sfc[1])
## $class
## [1] "sfc"
##
## $type
## [1] "LINESTRING"
##
## $epsg
## [1] 4326
##
## $bbox
## xmin xmax ymin ymax
##    2    3    2    3
##
## $proj4string
## [1] "+init=epsg:4326 +proj=longlat +datum=WGS84 +no_defs +ellps=WGS84 +towgs84=0,0,0"
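To illustrate the first bullet above, a sketch of mixing POINT and MULTIPOINT geometries in one sfc (hypothetical input; the promotion to MULTIPOINT shown in the comments follows from the rule stated above, not from output reproduced here):

pt = POINT(c(0, 0))
mpt = MULTIPOINT(rbind(c(1, 1), c(2, 2)))
mixed = sfc(list(pt, mpt), epsg = 4326)
# the single POINT is promoted, so the list column has one common type:
attr(mixed, "type")
## expected: [1] "MULTIPOINT"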

The following methods have been implemented for sfc simple feature list columns:

methods(class = "sfc")
## [1] bbox    format  [       summary
## see '?methods' for accessing help and source code

data.frames with simple features: sf

Typical spatial data contain attribute values and attribute geometries. When combined in a table, they can be converted into sf objects, e.g. by

roads = data.frame(widths = c(5, 4.5))
roads$geom = sfc
roads.sf = sf(roads)
roads.sf
##   widths                      geom
## 1    5.0 LINESTRING(2 2, 3 3, 3 2)
## 2    4.5 LINESTRING(5 5, 4 1, 1 2)
summary(roads.sf)
##      widths                 geom
##  Min.   :4.500   LINESTRING   :2
##  1st Qu.:4.625   epsg:4326    :0
##  Median :4.750   +init=epsg...:0
##  Mean   :4.750
##  3rd Qu.:4.875
##  Max.   :5.000
attributes(roads.sf)
## $names
## [1] "widths" "geom"
##
## $row.names
## [1] 1 2
##
## $class
## [1] "sf"         "data.frame"
##
## $sf_column
## geom
##    2
##
## $relation_to_geometry
## widths
##   <NA>
## Levels: field lattice entity

Here, the attribute relation_to_geometry allows documenting how attributes relate to the geometry: are they constant over it (field), aggregated over it (lattice), or do they identify individual entities (buildings, parcels, etc.)?
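For example, the road widths above describe individual road entities. A sketch of recording that (the package may well grow a more formal interface for this; here the attribute is set directly, using the factor levels shown in the output above):

# mark widths as a property of each entity, rather than a field or lattice value
attr(roads.sf, "relation_to_geometry")["widths"] = "entity"
attr(roads.sf, "relation_to_geometry")
## widths
## entity
## Levels: field lattice entity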

The following methods have been implemented for sf objects:

methods(class = "sf")
## [1] geometry
## see '?methods' for accessing help and source code

Coercion to and from sp

Points, MultiPoints, Lines, MultiLines, Polygons and MultiPolygons can be converted between sf and sp, both ways. A round trip is demonstrated by:

df = data.frame(a = 1)
df$geom = sfc(list(mp))
sf = sf(df)
library(methods)
a = as(sf, "Spatial")
class(a)
## [1] "SpatialPolygonsDataFrame"
## attr(,"package")
## [1] "sp"
b = as.sf(a)
all.equal(sf, b) # round-trip sf-sp-sf
## [1] TRUE
a2 = as(b, "Spatial")
all.equal(a, a2) # round-trip sp-sf-sp
## [1] TRUE

Reading through GDAL

Function read.sf works if rgdal2 is installed (see above), and reads simple features through GDAL:

(s = read.sf(system.file("shapes/", package="maptools"), "sids"))[1:5,]
##    AREA PERIMETER CNTY_ CNTY_ID        NAME  FIPS FIPSNO CRESS_ID BIR74
## 0 0.114     1.442  1825    1825        Ashe 37009  37009        5  1091
## 1 0.061     1.231  1827    1827   Alleghany 37005  37005        3   487
## 2 0.143      1.63  1828    1828       Surry 37171  37171       86  3188
## 3  0.07     2.968  1831    1831   Currituck 37053  37053       27   508
## 4 0.153     2.206  1832    1832 Northampton 37131  37131       66  1421
##   SID74 NWBIR74 BIR79 SID79 NWBIR79                               geom
## 0     1      10  1364     0      19 MULTIPOLYGON(((-81.47275543212 ...
## 1     0      10   542     3      12 MULTIPOLYGON(((-81.23989105224 ...
## 2     5     208  3616     6     260 MULTIPOLYGON(((-80.45634460449 ...
## 3     1     123   830     2     145 MULTIPOLYGON(((-76.00897216796 ...
## 4     9    1066  1606     3    1197 MULTIPOLYGON(((-77.21766662597 ...
summary(s)
##       AREA      PERIMETER      CNTY_       CNTY_ID          NAME
##  0.118  : 4   1.307  : 2   1825   : 1   1825   : 1   Alamance : 1
##  0.091  : 3   1.601  : 2   1827   : 1   1827   : 1   Alexander: 1
##  0.143  : 3   1.68   : 2   1828   : 1   1828   : 1   Alleghany: 1
##  0.07   : 2   1.791  : 2   1831   : 1   1831   : 1   Anson    : 1
##  0.078  : 2   0.999  : 1   1832   : 1   1832   : 1   Ashe     : 1
##  0.08   : 2   1      : 1   1833   : 1   1833   : 1   Avery    : 1
##  (Other):84   (Other):90   (Other):94   (Other):94   (Other)  :94
##       FIPS        FIPSNO      CRESS_ID       BIR74        SID74
##  37001  : 1   37001  : 1   1      : 1   1027   : 1   0      :13
##  37003  : 1   37003  : 1   10     : 1   1035   : 1   4      :13
##  37005  : 1   37005  : 1   100    : 1   1091   : 1   1      :11
##  37007  : 1   37007  : 1   11     : 1   11158  : 1   5      :11
##  37009  : 1   37009  : 1   12     : 1   1143   : 1   2      : 8
##  37011  : 1   37011  : 1   13     : 1   1173   : 1   3      : 6
##  (Other):94   (Other):94   (Other):94   (Other):94   (Other):38
##     NWBIR74       BIR79        SID79       NWBIR79              geom
##  736    : 3   10432  : 1   2      :10   1161   : 2   MULTIPOLYGON:100
##  1      : 2   1059   : 1   0      : 9   5      : 2   epsg:NA     :  0
##  10     : 2   1141   : 1   1      : 9   10     : 1
##  1243   : 2   11455  : 1   4      : 9   1023   : 1
##  134    : 2   1157   : 1   5      : 9   1033   : 1
##  930    : 2   1173   : 1   3      : 6   104    : 1
##  (Other):87   (Other):94   (Other):48   (Other):92

This also shows the abbreviation of long geometries when printed or summarized, provided by the format methods.
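Since format is one of the sfi methods listed earlier, the same abbreviation can be requested for a single geometry. A small sketch (the width at which output is truncated is not documented here, so the comment only describes the expected behaviour):

format(mp)
## expected: an abbreviated well-known text string for the MULTIPOLYGON
## built above, rather than the full list of coordinates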

The following works for me, with PostGIS installed and data loaded:

(s = read.sf("PG:dbname=postgis", "meuse2"))[1:5,]
##   zinc                 geom
## 1 1022 POINT(181072 333611)
## 2 1141 POINT(181025 333558)
## 3  640 POINT(181165 333537)
## 4  257 POINT(181298 333484)
## 5  269 POINT(181307 333330)
summary(s)
##       zinc                  geom
##  Min.   : 113.0   POINT        :155
##  1st Qu.: 198.0   epsg:NA      :  0
##  Median : 326.0   +proj=ster...:  0
##  Mean   : 469.7
##  3rd Qu.: 674.5
##  Max.   :1839.0

Still to do/to be decided

The following issues need to be decided upon:

  • reproject sf objects through rgdal2? support well-known-text for CRS? or use PROJ.4 directly?
  • when subsetting attributes from an sf object, should the geometry be sticky (as in sp), or should it be dropped so that a plain data.frame is returned (data.frame behaviour)? (A sketch of the two options follows this list.)
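Neither behaviour is implemented yet; the following sketch, reusing the roads.sf object created above, only spells out what each option would mean (the results in the comments are hypothetical):

# option 1, sticky geometry (sp-like): the geometry column tags along
roads.sf["widths"]  # would still be an sf object containing widths and geom
# option 2, data.frame behaviour: only the requested column is returned
roads.sf["widths"]  # would be a plain one-column data.frame, geometry dropped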

The following things still need to be done:

  • write simple features through GDAL (using rgdal2)
  • using gdal geometry functions in rgdal2
  • extend rgdal2 to also read XYZ, XYM, and XYZM geometries - my feeling is that this will be easier than modifying rgdal
  • reprojection of sf objects
  • link to GEOS, using GEOS functions: GDAL with GEOS enabled (and rgdal2) has some of this, but not for instance rgeos::gRelate
  • develop better and more complete test cases; also check the OGC test suite
  • improve documentation, add tutorial (vignettes, paper)
  • add plot functions (base, grid)
  • explore direct WKB - sf conversion, without GDAL
  • explore how the meaningfulness of operations can be verified once relation_to_geometry has been specified for the attributes

Please let me know if you have any comments, suggestions or questions!
