This is an archive for R-sig-geo, not a forum. It is not possible to post through Nabble - you may not start a new thread nor follow up an existing thread. If you wish to post, but are not subscribed to the list through the list homepage, subscribe first (through the list homepage, not via Nabble) and post from your subscribed email address. Until 2015-06-20, subscribers could post through Nabble, but policy has been changed as too many non-subscribers misunderstood the interface.

Job vacancy for a 'Digital Geographer' at the Oxford Internet Institute

Tue, 05/09/2017 - 01:38
Researcher in Digital Geography
<https://www.recruit.ox.ac.uk/pls/hrisliverecruit/erq_jobspec_details_form.jobspec?p_id=126609>
Oxford Internet Institute, 1 St Giles, Oxford
Grade 7: £31,076 – £38,183 p.a. (pro rata for part-time)

The Oxford Internet Institute is a leading centre for research into
individual, collective and institutional behaviour on the Internet. We are
looking for a Researcher to work with Professor Mark Graham on a two-year
project to better understand the geographies of the Internet.

Our existing research has uncovered highly uneven digital geographies, with
some parts of the world far more likely to produce, and be represented by,
digital content than others.

We seek to hire a Researcher to continue some of this research, to ask what
has changed, and to ask new questions about digital inequalities at not
just the global, but also the local scale. We plan to ask and answer
questions such as: what are the contemporary geographies of the production
and consumption of digital knowledge-based economic activities?; what are
the geographies of digital representations (such as content in Wikipedia or
Google)?; how likely is digital content to be locally or non-locally
produced?; and do digital representations produce or reproduce social and
economic inequalities and divisions in our urban environments? If we accept
that our cities are made up of digital as well as physical raw materials,
then we need to better understand who owns, controls, shapes, can access,
and can remake the digital layers of place.

We plan on answering the above questions using methods from computational
social science and GIS: scraping, mapping, and statistically analysing a
diverse range of datasets. The position is suited to candidates who have
recently completed a doctorate in Quantitative Geography, GIScience,
Computer Science, Economics, Sociology or other relevant discipline (i.e.
postdocs), but we also welcome applications from qualified individuals
without a doctorate (e.g. candidates with industry experience). Programming
skills and experience with GIS are required. The successful candidate will
ultimately work with Professor Graham to produce a full-length monograph on
the topic.

Based at the Oxford Internet Institute, this position is available
immediately for 24 months in the first instance, with the possibility of
renewal thereafter, funding permitting.

This job advertisement is for a full-time researcher, but part-time
applications of more than 22.5 hours a week will also be considered.

*Recruitment link and further details about the position:*
https://www.recruit.ox.ac.uk/pls/hrisliverecruit/erq_jobspec_details_form.jobspec?p_id=126609

*Other relevant links:*
http://www.markgraham.space/blog/2017/5/5/work-with-me-at-oxford-im-hiring-a-digital-geographer
http://www.markgraham.space/internet-information-geography/
http://geography.oii.ox.ac.uk/
http://geonet.oii.ox.ac.uk/mapping/
http://cii.oii.ox.ac.uk/


/apologies for cross-posting
------------------------------------------
Mark Graham

Professor of Internet Geography
Oxford Internet Institute
University of Oxford

Faculty Fellow
The Alan Turing Institute

Research Affiliate
School of Geography and the Environment
University of Oxford

markgraham.space | @geoplace <http://twitter.com/geoplace>


_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

mapview 2.0.1 on CRAN

Mon, 05/08/2017 - 01:03
Dear list,

we released mapview version 2.0.1 to CRAN yesterday. Windows
binaries have already been built and macOS binaries should follow soon.

As indicated by the major version bump, this release entails a few major
changes. The most user-relevant ones are:

1. All vector data processing is now done internally using the simple
features (sf) package. Apart from providing full support for the most common
sf types, this also means that mapview now returns data of class sf/sfc
even when passing an sp object to mapview. I am unsure whether anyone
ever uses the @object slot of mapview, but in case people do, this is a
major change.
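To illustrate point 1, an untested sketch (the meuse data from sp stand in for real data, and the list structure of the @object slot is assumed):

```{r}
# With mapview >= 2.0, passing an sp object still works, but the
# object stored internally is of class sf.
library(sp)
library(mapview)

data(meuse, package = "sp")
coordinates(meuse) <- ~x+y            # promote to SpatialPointsDataFrame
proj4string(meuse) <- CRS("+init=epsg:28992")

m <- mapview(meuse)
class(m@object[[1]])                  # should now inherit from "sf", despite the sp input
```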

2. The GitHub repository has been moved from
environmentalinformatics-marburg to
https://github.com/r-spatial/mapview. To install the development version,
use devtools::install_github("r-spatial/mapview@develop").

3. There are new vector data sets (breweries, trails, franconia). The
old data sets (breweries91, gadmCHE, atlStorms2005) have been moved to
library(leaflet), so examples using the old data should hopefully still
work.

All changes are outlined here (still listed as version 1.2.79):

https://r-spatial.github.io/mapview/news/index.html

We also have new online documentation, which can be found here:

https://r-spatial.github.io/mapview/

As always, feedback, suggestions, feature requests and bug reports
should be filed at https://github.com/r-spatial/mapview/issues

Happy mapping,
Tim

--
Tim Appelhans
Research Specialist, Geomarketing
GfK | Bamberger Str. 6, 90425 Nürnberg | Germany
Postal address: Nordwestring 101 | 90419 Nürnberg


Appropriate variogram model

Sun, 05/07/2017 - 23:33
Hi,
Can anyone suggest the most appropriate variogram model to fit to my data?
I have tried three different models and the results are quite poor. A comparison of the different variogram models is attached.
Thanks, Ankur
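Without seeing the data it is hard to recommend a model, but one generic gstat approach is to fit several candidate models and compare the fitting criterion; a hedged sketch with the bundled meuse data standing in for the poster's:

```{r}
# Fit spherical, exponential, Gaussian and Matern models to one sample
# variogram and compare the weighted sum of squared errors of each fit.
library(sp)
library(gstat)

data(meuse)
coordinates(meuse) <- ~x+y

v      <- variogram(log(zinc) ~ 1, meuse)
models <- c("Sph", "Exp", "Gau", "Mat")
fits   <- lapply(models, function(m) fit.variogram(v, vgm(model = m)))
setNames(sapply(fits, function(f) attr(f, "SSErr")), models)  # smaller = better fit
```

A small fitting criterion alone does not guarantee a sensible model; plotting each fit over the sample variogram is still worthwhile.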
Attachment: Different models.pdf (38K)

Re: extract lat-long pairs - slot "coords"

Sun, 05/07/2017 - 04:33

I was able to get the coords to print to the R console. I still need to
read the docs on the Polygon class to access them programmatically
(it's not Golan$coords).

```{r}
# download and extract the GADM level-1 admin boundaries for Israel
# (extraction path assumed to match the readOGR() call below)
download.file("http://biogeo.ucdavis.edu/data/diva/adm/ISR_adm.zip",
              "ISR_adm.zip")
unzip("ISR_adm.zip", exdir = "../includes/ISR_adm")
```

```{r}
library(rgdal)  # provides readOGR()
states <- readOGR("../includes/ISR_adm/ISR_adm1.shp", "ISR_adm1")
Golan <- subset(states, NAME_1 == "Golan")
## check
## based on http://stackoverflow.com/questions/25985395/how-to-subset-a-shapefile
dim(states)
dim(Golan)
Golan

```
An object of class "Polygon"
Slot "labpt":
[1] 35.64728 33.24579

Slot "area":
[1] 6.914819e-05

Slot "hole":
[1] FALSE

Slot "ringDir":
[1] 1

Slot "coords":
          [,1]     [,2]
 [1,] 35.63564 33.24520
 [2,] 35.63712 33.24554
 [3,] 35.64099 33.24652
 [4,] 35.64528 33.24749
 [5,] 35.65008 33.24818
 [6,] 35.65404 33.24831
 [7,] 35.65889 33.24823
 [8,] 35.65992 33.24816
 [9,] 35.65163 33.24443
[10,] 35.64420 33.24282
[11,] 35.63676 33.24432
[12,] 35.63564 33.24520
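For reference, the slots printed above can be reached programmatically with slot() (or the @ operator); a sketch assuming Golan is the SpatialPolygonsDataFrame subset created earlier:

```{r}
# Drill down the sp class hierarchy: SpatialPolygonsDataFrame -> list of
# Polygons -> list of Polygon -> coords matrix (first feature, first ring).
p  <- slot(slot(Golan, "polygons")[[1]], "Polygons")[[1]]
xy <- slot(p, "coords")   # equivalently: Golan@polygons[[1]]@Polygons[[1]]@coords
head(xy)                  # two columns: longitude, latitude
```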


Re: extract lat-long pairs from sf::st_read MULTIPOLYGON format

Sun, 05/07/2017 - 03:17
Brandon,

sf has sf::st_coordinates which gives a matrix or list of matrices of X
and Y coordinates depending on the feature type.
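A minimal illustration (using the North Carolina shapefile shipped with sf, which is also a MULTIPOLYGON layer like the poster's):

```{r}
library(sf)
nc <- st_read(system.file("shape/nc.shp", package = "sf"), quiet = TRUE)
xy <- st_coordinates(nc)      # matrix with X, Y plus L1/L2/L3 index columns
head(xy[, c("X", "Y")])       # the two coordinate columns
```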

Tim


On 07.05.2017 10:17, Brandon Payne wrote:
> Roman,
>
> I am looking for advice on how to convert one region (one that is missing / not included in the package) from the shape file
> "ISR_adm1.shp" (from "http://biogeo.ucdavis.edu/data/diva/adm/ISR_adm.zip")
>
> into the data frame format of https://github.com/trulia/choroplethrAdmin1/blob/master/data/admin1.map.rdata.
>
--
Tim Appelhans
Research Specialist, Geomarketing
GfK | Bamberger Str. 6, 90425 Nürnberg | Germany
Postal address: Nordwestring 101 | 90419 Nürnberg


Re: extract lat-long pairs from sf::st_read MULTIPOLYGON format

Sun, 05/07/2017 - 03:15

Roman,

I am looking for advice on how to convert one region (one that is missing / not included in the package) from the shape file
"ISR_adm1.shp" (from "http://biogeo.ucdavis.edu/data/diva/adm/ISR_adm.zip")

into the data frame format of https://github.com/trulia/choroplethrAdmin1/blob/master/data/admin1.map.rdata.


Re: extract lat-long pairs from sf::st_read MULTIPOLYGON format

Sun, 05/07/2017 - 02:49
After downloading
https://github.com/trulia/choroplethrAdmin1/blob/master/data/admin1.map.rdata
and

load("admin1.map.rdata")
ilmap<-admin1.map[which(admin1.map$admin == "slovenia"), ]


I get a data.frame

> names(ilmap)
 [1] "long"       "lat"        "order"      "hole"       "piece"      "group"
 [7] "id"         "adm1_code"  "OBJECTID_1" "diss_me"    "adm1_cod_1" "iso_3166_2"
[13] "wikipedia"  "iso_a2"     "adm0_sr"    "name"       "name_alt"   "name_local"
[19] "type"       "type_en"    "code_local" "code_hasc"  "note"       "hasc_maybe"
[25] "region"     "region_cod" "provnum_ne" "gadm_level" "check_me"   "scalerank"
[31] "datarank"   "abbrev"     "postal"     "area_sqkm"  "sameascity" "labelrank"
[37] "featurecla" "name_len"   "mapcolor9"  "mapcolor13" "fips"       "fips_alt"
[43] "woe_id"     "woe_label"  "woe_name"   "latitude"   "longitude"  "sov_a3"
[49] "adm0_a3"    "adm0_label" "admin"      "geonunit"   "gu_a3"      "gn_id"
[55] "gn_name"    "gns_id"     "gns_name"   "gn_level"   "gn_region"  "gn_a1_code"
[61] "region_sub" "sub_code"   "gns_level"  "gns_lang"   "gns_adm1"   "gns_region"


Notice the first two columns. If this isn't what you're after, perhaps try
coordinates().
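For completeness, coordinates() applies to sp objects rather than plain data frames; a small sketch (meuse data standing in):

```{r}
library(sp)
data(meuse)
coordinates(meuse) <- ~x+y    # promote the data frame to a SpatialPointsDataFrame
head(coordinates(meuse))      # two-column matrix of x/y coordinates
```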

Cheers,
Roman


On Sat, May 6, 2017 at 10:56 PM, Brandon Payne <[hidden email]>
wrote:

>
> ## I am trying to modify the
>
> install_github("choroplethrAdmin1", "arilamstein")
>
> package.  It can show a map by
>
> choroplethrAdmin1::admin1_map("israel")
>
> The shape data is hidden in .Rdata (principle of encapsulation),
> so to get at it I had to
>
> ```{bash}
> git clone [hidden email]:arilamstein/choroplethrAdmin1.git
> cd chor*
> ```
> ```{r}
> load("/Users/AbuDavid/scratch/R/choroplethrAdmin1/data/
> admin1.regions.rdata")
> ilmap<-admin1.map[which(admin1.map$admin == name), ]
> View(ilmap)
> #write.csv(ilmap, file = "israel.csv", row.names=FALSE)
>
> ```
>
> the format of this data looks like
>
> 35.13070357     32.70585928     israel  haifa district  1879.1  272218
> FALSE   1       1879
> 35.12837813     32.70314627     israel  haifa district  1879.1  272219
> FALSE   1       1879
> 35.12574263     32.7008725      israel  haifa district  1879.1  272220
> FALSE   1       1879
> 35.12419234     32.69989065     israel  haifa district  1879.1  272221
> FALSE   1       1879
> 35.12248701     32.69919302     israel  haifa district  1879.1  272222
> FALSE   1       1879
> 35.12078169     32.69911551     israel  haifa district  1879.1  272223
> FALSE   1       1879
> 35.02130456     32.38117626     israel  central district        1880.1
> 272295  FALSE   1       1880
> 35.0327303      32.38220026     israel  central district        1880.1
> 272296  FALSE   1       1880
> 35.02879684     32.36515737     israel  central district        1880.1
> 272297  FALSE   1       1880
> 35.02156213     32.34459015     israel  central district        1880.1
> 272298  FALSE   1       1880
> 35.01784143     32.34221303     israel  central district        1880.1
> 272299  FALSE
> 1       1880
>
> ## Question:
>
> How can I get the data from my shape file as 2 columns: c(lat, long) ?
>
> I don't mind using something other than sf::st_read.
>
> ```{r importshapes}
> editedmap <- sf::st_read("../includes/ISR_adm_edited/ISR_adm1.shp")
> golan <- editedmap[which(editedmap$NAME_1 == "Golan"), ]
> names(editedmap)
> ```
>
> > names(editedmap)
>  [1] "ID_0"      "ISO"       "NAME_0"    "ID_1"      "NAME_1"    "TYPE_1"
>
>  [7] "ENGTYPE_1" "NL_NAME_1" "VARNAME_1" "geometry"
>
>


--
In God we trust, all others bring data.



extract lat-long pairs from sf::st_read MULTIPOLYGON format

Sat, 05/06/2017 - 15:55

## I am trying to modify the

install_github("choroplethrAdmin1", "arilamstein")

package.  It can show a map by

choroplethrAdmin1::admin1_map("israel")

The shape data is hidden in .Rdata (principle of encapsulation),
so to get at it I had to

```{bash}
git clone [hidden email]:arilamstein/choroplethrAdmin1.git
cd chor*
```
```{r}
load("/Users/AbuDavid/scratch/R/choroplethrAdmin1/data/admin1.regions.rdata")
ilmap<-admin1.map[which(admin1.map$admin == name), ]
View(ilmap)
#write.csv(ilmap, file = "israel.csv", row.names=FALSE)

```

The format of this data looks like:

35.13070357 32.70585928 israel haifa district 1879.1 272218 FALSE 1 1879
35.12837813 32.70314627 israel haifa district 1879.1 272219 FALSE 1 1879
35.12574263 32.7008725 israel haifa district 1879.1 272220 FALSE 1 1879
35.12419234 32.69989065 israel haifa district 1879.1 272221 FALSE 1 1879
35.12248701 32.69919302 israel haifa district 1879.1 272222 FALSE 1 1879
35.12078169 32.69911551 israel haifa district 1879.1 272223 FALSE 1 1879
35.02130456 32.38117626 israel central district 1880.1 272295 FALSE 1 1880
35.0327303 32.38220026 israel central district 1880.1 272296 FALSE 1 1880
35.02879684 32.36515737 israel central district 1880.1 272297 FALSE 1 1880
35.02156213 32.34459015 israel central district 1880.1 272298 FALSE 1 1880
35.01784143 32.34221303 israel central district 1880.1 272299 FALSE
1 1880

## Question:

How can I get the data from my shape file as two columns, c(lat, long)?

I don't mind using something other than sf::st_read.

```{r importshapes}
editedmap <- sf::st_read("../includes/ISR_adm_edited/ISR_adm1.shp")
golan <- editedmap[which(editedmap$NAME_1 == "Golan"), ]
names(editedmap)
```

> names(editedmap)
 [1] "ID_0"      "ISO"       "NAME_0"    "ID_1"      "NAME_1"    "TYPE_1"  

 [7] "ENGTYPE_1" "NL_NAME_1" "VARNAME_1" "geometry"


KrigeST function giving errors

Fri, 05/05/2017 - 18:22
Hi,

If I apply a variogram model in the krigeST function, I get the following error:

> DE_kriged <- krigeST(count~1, data=stfdf, newdata=stf, modelList=sumMetricVgm)
Error in chol.default(A) :
  the leading minor of order 16 is not positive definite
In addition: Warning message:
In krigeST(count ~ 1, data = stfdf, newdata = stf, modelList = sumMetricVgm) :
  The spatio-temporal variogram model does not carry a time unit attribute:
  krigeST cannot check whether the temporal distance metrics coincide.

I have tried applying different variogram models and different points.
https://drive.google.com/open?id=0B3tdiSZ9hBj4dVZXOXItUjdrRlE
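The chol() error means the covariance matrix A assembled from the variogram model is not positive definite (often caused by duplicate locations or an invalid fitted model). A generic, gstat-independent illustration of that failure mode:

```{r}
# A symmetric matrix is positive definite only if all eigenvalues are > 0.
A <- matrix(c(4, 2,
              2, 1), nrow = 2, byrow = TRUE)   # singular: eigenvalues 5 and 0
eigen(A, symmetric = TRUE, only.values = TRUE)$values
try(chol(A))   # expected to fail: leading minor not positive definite
```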

Best, Ankur


Re: Cross-validation for kriging in R (package geoR): how to include the trend while reestimate is TRUE?

Thu, 05/04/2017 - 04:55
Dear Patrick,

Thanks a lot for your quick reply and taking the time for digging deeper into the function! I will contact the package authors about the issue.

Best, Vera

---

V.M. (Vera) van Zoest, MSc | PhD candidate |
University of Twente<http://www.utwente.nl/> | Faculty ITC<http://www.itc.nl/> | Department Earth Observation Science (EOS)<https://www.itc.nl/EOS> |
ITC Building, room 2-038 | T: +31 (0)53 – 489 4412 | [hidden email]<mailto:[hidden email]> |

Study Geoinformatics: www.itc.nl/geoinformatics<http://www.itc.nl/geoinformatics>


From: Patrick Schratz [mailto:[hidden email]]
Sent: maandag 1 mei 2017 20:04
To: [hidden email]; Zoest, V.M. van (ITC) <[hidden email]>
Subject: Re: [R-sig-Geo] Cross-validation for kriging in R (package geoR): how to include the trend while reestimate is TRUE?

Hi Vera,

I started debugging a bit; the error is in line 192 of the `xvalid()` function, which uses a subfunction `cv.f`:

    res <- as.data.frame(t(apply(matrix(locations.xvalid),
      1, cv.f)))

which then makes the call to `variog()` in lines 125-141.
Here, the error appears because coords (created in line 90) seems to be of a different length than trend (which is taken from your provided variog.obj):

if (is.null(variog.obj))
            stop("xvalid: when argument reestimate = TRUE an object with the fitted variogram model must be provided in the argument variog.obj ")
          CVvar <- variog(coords = cv.coords, data = cv.data,
            uvec = variog.obj$uvec, trend = variog.obj$trend,
            lambda = variog.obj$lambda, option = variog.obj$output.type,
            estimator.type = variog.obj$estimator.type,
            nugget.tolerance = variog.obj$nugget.tolerance,
            max.dist = max(variog.obj$u), pairs.min = 2,
            bin.cloud = FALSE, direction = variog.obj$direction,
            tolerance = variog.obj$tolerance, unit.angle = "radians",
            messages = FALSE, ...)
          CVmod <- variofit(vario = CVvar, ini.cov.pars = model$cov.pars,
            cov.model = model$cov.model, fix.nugget = model$fix.nugget,
            nugget = model$nugget, fix.kappa = model$fix.kappa,
            kappa = model$kappa, max.dist = model$max.dist,
            minimisation.function = model$minimisation.function,
            weights = model$weights, messages = FALSE,
            ...)

Because we are debugging somewhat deep here and the issue might be solved quickly by contacting the package authors (they should get it working quickly since they provide the option `reestimate = TRUE`), I would try that first before doing any more detailed inspection of the error.

Cheers, Patrick

PhD Student at Department of Geography - GIScience group
Friedrich-Schiller-University Jena, Germany
Tel.: +49-3641-9-48973
Web: https://pat-s.github.io<https://pat-s.github.io/>

On 1. May 2017, 16:31 +0200, wrote:


http://stackoverflow.com/questions/43520716/cross-validation-for-kriging-in-r-how-to-include-the-trend-while-reestimating-t



Re: sp::disaggregate() does not seem to add the necessary "comment" attribute to identify holes if Polygons does not contain exactly 1 polygon with 0 or 1 holes

Mon, 05/01/2017 - 13:42
Roger,

Works for me now.  Thanks for your patience with this.

I'll definitely look at sf for future projects, but for now, I just want to
analyze my darn data!  :)

 - Anne
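For anyone hitting the same gTouches() error before the fixed sp ships, the manual workaround Anne describes further down (rebuilding the comment on every Polygons object) can be sketched roughly as follows (object name `spdf` is assumed):

```{r}
# Regenerate the hole-ownership "comment" attribute for each Polygons
# object of a SpatialPolygonsDataFrame `spdf` (requires rgeos).
library(rgeos)
for (i in seq_along(spdf@polygons)) {
  comment(spdf@polygons[[i]]) <- createPolygonsComment(spdf@polygons[[i]])
}
```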


On 05/01/2017 05:23 AM, Roger Bivand wrote:
> On Sun, 30 Apr 2017, Anne C. Hanna wrote:
>
>> Okay, I think I've tracked it down.  This is... absurdly complicated and I
>> don't know what the actual correct resolution is, but I'm pretty sure that at
>> least I know what's going wrong.
>
> Good, and thanks for staying with this. I've pushed a fix to my github repo,
> and pull request #29 to edzer/sp.
>
> Note that this also works out of the box:
>
> library(sf)
> sf <- st_read(dsn = "2011 Voting District Boundary Shapefiles", layer =
> "MCDS", stringsAsFactors = FALSE)
> sfd <- st_cast(sf, "POLYGON")
>
> However, using sf::st_touches() is a bit less intuitive for now than the spdep
> "nb" object generated by spdep::poly2nb(), which also provides Queen (touch at
> one or more boundary points) and Rook (touch at more than one boundary point),
> and sf::st_touches() is seven times slower than spdep::poly2nb() now.
> sf::st_touches() could be speeded up using an STR tree (spdep::poly2nb() can
> do this, but in any case only looks at candidate neighbours with intersecting
> bounding boxes). So:
>
> sp_sfd <- as(sfd, "Spatial")
> library(spdep)
> sp_nb <- poly2nb(sp_sfd)
>
> seems a reasonable workflow to find the adjacencies under Queen/Rook and
> snapping control.
>
> Please try out the disaggregate fix and report back - and consider shifting to
> sf.
>
> Roger
>
>>
>> Basically it comes down to the fact that createSPComment() does not actually
>> check whether each individual polygon in the SpatialPolygons object has a
>> comment (much less a correct comment).  Instead, it looks at a top-level
>> "comment" attribute.  If that top-level comment attribute (i.e. sf@comment for
>> my SpatialPolygonsDataFrame object stored as "sf") is "TRUE", then
>> createSPComment() does nothing.
>>
>> And here's where the second layer of the problem comes in for my dataset ---
>> my original data, pre-disaggregation, has complete and correct comments.  Some
>> of those comments (for objects where it's a single polygon with 0 or 1 holes)
>> are being retained by the basic disaggregation algorithm, while others are
>> not.  Your patch attempts to fill in the holes by running createSPComment() on
>> the final results.  But... the thing you are running createSPComment() on is
>> SpatialPolygons(p), where p is the list of disaggregated polygons.  And the
>> SpatialPolygons() constructor looks at the list of the polygons you feed it,
>> and if *any* of those polygons have comments, it sets the top-level "comment"
>> attribute of its output to "TRUE", even if other polygons are missing
>> comments.  (And it also doesn't check the comments it gets for correctness.)
>> So the fact that some polygon comments are retained in p means that
>> SpatialPolygons() lies to createSPComment() about whether it has any missing
>> comments that need to be fixed.
>>
>> On the other hand, when I manually run createPolygonComment() on the
>> individual Polygons objects in disaggregate()'s output, I don't even look at
>> the top-level "comment" attribute, so it works just fine.
>>
>> My little toy test polygons, on the other hand, were just not complex enough
>> to end up with partial comments in them, so they also didn't experience this
>> error.
>>
>> I am not familiar enough with the design philosophy behind sp to suggest at
>> what level this issue should be fixed, but I hope this is at least enough info
>> to help those who do know what they're doing get it corrected.
>>
>> - Anne
>>
>>
>> On 04/30/2017 12:18 PM, Anne C. Hanna wrote:
>>> Roger,
>>>
>>> Unfortunately I have a use case for you of createSPComment() not working.
>>>
>>> I tried your new version of disaggregate() on the original test script I sent
>>> you, and it does seem to have fixed all the cases in there.  However, when I
>>> tried it on my actual data, it had the same failure mode as before. The
>>> original dataset has what appear to be correct comments, but the
>>> disaggregate() output is full of NULLs in place of comments for the multi-part
>>> and multi-hole polygons.
>>>
>>> createSPComment() appears to be what's failing, rather than just some logic in
>>> your disaggregate() function --- when I try to run createSPComment() on the
>>> disaggregate() output, I still get the same set of NULLs.  I can fix it if I
>>> run createPolygonsComment() individually on every Polygons object in my
>>> SpatialPolygonsDataFrame.
>>>
>>> I'm not sure what is different between my dataset and the test polygons I was
>>> using earlier, as createSPComment() seems to handle the test shapes just fine
>>> (and converting my SpatialPolygonsDataFrame into just a set of SpatialPolygons
>>> also doesn't help).  But presumably it is better to have createSPComment()
>>> fixed than to have to work around it in disaggregate().  So I guess I'll look
>>> at the code for that and see if I can find what's wrong.
>>>
>>> Just in case you want to see this in action, I've attached a little test
>>> script.  The data set I am working with is available at:
>>>
>>> http://aws.redistricting.state.pa.us/Redistricting/Resources/GISData/2011-Voting-District-Boundary-Shapefiles.zip
>>>
>>>
>>>  - Anne
>>>
>>>
>>> On 04/29/2017 10:06 AM, Roger Bivand wrote:
>>>> On Sat, 29 Apr 2017, Anne C. Hanna wrote:
>>>>
>>>>> Roger,
>>>>>
>>>>> This looks great, and I will try it out ASAP.  I do have one reservation
>>>>> though --- it seems you are using createSPComment() to reconstruct the
>>>>> comments, and I have seen some discussion that that may not be reliable in
>>>>> all cases (e.g. if the initial polygons are wonky in some way).  I don't
>>>>> know a lot about this, but is it possible that it would be preferable to
>>>>> parse and appropriately disaggregate the original comments strings (if they
>>>>> exist), so as to deal slightly more smoothly with such cases (i.e., the
>>>>> polygons would still be wonky, but at least the hole/polygon matching would
>>>>> track with whatever was in the original data)?
>>>>
>>>> Contributions very welcome - this was code moved from the raster package,
>>>> so I
>>>> really have little feeling for why it is necessary.
>>>>
>>>> Please provide references and use cases. GEOS is strict in its treatments of
>>>> geometries, but does work in most cases. People just expressing opinions
>>>> doesn't help, and may well be misleading. It is possible to clean geometries
>>>> too, see the cleangeo package. createPolygonsComment probably isn't
>>>> foolproof,
>>>> but specific reports help, because then it is possible to do something about
>>>> it. sp classes didn't use simple features when written because sf was not
>>>> widely used then - introduced in 2004, when sp was being developed. Using sf
>>>> helps, and rgeos::createPolygonsComment was a work-around from 2010 when
>>>> rgeos
>>>> was written.
>>>>
>>>> Even when you use sf in the sf package, you will still run into invalid
>>>> geometries because the geometries are in fact invalid, the sf package also
>>>> uses GEOS.
>>>>
>>>> Roger
>>>>
>>>>>
>>>>> I also had no success using createSPComment to fix disaggregate()'s output
>>>>> previously, even though my polygons are perfectly non-wonky, so I am perhaps
>>>>> a little more untrusting of it than I should be.  But I'll let you know how
>>>>> this version works with my data.  Thanks for addressing this so quickly!
>>>>>
>>>>> - Anne
>>>>>
>>>>>
>>>>> On 04/28/2017 12:11 PM, Roger Bivand wrote:
>>>>>> I've pushed the fix to my fork:
>>>>>>
>>>>>> https://github.com/rsbivand/sp
>>>>>>
>>>>>> and created a pull request:
>>>>>>
>>>>>> https://github.com/edzer/sp/pull/28
>>>>>>
>>>>>> Only one part of a complicated set of nested if() in disaggregate() was
>>>>>> adding
>>>>>> comments, but in some settings the existing comments survived the
>>>>>> disaggregation. Now the Polygons object comment attribute is re-created for
>>>>>> all Polygons objects. This is version 1.2-6, also including code changes
>>>>>> that
>>>>>> internally affect rgdal and rgeos - you may like to re-install them from
>>>>>> source after installing this sp version (shouldn't matter).
>>>>>>
>>>>>> Roger
>>>>>>
>>>>>> On Fri, 28 Apr 2017, Anne C. Hanna wrote:
>>>>>>
>>>>>>> Hello.  I first posted this issue report on the sp GitHub repo
>>>>>>> (https://github.com/edzer/sp/issues/27) and it was suggested that I
>>>>>>> redirect
>>>>>>> it to here.
>>>>>>>
>>>>>>> I am working with a geographic dataset with complex borders. The data is
>>>>>>> stored as a SpatialPolygonsDataFrame.  Each Polygons object in the data
>>>>>>> frame
>>>>>>> may be composed of multiple disjoint polygons, and each polygon may have
>>>>>>> multiple holes.  I want to disaggregate each of the Polygons objects
>>>>>>> into its
>>>>>>> individual disjoint polygons and construct an adjacency matrix for all the
>>>>>>> disjoint components, and I was using disaggregate() to do this.  However,
>>>>>>> when
>>>>>>> I run gTouches() on the disaggregated data, in order to compute the
>>>>>>> adjacencies, I get a number of warnings like this:
>>>>>>>
>>>>>>> Error in RGEOSBinPredFunc(spgeom1, spgeom2, byid, "rgeos_touches") :
>>>>>>> Polygons
>>>>>>> object missing comment attribute ignoring hole(s).  See function
>>>>>>> createSPComment.
>>>>>>>
>>>>>>> Looking at the Polygons "comment" attributes in the
>>>>>>> SpatialPolygonsDataFrame
>>>>>>> output by disaggregate(), I see that the only comment values are "0"
>>>>>>> (indicating a single polygon with no holes), "0 1" (indicating a single
>>>>>>> polygon with a single hole), and NULL (apparently no comment was written).
>>>>>>> Since I know my dataset contains several Polygons objects which are
>>>>>>> composed
>>>>>>> of multiple disjoint regions, and also several Polygons which contain more
>>>>>>> than one hole, this is not the expected result.  In reading the
>>>>>>> disaggregate()
>>>>>>> code in the sp GitHub repository (specifically, explodePolygons()), I also
>>>>>>> can't see anywhere the comment is being added for the cases where a
>>>>>>> Polygons
>>>>>>> object has more than two parts or more than two holes.  It actually seems
>>>>>>> like
>>>>>>> it's getting carried along almost accidentally in the few cases that do
>>>>>>> get
>>>>>>> comments, and neglected otherwise.
>>>>>>>
>>>>>>> Assuming I'm not failing to understand the code and the desired behavior
>>>>>>> (entirely possible, as I am new at working with this software!), this
>>>>>>> seems
>>>>>>> suboptimal to me.  My dataset is pretty well-behaved (despite its
>>>>>>> complexity),
>>>>>>> so I should be able to fix my issues with judicious application of
>>>>>>> createPolygonsComment.  But I had a heck of a time figuring out what was
>>>>>>> going
>>>>>>> wrong with gTouches, since Polygons comment management appears to be a
>>>>>>> pretty
>>>>>>> obscure field (and createSPComment wasn't working for me, for whatever
>>>>>>> reason).  So it seems like it might be better if disaggregate() just
>>>>>>> parses
>>>>>>> and passes along the comments from its input correctly, or, if it's
>>>>>>> absolutely
>>>>>>> necessary to not create comments, passes nothing and warns clearly in the
>>>>>>> manual that comments and associated hole information are being lost.
>>>>>>> Passing
>>>>>>> along comments in some cases while silently dropping them in others seems
>>>>>>> like
>>>>>>> kind of the worst of both worlds.
>>>>>>>
>>>>>>> I've attached a set of tests I wrote to demonstrate desired/undesired
>>>>>>> behavior: disaggregate_comment_tests.R.  My R version is 3.4.0, my sp
>>>>>>> version
>>>>>>> is 1.2-4, my rgeos version is 0.3-23 (SVN revision 546), and my GEOS
>>>>>>> runtime
>>>>>>> version is 3.5.1-CAPI-1.9.1 r4246.  I am using Debian Release 9.0 with
>>>>>>> kernel
>>>>>>> version 4.9.0-2-amd64.  I hope this is useful; please let me know if
>>>>>>> you need
>>>>>>> more info or if there is a better place to post this issue.
>>>>>>>
>>>>>>> - Anne
>>>>>>>
>>>>>>
>>>>>
>>>>>
>>>>
>>
>>
>

Re: Cross-validation for kriging in R (package geoR): how to include the trend while reestimate is TRUE?

Mon, 05/01/2017 - 13:04
Hi Vera,

I started debugging a bit; the error is in line 192 of the `xvalid()` function, which uses a subfunction `cv.f`:

    res <- as.data.frame(t(apply(matrix(locations.xvalid),
      1, cv.f)))

which then makes the call to `variog()` in lines 125-141.
Here, the error appears because `coords` (created in line 90) seems to be of a different length than `trend` (which is taken from your provided `variog.obj`):

    if (is.null(variog.obj))
        stop("xvalid: when argument reestimate = TRUE an object with the fitted variogram model must be provided in the argument variog.obj ")
    CVvar <- variog(coords = cv.coords, data = cv.data,
        uvec = variog.obj$uvec, trend = variog.obj$trend,
        lambda = variog.obj$lambda, option = variog.obj$output.type,
        estimator.type = variog.obj$estimator.type,
        nugget.tolerance = variog.obj$nugget.tolerance,
        max.dist = max(variog.obj$u), pairs.min = 2,
        bin.cloud = FALSE, direction = variog.obj$direction,
        tolerance = variog.obj$tolerance, unit.angle = "radians",
        messages = FALSE, ...)
    CVmod <- variofit(vario = CVvar, ini.cov.pars = model$cov.pars,
        cov.model = model$cov.model, fix.nugget = model$fix.nugget,
        nugget = model$nugget, fix.kappa = model$fix.kappa,
        kappa = model$kappa, max.dist = model$max.dist,
        minimisation.function = model$minimisation.function,
        weights = model$weights, messages = FALSE,
        ...)

Since we are already debugging fairly deep here, and the issue might be solved quickly by contacting the package authors (they provide the option `reestimate = TRUE`, so they should be able to get it working quickly), I would try that first before inspecting the error in any more detail.
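
In the meantime, one way to sidestep `xvalid()` is to do the leave-one-out loop by hand, so the held-out point is dropped from the coordinates, the response and the trend covariate consistently. This is only a minimal, untested sketch: `geo` (a geodata object) and `covar` (a numeric covariate vector of the same length as the data) are hypothetical stand-ins for your own objects, and the variogram-fit settings are illustrative, not geoR's internal ones.

```r
library(geoR)

# Hand-rolled leave-one-out cross-validation with re-estimation and an
# external trend; hypothetical objects `geo` (geodata) and `covar`.
loo_krige <- function(geo, covar, ini.cov.pars, nugget) {
  n <- nrow(geo$coords)
  pred <- numeric(n)
  for (i in seq_len(n)) {
    # Re-estimate the variogram without point i, dropping it from the
    # coordinates, the response AND the trend covariate.
    v  <- variog(coords = geo$coords[-i, ], data = geo$data[-i],
                 trend = ~ covar[-i], messages = FALSE)
    vf <- variofit(v, ini.cov.pars = ini.cov.pars, nugget = nugget,
                   messages = FALSE)
    # Kriging with external trend: trend.d holds the covariate at the
    # remaining data locations, trend.l at the held-out location.
    kc <- krige.conv(coords = geo$coords[-i, ], data = geo$data[-i],
                     locations = geo$coords[i, , drop = FALSE],
                     krige = krige.control(obj.model = vf,
                                           trend.d = ~ covar[-i],
                                           trend.l = ~ covar[i]))
    pred[i] <- kc$predict
  }
  pred
}
```

The returned vector can then be compared against `geo$data` to get the usual cross-validation residuals.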

Cheers, Patrick

PhD Student at Department of Geography - GIScience group
Friedrich-Schiller-University Jena, Germany
Tel.: +49-3641-9-48973
Web: https://pat-s.github.io

On 1. May 2017, 16:31 +0200, wrote:
>
> http://stackoverflow.com/questions/43520716/cross-validation-for-kriging-in-r-how-to-include-the-trend-while-reestimating-t



Cross-validation for kriging in R (package geoR): how to include the trend while reestimate is TRUE?

Mon, 05/01/2017 - 09:30
Dear all,

I have a question related to the function xvalid (package geoR), which I asked on StackOverflow before but which unfortunately was not answered, probably because it is too specifically related to spatial statistics and this particular function (http://stackoverflow.com/questions/43520716/cross-validation-for-kriging-in-r-how-to-include-the-trend-while-reestimating-t). I hope one of you is able to answer it.

I would like to compute a variogram, fit it, and then perform cross-validation. The function xvalid seems to work nicely for the cross-validation. It works when I set reestimate=TRUE (so it re-estimates the variogram for every point removed from the dataset during cross-validation), and it also works when using a trend. However, it does not seem to work when combining the two. Here is a reproducible example using the Meuse dataset:

library(geoR)
library(sp)
data(meuse) # import data
coordinates(meuse) = ~x+y # make spatialpointsdataframe
meuse@proj4string <- CRS("+init=epsg:28992") # add projection
meuse_geo <- as.geodata(meuse) # create object of class geodata for geoR compatibility
meuse_geo$data <- meuse@data # attach all data (incl. covariates) to meuse_geo
meuse_vario <- variog(geodata=meuse_geo, data=meuse_geo$data$lead, trend= ~meuse_geo$data$elev) # variogram
meuse_vfit <- variofit(meuse_vario, nugget=0.1, fix.nugget=T) # fit
# cross-validation works fine:
xvalid(geodata=meuse_geo, data=meuse_geo$data$lead, model=meuse_vfit, variog.obj = meuse_vario, reestimate=F)
# cross-validation does not work when reestimate = T:
xvalid(geodata=meuse_geo, data=meuse_geo$data$lead, model=meuse_vfit, variog.obj = meuse_vario, reestimate=T)

The error I get is:

Error in variog(coords = cv.coords, data = cv.data, uvec = variog.obj$uvec,  : coords and trend have incompatible sizes

It seems to remove the point from the dataset during cross-validation, but it does not seem to remove the point from the covariates/trend data. Any ideas on solving this, or work-arounds? Thanks a lot in advance.

Best, Vera

---

V.M. (Vera) van Zoest, MSc | PhD candidate |
University of Twente<http://www.utwente.nl/> | Faculty ITC<http://www.itc.nl/> | Department Earth Observation Science (EOS)<https://www.itc.nl/EOS> |
ITC Building, room 2-038 | T: +31 (0)53 - 489 4412 | [hidden email]<mailto:[hidden email]> |

Study Geoinformatics: www.itc.nl/geoinformatics<http://www.itc.nl/geoinformatics>




authorized model formulas for spatio-temporal models, please

Mon, 05/01/2017 - 09:06
Hello everyone.


Where would I find the authorized model formulas for spatio-temporal models, please?


Thanks,

Erin



Erin M. Hodgess
Associate Professor
Department of Mathematics and Statistics
University of Houston - Downtown
mailto: [hidden email]



Re: sp::disaggregate() does not seem to add the necessary "comment" attribute to identify holes if Polygons does not contain exactly 1 polygon with 0 or 1 holes

Mon, 05/01/2017 - 04:23
On Sun, 30 Apr 2017, Anne C. Hanna wrote:

> Okay, I think I've tracked it down.  This is... absurdly complicated and I
> don't know what the actual correct resolution is, but I'm pretty sure that at
> least I know what's going wrong.

Good, and thanks for staying with this. I've pushed a fix to my github
repo, and pull request #29 to edzer/sp.

Note that this also works out of the box:

library(sf)
sf <- st_read(dsn = "2011 Voting District Boundary Shapefiles",
              layer = "MCDS", stringsAsFactors = FALSE)
sfd <- st_cast(sf, "POLYGON")

However, sf::st_touches() is for now a bit less intuitive than the spdep
"nb" object generated by spdep::poly2nb(), which also provides Queen
(touching at one or more boundary points) and Rook (touching at more than
one boundary point) contiguity, and sf::st_touches() is currently about
seven times slower than spdep::poly2nb(). sf::st_touches() could be sped up
using an STR tree (spdep::poly2nb() can do this, but in any case only looks
at candidate neighbours with intersecting bounding boxes). So:

sp_sfd <- as(sfd, "Spatial")
library(spdep)
sp_nb <- poly2nb(sp_sfd)

seems a reasonable workflow to find the adjacencies under Queen/Rook and
snapping control.
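
For the adjacency matrix that started this thread, the "nb" object from the workflow above can be converted directly. A short sketch under the same object names (`sp_sfd`, `sp_nb`); `style = "B"` requests binary 0/1 weights rather than spdep's row-standardised default:

```r
library(spdep)

# Queen contiguity is poly2nb()'s default; queen = FALSE gives Rook.
sp_nb      <- poly2nb(sp_sfd)
sp_nb_rook <- poly2nb(sp_sfd, queen = FALSE)

# Binary adjacency matrix over the disaggregated parts; zero.policy = TRUE
# tolerates islands with no neighbours instead of stopping with an error.
adj <- nb2mat(sp_nb, style = "B", zero.policy = TRUE)
```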

Please try out the disaggregate fix and report back - and consider
shifting to sf.

Roger

>
> Basically it comes down to the fact that createSPComment() does not actually
> check whether each individual polygon in the SpatialPolygons object has a
> comment (much less a correct comment).  Instead, it looks at a top-level
> "comment" attribute.  If that top-level comment attribute (i.e. sf@comment for
> my SpatialPolygonsDataFrame object stored as "sf") is "TRUE", then
> createSPComment() does nothing.
>
> And here's where the second layer of the problem comes in for my dataset ---
> my original data, pre-disaggregation, has complete and correct comments.  Some
> of those comments (for objects where it's a single polygon with 0 or 1 holes)
> are being retained by the basic disaggregation algorithm, while others are
> not.  Your patch attempts to fill in the holes by running createSPComment() on
> the final results.  But... the thing you are running createSPComment() on is
> SpatialPolygons(p), where p is the list of disaggregated polygons.  And the
> SpatialPolygons() constructor looks at the list of the polygons you feed it,
> and if *any* of those polygons have comments, it sets the top-level "comment"
> attribute of its output to "TRUE", even if other polygons are missing
> comments.  (And it also doesn't check the comments it gets for correctness.)
> So the fact that some polygon comments are retained in p means that
> SpatialPolygons() lies to createSPComment() about whether it has any missing
> comments that need to be fixed.
>
> On the other hand, when I manually run createPolygonsComment() on the
> individual Polygons objects in disaggregate()'s output, I don't even look at
> the top-level "comment" attribute, so it works just fine.
>
> My little toy test polygons, on the other hand, were just not complex enough
> to end up with partial comments in them, so they also didn't experience this
> error.
>
> I am not familiar enough with the design philosophy behind sp to suggest at
> what level this issue should be fixed, but I hope this is at least enough info
> to help those who do know what they're doing get it corrected.
>
> - Anne
>
>
> On 04/30/2017 12:18 PM, Anne C. Hanna wrote:
>> Roger,
>>
>> Unfortunately I have a use case for you of createSPComment() not working.
>>
>> I tried your new version of disaggregate() on the original test script I sent
>> you, and it does seem to have fixed all the cases in there.  However, when I
>> tried it on my actual data, it had the same failure mode as before. The
>> original dataset has what appear to be correct comments, but the
>> disaggregate() output is full of NULLs in place of comments for the multi-part
>> and multi-hole polygons.
>>
>> createSPComment() appears to be what's failing, rather than just some logic in
>> your disaggregate() function --- when I try to run createSPComment() on the
>> disaggregate() output, I still get the same set of NULLs.  I can fix it if I
>> run createPolygonsComment() individually on every Polygons object in my
>> SpatialPolygonsDataFrame.
>>
>> I'm not sure what is different between my dataset and the test polygons I was
>> using earlier, as createSPComment() seems to handle the test shapes just fine
>> (and converting my SpatialPolygonsDataFrame into just a set of SpatialPolygons
>> also doesn't help).  But presumably it is better to have createSPComment()
>> fixed than to have to work around it in disaggregate().  So I guess I'll look
>> at the code for that and see if I can find what's wrong.
>>
>> Just in case you want to see this in action, I've attached a little test
>> script.  The data set I am working with is available at:
>>
>> http://aws.redistricting.state.pa.us/Redistricting/Resources/GISData/2011-Voting-District-Boundary-Shapefiles.zip
>>
>>  - Anne
>>
>>
>> On 04/29/2017 10:06 AM, Roger Bivand wrote:
>>> On Sat, 29 Apr 2017, Anne C. Hanna wrote:
>>>
>>>> Roger,
>>>>
>>>> This looks great, and I will try it out ASAP.  I do have one reservation
>>>> though --- it seems you are using createSPComment() to reconstruct the
>>>> comments, and I have seen some discussion that that may not be reliable in
>>>> all cases (e.g. if the initial polygons are wonky in some way).  I don't
>>>> know a lot about this, but is it possible that it would be preferable to
>>>> parse and appropriately disaggregate the original comments strings (if they
>>>> exist), so as to deal slightly more smoothly with such cases (i.e., the
>>>> polygons would still be wonky, but at least the hole/polygon matching would
>>>> track with whatever was in the original data)?
>>>
>>> Contributions very welcome - this was code moved from the raster package, so I
>>> really have little feeling for why it is necessary.
>>>
>>> Please provide references and use cases. GEOS is strict in its treatments of
>>> geometries, but does work in most cases. People just expressing opinions
>>> doesn't help, and may well be misleading. It is possible to clean geometries
>>> too, see the cleangeo package. createPolygonsComment probably isn't foolproof,
>>> but specific reports help, because then it is possible to do something about
>>> it. sp classes didn't use simple features when written because sf was not
>>> widely used then - introduced in 2004, when sp was being developed. Using sf
>>> helps, and rgeos::createPolygonsComment was a work-around from 2010 when rgeos
>>> was written.
>>>
>>> Even when you use sf in the sf package, you will still run into invalid
>>> geometries because the geometries are in fact invalid, the sf package also
>>> uses GEOS.
>>>
>>> Roger
>>>
>>>>
>>>> I also had no success using createSPComment to fix disaggregate()'s output
>>>> previously, even though my polygons are perfectly non-wonky, so I am perhaps
>>>> a little more untrusting of it than I should be.  But I'll let you know how
>>>> this version works with my data.  Thanks for addressing this so quickly!
>>>>
>>>> - Anne
>>>>
>>>>
>>>> On 04/28/2017 12:11 PM, Roger Bivand wrote:
>>>>> I've pushed the fix to my fork:
>>>>>
>>>>> https://github.com/rsbivand/sp
>>>>>
>>>>> and created a pull request:
>>>>>
>>>>> https://github.com/edzer/sp/pull/28
>>>>>
>>>>> Only one part of a complicated set of nested if() in disaggregate() was adding
>>>>> comments, but in some settings the existing comments survived the
>>>>> disaggregation. Now the Polygons object comment attribute is re-created for
>>>>> all Polygons objects. This is version 1.2-6, also including code changes that
>>>>> internally affect rgdal and rgeos - you may like to re-install them from
>>>>> source after installing this sp version (shouldn't matter).
>>>>>
>>>>> Roger
>>>>>
>>>>> On Fri, 28 Apr 2017, Anne C. Hanna wrote:
>>>>>
>>>>>> Hello.  I first posted this issue report on the sp GitHub repo
>>>>>> (https://github.com/edzer/sp/issues/27) and it was suggested that I redirect
>>>>>> it to here.
>>>>>>
>>>>>> I am working with a geographic dataset with complex borders. The data is
>>>>>> stored as a SpatialPolygonsDataFrame.  Each Polygons object in the data frame
>>>>>> may be composed of multiple disjoint polygons, and each polygon may have
>>>>>> multiple holes.  I want to disaggregate each of the Polygons objects into its
>>>>>> individual disjoint polygons and construct an adjacency matrix for all the
>>>>>> disjoint components, and I was using disaggregate() to do this.  However,
>>>>>> when
>>>>>> I run gTouches() on the disaggregated data, in order to compute the
>>>>>> adjacencies, I get a number of warnings like this:
>>>>>>
>>>>>> Error in RGEOSBinPredFunc(spgeom1, spgeom2, byid, "rgeos_touches") : Polygons
>>>>>> object missing comment attribute ignoring hole(s).  See function
>>>>>> createSPComment.
>>>>>>
>>>>>> Looking at the Polygons "comment" attributes in the SpatialPolygonsDataFrame
>>>>>> output by disaggregate(), I see that the only comment values are "0"
>>>>>> (indicating a single polygon with no holes), "0 1" (indicating a single
>>>>>> polygon with a single hole), and NULL (apparently no comment was written).
>>>>>> Since I know my dataset contains several Polygons objects which are composed
>>>>>> of multiple disjoint regions, and also several Polygons which contain more
>>>>>> than one hole, this is not the expected result.  In reading the
>>>>>> disaggregate()
>>>>>> code in the sp GitHub repository (specifically, explodePolygons()), I also
>>>>>> can't see anywhere the comment is being added for the cases where a Polygons
>>>>>> object has more than two parts or more than two holes.  It actually seems
>>>>>> like
>>>>>> it's getting carried along almost accidentally in the few cases that do get
>>>>>> comments, and neglected otherwise.
>>>>>>
>>>>>> Assuming I'm not failing to understand the code and the desired behavior
>>>>>> (entirely possible, as I am new at working with this software!), this seems
>>>>>> suboptimal to me.  My dataset is pretty well-behaved (despite its
>>>>>> complexity),
>>>>>> so I should be able to fix my issues with judicious application of
>>>>>> createPolygonsComment.  But I had a heck of a time figuring out what was
>>>>>> going
>>>>>> wrong with gTouches, since Polygons comment management appears to be a pretty
>>>>>> obscure field (and createSPComment wasn't working for me, for whatever
>>>>>> reason).  So it seems like it might be better if disaggregate() just parses
>>>>>> and passes along the comments from its input correctly, or, if it's
>>>>>> absolutely
>>>>>> necessary to not create comments, passes nothing and warns clearly in the
>>>>>> manual that comments and associated hole information are being lost.  Passing
>>>>>> along comments in some cases while silently dropping them in others seems
>>>>>> like
>>>>>> kind of the worst of both worlds.
>>>>>>
>>>>>> I've attached a set of tests I wrote to demonstrate desired/undesired
>>>>>> behavior: disaggregate_comment_tests.R.  My R version is 3.4.0, my sp version
>>>>>> is 1.2-4, my rgeos version is 0.3-23 (SVN revision 546), and my GEOS runtime
>>>>>> version is 3.5.1-CAPI-1.9.1 r4246.  I am using Debian Release 9.0 with kernel
>>>>>> version 4.9.0-2-amd64.  I hope this is useful; please let me know if you need
>>>>>> more info or if there is a better place to post this issue.
>>>>>>
>>>>>> - Anne
>>>>>>
>>>>>
>>>>
>>>>
>>>
>
>
--
Roger Bivand
Department of Economics, Norwegian School of Economics,
Helleveien 30, N-5045 Bergen, Norway.
voice: +47 55 95 93 55; e-mail: [hidden email]
Editor-in-Chief of The R Journal, https://journal.r-project.org/index.html
http://orcid.org/0000-0003-2392-6140
https://scholar.google.no/citations?user=AWeghB0AAAAJ&hl=en


Re: sp::disaggregate() does not seem to add the necessary "comment" attribute to identify holes if Polygons does not contain exactly 1 polygon with 0 or 1 holes

Sun, 04/30/2017 - 14:35
Okay, I think I've tracked it down.  This is... absurdly complicated and I
don't know what the actual correct resolution is, but I'm pretty sure that at
least I know what's going wrong.

Basically it comes down to the fact that createSPComment() does not actually
check whether each individual polygon in the SpatialPolygons object has a
comment (much less a correct comment).  Instead, it looks at a top-level
"comment" attribute.  If that top-level comment attribute (i.e. sf@comment for
my SpatialPolygonsDataFrame object stored as "sf") is "TRUE", then
createSPComment() does nothing.

And here's where the second layer of the problem comes in for my dataset ---
my original data, pre-disaggregation, has complete and correct comments.  Some
of those comments (for objects where it's a single polygon with 0 or 1 holes)
are being retained by the basic disaggregation algorithm, while others are
not.  Your patch attempts to fill in the holes by running createSPComment() on
the final results.  But... the thing you are running createSPComment() on is
SpatialPolygons(p), where p is the list of disaggregated polygons.  And the
SpatialPolygons() constructor looks at the list of the polygons you feed it,
and if *any* of those polygons have comments, it sets the top-level "comment"
attribute of its output to "TRUE", even if other polygons are missing
comments.  (And it also doesn't check the comments it gets for correctness.)
So the fact that some polygon comments are retained in p means that
SpatialPolygons() lies to createSPComment() about whether it has any missing
comments that need to be fixed.

On the other hand, when I manually run createPolygonsComment() on the
individual Polygons objects in disaggregate()'s output, I don't even look at
the top-level "comment" attribute, so it works just fine.
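
For anyone hitting the same wall, the manual per-object fix described in the previous paragraph can be sketched like this (an untested outline; `spdf` is a hypothetical SpatialPolygonsDataFrame standing in for your own object):

```r
library(rgeos)

# Rebuild the GEOS hole-assignment comment on every Polygons object
# individually, ignoring the top-level "comment" attribute entirely.
for (i in seq_along(spdf@polygons)) {
  comment(spdf@polygons[[i]]) <- createPolygonsComment(spdf@polygons[[i]])
}

# Alternatively, clearing the top-level flag first should force
# createSPComment() to rebuild all the comments itself, given the
# short-circuit behaviour described above.
comment(spdf) <- NULL
spdf <- createSPComment(spdf)
```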

My little toy test polygons, on the other hand, were just not complex enough
to end up with partial comments in them, so they also didn't experience this
error.

I am not familiar enough with the design philosophy behind sp to suggest at
what level this issue should be fixed, but I hope this is at least enough info
to help those who do know what they're doing get it corrected.

 - Anne


On 04/30/2017 12:18 PM, Anne C. Hanna wrote:
> Roger,
>
> Unfortunately I have a use case for you of createSPComment() not working.
>
> I tried your new version of disaggregate() on the original test script I sent
> you, and it does seem to have fixed all the cases in there.  However, when I
> tried it on my actual data, it had the same failure mode as before. The
> original dataset has what appear to be correct comments, but the
> disaggregate() output is full of NULLs in place of comments for the multi-part
> and multi-hole polygons.
>
> createSPComment() appears to be what's failing, rather than just some logic in
> your disaggregate() function --- when I try to run createSPComment() on the
> disaggregate() output, I still get the same set of NULLs.  I can fix it if I
> run createPolygonsComment() individually on every Polygons object in my
> SpatialPolygonsDataFrame.
>
> I'm not sure what is different between my dataset and the test polygons I was
> using earlier, as createSPComment() seems to handle the test shapes just fine
> (and converting my SpatialPolygonsDataFrame into just a set of SpatialPolygons
> also doesn't help).  But presumably it is better to have createSPComment()
> fixed than to have to work around it in disaggregate().  So I guess I'll look
> at the code for that and see if I can find what's wrong.
>
> Just in case you want to see this in action, I've attached a little test
> script.  The data set I am working with is available at:
>
> http://aws.redistricting.state.pa.us/Redistricting/Resources/GISData/2011-Voting-District-Boundary-Shapefiles.zip
>
>  - Anne
>
>
> On 04/29/2017 10:06 AM, Roger Bivand wrote:
>> On Sat, 29 Apr 2017, Anne C. Hanna wrote:
>>
>>> Roger,
>>>
>>> This looks great, and I will try it out ASAP.  I do have one reservation
>>> though --- it seems you are using createSPComment() to reconstruct the
>>> comments, and I have seen some discussion that that may not be reliable in
>>> all cases (e.g. if the initial polygons are wonky in some way).  I don't
>>> know a lot about this, but is it possible that it would be preferable to
>>> parse and appropriately disaggregate the original comments strings (if they
>>> exist), so as to deal slightly more smoothly with such cases (i.e., the
>>> polygons would still be wonky, but at least the hole/polygon matching would
>>> track with whatever was in the original data)?
>>
>> Contributions very welcome - this was code moved from the raster package, so I
>> really have little feeling for why it is necessary.
>>
>> Please provide references and use cases. GEOS is strict in its treatments of
>> geometries, but does work in most cases. People just expressing opinions
>> doesn't help, and may well be misleading. It is possible to clean geometries
>> too, see the cleangeo package. createPolygonsComment probably isn't foolproof,
>> but specific reports help, because then it is possible to do something about
>> it. sp classes didn't use simple features when written because sf was not
>> widely used then - introduced in 2004, when sp was being developed. Using sf
>> helps, and rgeos::createPolygonsComment was a work-around from 2010 when rgeos
>> was written.
>>
>> Even when you use sf in the sf package, you will still run into invalid
>> geometries because the geometries are in fact invalid, the sf package also
>> uses GEOS.
>>
>> Roger
>>
>>>
>>> I also had no success using createSPComment to fix disaggregate()'s output
>>> previously, even though my polygons are perfectly non-wonky, so I am perhaps
>>> a little more untrusting of it than I should be.  But I'll let you know how
>>> this version works with my data.  Thanks for addressing this so quickly!
>>>
>>> - Anne
>>>
>>>
>>> On 04/28/2017 12:11 PM, Roger Bivand wrote:
>>>> I've pushed the fix to my fork:
>>>>
>>>> https://github.com/rsbivand/sp
>>>>
>>>> and created a pull request:
>>>>
>>>> https://github.com/edzer/sp/pull/28
>>>>
>>>> Only one part of a complicated set of nested if() in disaggregate() was adding
>>>> comments, but in some settings the existing comments survived the
>>>> disaggregation. Now the Polygons object comment attribute is re-created for
>>>> all Polygons objects. This is version 1.2-6, also including code changes that
>>>> internally affect rgdal and rgeos - you may like to re-install them from
>>>> source after installing this sp version (shouldn't matter).
>>>>
>>>> Roger
>>>>
>>>> On Fri, 28 Apr 2017, Anne C. Hanna wrote:
>>>>
>>>>> Hello.  I first posted this issue report on the sp GitHub repo
>>>>> (https://github.com/edzer/sp/issues/27) and it was suggested that I redirect
>>>>> it to here.
>>>>>
>>>>> I am working with a geographic dataset with complex borders. The data is
>>>>> stored as a SpatialPolygonsDataFrame.  Each Polygons object in the data frame
>>>>> may be composed of multiple disjoint polygons, and each polygon may have
>>>>> multiple holes.  I want to disaggregate each of the Polygons objects into its
>>>>> individual disjoint polygons and construct an adjacency matrix for all the
>>>>> disjoint components, and I was using disaggregate() to do this.  However,
>>>>> when
>>>>> I run gTouches() on the disaggregated data, in order to compute the
>>>>> adjacencies, I get a number of warnings like this:
>>>>>
>>>>> Error in RGEOSBinPredFunc(spgeom1, spgeom2, byid, "rgeos_touches") : Polygons
>>>>> object missing comment attribute ignoring hole(s).  See function
>>>>> createSPComment.
>>>>>
>>>>> Looking at the Polygons "comment" attributes in the SpatialPolygonsDataFrame
>>>>> output by disaggregate(), I see that the only comment values are "0"
>>>>> (indicating a single polygon with no holes), "0 1" (indicating a single
>>>>> polygon with a single hole), and NULL (apparently no comment was written).
>>>>> Since I know my dataset contains several Polygons objects which are composed
>>>>> of multiple disjoint regions, and also several Polygons which contain more
>>>>> than one hole, this is not the expected result.  In reading the
>>>>> disaggregate()
>>>>> code in the sp GitHub repository (specifically, explodePolygons()), I also
>>>>> can't see anywhere the comment is being added for the cases where a Polygons
>>>>> object has more than two parts or more than two holes.  It actually seems
>>>>> like
>>>>> it's getting carried along almost accidentally in the few cases that do get
>>>>> comments, and neglected otherwise.
>>>>>
>>>>> Assuming I'm not failing to understand the code and the desired behavior
>>>>> (entirely possible, as I am new at working with this software!), this seems
>>>>> suboptimal to me.  My dataset is pretty well-behaved (despite its
>>>>> complexity),
>>>>> so I should be able to fix my issues with judicious application of
>>>>> createPolygonsComment.  But I had a heck of a time figuring out what was
>>>>> going
>>>>> wrong with gTouches, since Polygons comment management appears to be a pretty
>>>>> obscure field (and createSPComment wasn't working for me, for whatever
>>>>> reason).  So it seems like it might be better if disaggregate() just parses
>>>>> and passes along the comments from its input correctly, or, if it's
>>>>> absolutely
>>>>> necessary to not create comments, passes nothing and warns clearly in the
>>>>> manual that comments and associated hole information are being lost.  Passing
>>>>> along comments in some cases while silently dropping them in others seems
>>>>> like
>>>>> kind of the worst of both worlds.
>>>>>
>>>>> I've attached a set of tests I wrote to demonstrate desired/undesired
>>>>> behavior: disaggregate_comment_tests.R.  My R version is 3.4.0, my sp version
>>>>> is 1.2-4, my rgeos version is 0.3-23 (SVN revision 546), and my GEOS runtime
>>>>> version is 3.5.1-CAPI-1.9.1 r4246.  I am using Debian Release 9.0 with kernel
>>>>> version 4.9.0-2-amd64.  I hope this is useful; please let me know if you need
>>>>> more info or if there is a better place to post this issue.
>>>>>
>>>>> - Anne
>>>>>
>>>>
>>>
>>>
>>

Re: sp::disaggregate() does not seem to add the necessary "comment" attribute to identify holes if Polygons does not contain exactly 1 polygon with 0 or 1 holes

Sun, 04/30/2017 - 11:18
Roger,

Unfortunately I have a use case for you of createSPComment() not working.

I tried your new version of disaggregate() on the original test script I sent
you, and it does seem to have fixed all the cases in there.  However, when I
tried it on my actual data, it had the same failure mode as before. The
original dataset has what appear to be correct comments, but the
disaggregate() output is full of NULLs in place of comments for the multi-part
and multi-hole polygons.

createSPComment() appears to be what's failing, rather than just some logic in
your disaggregate() function --- when I try to run createSPComment() on the
disaggregate() output, I still get the same set of NULLs.  I can fix it if I
run createPolygonsComment() individually on every Polygons object in my
SpatialPolygonsDataFrame.

I'm not sure what is different between my dataset and the test polygons I was
using earlier, as createSPComment() seems to handle the test shapes just fine
(and converting my SpatialPolygonsDataFrame into just a set of SpatialPolygons
also doesn't help).  But presumably it is better to have createSPComment()
fixed than to have to work around it in disaggregate().  So I guess I'll look
at the code for that and see if I can find what's wrong.

Just in case you want to see this in action, I've attached a little test
script.  The data set I am working with is available at:

http://aws.redistricting.state.pa.us/Redistricting/Resources/GISData/2011-Voting-District-Boundary-Shapefiles.zip

 - Anne


On 04/29/2017 10:06 AM, Roger Bivand wrote:
> On Sat, 29 Apr 2017, Anne C. Hanna wrote:
>
>> Roger,
>>
>> This looks great, and I will try it out ASAP.  I do have one reservation
>> though --- it seems you are using createSPComment() to reconstruct the
>> comments, and I have seen some discussion that that may not be reliable in
>> all cases (e.g. if the initial polygons are wonky in some way).  I don't
>> know a lot about this, but is it possible that it would be preferable to
>> parse and appropriately disaggregate the original comments strings (if they
>> exist), so as to deal slightly more smoothly with such cases (i.e., the
>> polygons would still be wonky, but at least the hole/polygon matching would
>> track with whatever was in the original data)?
>
> Contributions very welcome - this was code moved from the raster package, so I
> really have little feeling for why it is necessary.
>
> Please provide references and use cases. GEOS is strict in its treatments of
> geometries, but does work in most cases. People just expressing opinions
> doesn't help, and may well be misleading. It is possible to clean geometries
> too, see the cleangeo package. createPolygonsComment probably isn't foolproof,
> but specific reports help, because then it is possible to do something about
> it. sp classes didn't use simple features when written because sf was not
> widely used then - introduced in 2004, when sp was being developed. Using sf
> helps, and rgeos::createPolygonsComment was a work-around from 2010 when rgeos
> was written.
>
> Even when you use sf in the sf package, you will still run into invalid
> geometries because the geometries are in fact invalid, the sf package also
> uses GEOS.
>
> Roger
>
>>
>> I also had no success using createSPComment to fix disaggregate()'s output
>> previously, even though my polygons are perfectly non-wonky, so I am perhaps
>> a little more untrusting of it than I should be.  But I'll let you know how
>> this version works with my data.  Thanks for addressing this so quickly!
>>
>> - Anne
>>
>>
>> On 04/28/2017 12:11 PM, Roger Bivand wrote:
>>> I've pushed the fix to my fork:
>>>
>>> https://github.com/rsbivand/sp
>>>
>>> and created a pull request:
>>>
>>> https://github.com/edzer/sp/pull/28
>>>
>>> Only one part of a complicated set of nested if() in disaggregate() was adding
>>> comments, but in some settings the existing comments survived the
>>> disaggregation. Now the Polygons object comment attribute is re-created for
>>> all Polygons objects. This is version 1.2-6, also including code changes that
>>> internally affect rgdal and rgeos - you may like to re-install them from
>>> source after installing this sp version (shouldn't matter).
>>>
>>> Roger
>>>
>>> On Fri, 28 Apr 2017, Anne C. Hanna wrote:
>>>
>>>> Hello.  I first posted this issue report on the sp GitHub repo
>>>> (https://github.com/edzer/sp/issues/27) and it was suggested that I redirect
>>>> it to here.
>>>>
>>>> I am working with a geographic dataset with complex borders. The data is
>>>> stored as a SpatialPolygonsDataFrame.  Each Polygons object in the data frame
>>>> may be composed of multiple disjoint polygons, and each polygon may have
>>>> multiple holes.  I want to disaggregate each of the Polygons objects into its
>>>> individual disjoint polygons and construct an adjacency matrix for all the
>>>> disjoint components, and I was using disaggregate() to do this.  However,
>>>> when
>>>> I run gTouches() on the disaggregated data, in order to compute the
>>>> adjacencies, I get a number of warnings like this:
>>>>
>>>> Error in RGEOSBinPredFunc(spgeom1, spgeom2, byid, "rgeos_touches") : Polygons
>>>> object missing comment attribute ignoring hole(s).  See function
>>>> createSPComment.
>>>>
>>>> Looking at the Polygons "comment" attributes in the SpatialPolygonsDataFrame
>>>> output by disaggregate(), I see that the only comment values are "0"
>>>> (indicating a single polygon with no holes), "0 1" (indicating a single
>>>> polygon with a single hole), and NULL (apparently no comment was written).
>>>> Since I know my dataset contains several Polygons objects which are composed
>>>> of multiple disjoint regions, and also several Polygons which contain more
>>>> than one hole, this is not the expected result.  In reading the
>>>> disaggregate()
>>>> code in the sp GitHub repository (specifically, explodePolygons()), I also
>>>> can't see anywhere the comment is being added for the cases where a Polygons
>>>> object has more than two parts or more than two holes.  It actually seems
>>>> like
>>>> it's getting carried along almost accidentally in the few cases that do get
>>>> comments, and neglected otherwise.
>>>>
>>>> Assuming I'm not failing to understand the code and the desired behavior
>>>> (entirely possible, as I am new at working with this software!), this seems
>>>> suboptimal to me.  My dataset is pretty well-behaved (despite its
>>>> complexity),
>>>> so I should be able to fix my issues with judicious application of
>>>> createPolygonsComment.  But I had a heck of a time figuring out what was
>>>> going
>>>> wrong with gTouches, since Polygons comment management appears to be a pretty
>>>> obscure field (and createSPComment wasn't working for me, for whatever
>>>> reason).  So it seems like it might be better if disaggregate() just parses
>>>> and passes along the comments from its input correctly, or, if it's
>>>> absolutely
>>>> necessary to not create comments, passes nothing and warns clearly in the
>>>> manual that comments and associated hole information are being lost.  Passing
>>>> along comments in some cases while silently dropping them in others seems
>>>> like
>>>> kind of the worst of both worlds.
>>>>
>>>> I've attached a set of tests I wrote to demonstrate desired/undesired
>>>> behavior: disaggregate_comment_tests.R.  My R version is 3.4.0, my sp version
>>>> is 1.2-4, my rgeos version is 0.3-23 (SVN revision 546), and my GEOS runtime
>>>> version is 3.5.1-CAPI-1.9.1 r4246.  I am using Debian Release 9.0 with kernel
>>>> version 4.9.0-2-amd64.  I hope this is useful; please let me know if you need
>>>> more info or if there is a better place to post this issue.
>>>>
>>>> - Anne
>>>>
>>>
>>
>>
>
_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo
more_disaggregate_comment_tests.R (1K) Download Attachment

Re: sp::disaggregate() does not seem to add the necessary "comment" attribute to identify holes if Polygons does not contain exactly 1 polygon with 0 or 1 holes

Sat, 04/29/2017 - 09:06
On Sat, 29 Apr 2017, Anne C. Hanna wrote:

> Roger,
>
> This looks great, and I will try it out ASAP.  I do have one reservation
> though --- it seems you are using createSPComment() to reconstruct the
> comments, and I have seen some discussion that that may not be reliable
> in all cases (e.g. if the initial polygons are wonky in some way).  I
> don't know a lot about this, but is it possible that it would be
> preferable to parse and appropriately disaggregate the original comments
> strings (if they exist), so as to deal slightly more smoothly with such
> cases (i.e., the polygons would still be wonky, but at least the
> hole/polygon matching would track with whatever was in the original
> data)?
Contributions very welcome - this was code moved from the raster package,
so I really have little feeling for why it is necessary.

Please provide references and use cases. GEOS is strict in its treatment
of geometries, but does work in most cases. People just expressing
opinions doesn't help, and may well be misleading. It is possible to clean
geometries too; see the cleangeo package. createPolygonsComment probably
isn't foolproof, but specific reports help, because then it is possible to
do something about it. sp classes didn't use simple features when written
because the simple features standard was not widely used then - it was
introduced in 2004, when sp was being developed. Using sf helps, and
rgeos::createPolygonsComment was a work-around from 2010 when rgeos was
written.

Even when you use simple features via the sf package, you will still run
into invalid geometries when the geometries are in fact invalid; the sf
package also uses GEOS.
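The cleaning route mentioned above can be sketched in a few lines. This is an illustration only, assuming a SpatialPolygonsDataFrame named `spdf` that you already have loaded; `gIsValid()` and `createSPComment()` are from rgeos, `clgeo_Clean()` from cleangeo:

```r
# Sketch, not a guaranteed fix: validate the geometries, optionally clean
# them, then rebuild the hole-ownership comments via GEOS.
library(rgeos)     # gIsValid(), createSPComment()
library(cleangeo)  # clgeo_Clean()

# `spdf` is assumed to be your SpatialPolygonsDataFrame
reasons <- gIsValid(spdf, byid = TRUE, reason = TRUE)
if (any(reasons != "Valid Geometry")) {
  spdf <- clgeo_Clean(spdf)      # attempt automatic geometry repair
}
spdf <- createSPComment(spdf)    # re-derive the "0 1 ..." comment strings
```

If cleaning changes ring order, the rebuilt comments reflect the cleaned geometry, which is usually what downstream rgeos predicates need.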

Roger

>
> I also had no success using createSPComment to fix disaggregate()'s
> output previously, even though my polygons are perfectly non-wonky, so I
> am perhaps a little more untrusting of it than I should be.  But I'll
> let you know how this version works with my data.  Thanks for addressing
> this so quickly!
>
> - Anne
>
>
> On 04/28/2017 12:11 PM, Roger Bivand wrote:
>> I've pushed the fix to my fork:
>>
>> https://github.com/rsbivand/sp
>>
>> and created a pull request:
>>
>> https://github.com/edzer/sp/pull/28
>>
>> Only one part of a complicated set of nested if() in disaggregate() was adding
>> comments, but in some settings the existing comments survived the
>> disaggregation. Now the Polygons object comment attribute is re-created for
>> all Polygons objects. This is version 1.2-6, also including code changes that
>> internally affect rgdal and rgeos - you may like to re-install them from
>> source after installing this sp version (shouldn't matter).
>>
>> Roger
>>
>> On Fri, 28 Apr 2017, Anne C. Hanna wrote:
>>
>>> Hello.  I first posted this issue report on the sp GitHub repo
>>> (https://github.com/edzer/sp/issues/27) and it was suggested that I redirect
>>> it to here.
>>>
>>> I am working with a geographic dataset with complex borders. The data is
>>> stored as a SpatialPolygonsDataFrame.  Each Polygons object in the data frame
>>> may be composed of multiple disjoint polygons, and each polygon may have
>>> multiple holes.  I want to disaggregate each of the Polygons objects into its
>>> individual disjoint polygons and construct an adjacency matrix for all the
>>> disjoint components, and I was using disaggregate() to do this.  However, when
>>> I run gTouches() on the disaggregated data, in order to compute the
>>> adjacencies, I get a number of warnings like this:
>>>
>>> Error in RGEOSBinPredFunc(spgeom1, spgeom2, byid, "rgeos_touches") : Polygons
>>> object missing comment attribute ignoring hole(s).  See function
>>> createSPComment.
>>>
>>> Looking at the Polygons "comment" attributes in the SpatialPolygonsDataFrame
>>> output by disaggregate(), I see that the only comment values are "0"
>>> (indicating a single polygon with no holes), "0 1" (indicating a single
>>> polygon with a single hole), and NULL (apparently no comment was written).
>>> Since I know my dataset contains several Polygons objects which are composed
>>> of multiple disjoint regions, and also several Polygons which contain more
>>> than one hole, this is not the expected result.  In reading the disaggregate()
>>> code in the sp GitHub repository (specifically, explodePolygons()), I also
>>> can't see anywhere the comment is being added for the cases where a Polygons
>>> object has more than two parts or more than two holes.  It actually seems like
>>> it's getting carried along almost accidentally in the few cases that do get
>>> comments, and neglected otherwise.
>>>
>>> Assuming I'm not failing to understand the code and the desired behavior
>>> (entirely possible, as I am new at working with this software!), this seems
>>> suboptimal to me.  My dataset is pretty well-behaved (despite its complexity),
>>> so I should be able to fix my issues with judicious application of
>>> createPolygonsComment.  But I had a heck of a time figuring out what was going
>>> wrong with gTouches, since Polygons comment management appears to be a pretty
>>> obscure field (and createSPComment wasn't working for me, for whatever
>>> reason).  So it seems like it might be better if disaggregate() just parses
>>> and passes along the comments from its input correctly, or, if it's absolutely
>>> necessary to not create comments, passes nothing and warns clearly in the
>>> manual that comments and associated hole information are being lost.  Passing
>>> along comments in some cases while silently dropping them in others seems like
>>> kind of the worst of both worlds.
>>>
>>> I've attached a set of tests I wrote to demonstrate desired/undesired
>>> behavior: disaggregate_comment_tests.R.  My R version is 3.4.0, my sp version
>>> is 1.2-4, my rgeos version is 0.3-23 (SVN revision 546), and my GEOS runtime
>>> version is 3.5.1-CAPI-1.9.1 r4246.  I am using Debian Release 9.0 with kernel
>>> version 4.9.0-2-amd64.  I hope this is useful; please let me know if you need
>>> more info or if there is a better place to post this issue.
>>>
>>> - Anne
>>>
>>
>
>
--
Roger Bivand
Department of Economics, Norwegian School of Economics,
Helleveien 30, N-5045 Bergen, Norway.
voice: +47 55 95 93 55; e-mail: [hidden email]
Editor-in-Chief of The R Journal, https://journal.r-project.org/index.html
http://orcid.org/0000-0003-2392-6140
https://scholar.google.no/citations?user=AWeghB0AAAAJ&hl=en


Using gdalUtils to reproject hdf file

Fri, 04/28/2017 - 19:22
 Dear all,

I am now working with the GFED dataset, a global wildfire dataset. You can download one file from FTP: fuoco.geog.umd.edu; login name: fire; password: burnt. The files are stored in gfed4/monthly/; any one of them will do as an example.

My task is to translate the hdf file into tif, which can be done with the gdalUtils package. But there is no georeferencing information in the file. Instead, the only information I could find is from the data manual (in the gfed4 folder on that FTP): "Each product file contains seven data layers, each stored as a separate HDF4 Scientific Data Set (SDS). Units apply to each layer after multiplying by the specified scale factor. Each data layer has 720 rows and 1440 columns which correspond to the global 0.25° GFED grid. The center of the upper left grid cell is located at longitude 179.875°W, 89.875°N." I think the original SRS can be worked out from this statement, and I can pass it to the a_srs and a_ullr arguments of the gdal_translate command. But I have no idea what the string for a_srs should be.
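From those manual numbers, cell centers at 179.875°W/89.875°N with 0.25° cells put the outer corners of the grid at (-180, 90) and (180, -90), and plain geographic WGS84 is the natural guess for the SRS string. A hedged sketch with gdalUtils (the file name and SDS index below are placeholders to adjust for your data):

```r
# Sketch only: assign georeferencing while converting one GFED4 HDF layer.
# Assumptions: "EPSG:4326" as the SRS, and outer grid edges derived from
# the 0.25-degree cell size stated in the manual.
library(gdalUtils)

gdal_translate(
  src_dataset = "GFED4.0_201601.hdf",      # placeholder file name
  dst_dataset = "GFED4_201601_layer1.tif",
  sd_index    = 1,                         # which of the seven SDS layers
  of          = "GTiff",
  a_srs       = "EPSG:4326",               # geographic WGS84
  a_ullr      = c(-180, 90, 180, -90)      # ulx, uly, lrx, lry (outer edges)
)
```

a_ullr takes the outer corner coordinates, not the cell centers, which is why the values are shifted half a cell (0.125°) out from 179.875/89.875.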

Any recommendation will be appreciated.

Tianyi



        [[alternative HTML version deleted]]


Re: sp::disaggregate() does not seem to add the necessary "comment" attribute to identify holes if Polygons does not contain exactly 1 polygon with 0 or 1 holes

Fri, 04/28/2017 - 18:25
Roger,

This looks great, and I will try it out ASAP.  I do have one reservation
though --- it seems you are using createSPComment() to reconstruct the
comments, and I have seen some discussion that that may not be reliable in all
cases (e.g. if the initial polygons are wonky in some way).  I don't know a
lot about this, but is it possible that it would be preferable to parse and
appropriately disaggregate the original comments strings (if they exist), so
as to deal slightly more smoothly with such cases (i.e., the polygons would
still be wonky, but at least the hole/polygon matching would track with
whatever was in the original data)?
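The suggestion above can be sketched in a few lines of base R. This is a hypothetical helper, not part of sp: a Polygons comment is a space-separated vector with one entry per ring, where 0 marks an exterior ring and any other value is the 1-based index of the exterior ring that owns that hole, so splitting it per exterior ring is mostly bookkeeping:

```r
# Hypothetical helper (not in sp): split an existing Polygons comment
# string into one comment per exterior ring, as a comment-parsing
# disaggregate() could do. "0 1 1 0 4" means: rings 1 and 4 are exterior,
# rings 2 and 3 are holes of ring 1, and ring 5 is a hole of ring 4.
split_comment <- function(cmt) {
  owners <- as.integer(strsplit(cmt, " ")[[1]])
  ext <- which(owners == 0L)
  lapply(ext, function(i) {
    n_holes <- sum(owners == i)             # holes owned by exterior ring i
    paste(c("0", rep("1", n_holes)), collapse = " ")
  })
}

split_comment("0 1 1 0 4")  # -> "0 1 1" and "0 1"
```

This preserves whatever hole/polygon matching was in the original data, wonky or not, instead of re-deriving it geometrically.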

I also had no success using createSPComment to fix disaggregate()'s output
previously, even though my polygons are perfectly non-wonky, so I am perhaps a
little more untrusting of it than I should be.  But I'll let you know how this
version works with my data.  Thanks for addressing this so quickly!

 - Anne


On 04/28/2017 12:11 PM, Roger Bivand wrote:
> I've pushed the fix to my fork:
>
> https://github.com/rsbivand/sp
>
> and created a pull request:
>
> https://github.com/edzer/sp/pull/28
>
> Only one part of a complicated set of nested if() in disaggregate() was adding
> comments, but in some settings the existing comments survived the
> disaggregation. Now the Polygons object comment attribute is re-created for
> all Polygons objects. This is version 1.2-6, also including code changes that
> internally affect rgdal and rgeos - you may like to re-install them from
> source after installing this sp version (shouldn't matter).
>
> Roger
>
> On Fri, 28 Apr 2017, Anne C. Hanna wrote:
>
>> Hello.  I first posted this issue report on the sp GitHub repo
>> (https://github.com/edzer/sp/issues/27) and it was suggested that I redirect
>> it to here.
>>
>> I am working with a geographic dataset with complex borders. The data is
>> stored as a SpatialPolygonsDataFrame.  Each Polygons object in the data frame
>> may be composed of multiple disjoint polygons, and each polygon may have
>> multiple holes.  I want to disaggregate each of the Polygons objects into its
>> individual disjoint polygons and construct an adjacency matrix for all the
>> disjoint components, and I was using disaggregate() to do this.  However, when
>> I run gTouches() on the disaggregated data, in order to compute the
>> adjacencies, I get a number of warnings like this:
>>
>> Error in RGEOSBinPredFunc(spgeom1, spgeom2, byid, "rgeos_touches") : Polygons
>> object missing comment attribute ignoring hole(s).  See function
>> createSPComment.
>>
>> Looking at the Polygons "comment" attributes in the SpatialPolygonsDataFrame
>> output by disaggregate(), I see that the only comment values are "0"
>> (indicating a single polygon with no holes), "0 1" (indicating a single
>> polygon with a single hole), and NULL (apparently no comment was written).
>> Since I know my dataset contains several Polygons objects which are composed
>> of multiple disjoint regions, and also several Polygons which contain more
>> than one hole, this is not the expected result.  In reading the disaggregate()
>> code in the sp GitHub repository (specifically, explodePolygons()), I also
>> can't see anywhere the comment is being added for the cases where a Polygons
>> object has more than two parts or more than two holes.  It actually seems like
>> it's getting carried along almost accidentally in the few cases that do get
>> comments, and neglected otherwise.
>>
>> Assuming I'm not failing to understand the code and the desired behavior
>> (entirely possible, as I am new at working with this software!), this seems
>> suboptimal to me.  My dataset is pretty well-behaved (despite its complexity),
>> so I should be able to fix my issues with judicious application of
>> createPolygonsComment.  But I had a heck of a time figuring out what was going
>> wrong with gTouches, since Polygons comment management appears to be a pretty
>> obscure field (and createSPComment wasn't working for me, for whatever
>> reason).  So it seems like it might be better if disaggregate() just parses
>> and passes along the comments from its input correctly, or, if it's absolutely
>> necessary to not create comments, passes nothing and warns clearly in the
>> manual that comments and associated hole information are being lost.  Passing
>> along comments in some cases while silently dropping them in others seems like
>> kind of the worst of both worlds.
>>
>> I've attached a set of tests I wrote to demonstrate desired/undesired
>> behavior: disaggregate_comment_tests.R.  My R version is 3.4.0, my sp version
>> is 1.2-4, my rgeos version is 0.3-23 (SVN revision 546), and my GEOS runtime
>> version is 3.5.1-CAPI-1.9.1 r4246.  I am using Debian Release 9.0 with kernel
>> version 4.9.0-2-amd64.  I hope this is useful; please let me know if you need
>> more info or if there is a better place to post this issue.
>>
>> - Anne
>>
>
