This is an archive for R-sig-geo, not a forum. It is not possible to post through Nabble - you may not start a new thread nor follow up an existing thread. If you wish to post, but are not subscribed to the list through the list homepage, subscribe first (through the list homepage, not via Nabble) and post from your subscribed email address. Until 2015-06-20, subscribers could post through Nabble, but the policy was changed because too many non-subscribers misunderstood the interface.

Re: Error while using predict.sarlm

Sat, 05/25/2019 - 11:18
On Fri, 24 May 2019, Amitha Puranik wrote:

> I am facing an error while using predict.sarlm to make predictions for spatial
> lag model generated using lagsarlm. I used the following code:
>
> predicted = predict(fit.lag, listw=weightmatrix, newdata=missed_data,
> pred.type="TS", zero.policy = T)
>
> For the argument newdata, I have passed the same data missed_data which I
> used to fit the spatial lag model.
>
> When I run the above code, I get the following error message: “Error in
> predict.sarlm(fit.lag, listw = weightmatrix, newdata = missed_data,  :
> mismatch between newdata and spatial weights. newdata should have region.id
> as row.names”

The predict method has to identify the weights applying to the newdata. So
it uses the region.id attribute of the neighbour object, and the row.names
of the newdata object. If they do not match, it error-exits. If shp below
was read in the typical way, the default region.id may be the FID of the
input file (0, ..., (n-1)), but the default row.names of newdata may be 1,
..., n.

For example:

> library(sf)
Linking to GEOS 3.7.2, GDAL 3.0.0, PROJ 6.1.0
> boston_506 <- st_read(system.file(
+                                   "shapes/boston_tracts.shp",
+                                   package="spData")[1])
Reading layer `boston_tracts' from data source
`/home/rsb/lib/r_libs/spData/shapes/boston_tracts.shp' using driver `ESRI
Shapefile'
Simple feature collection with 506 features and 36 fields
geometry type:  POLYGON
dimension:      XY
bbox:           xmin: -71.52311 ymin: 42.00305 xmax: -70.63823 ymax:
42.67307
epsg (SRID):    4267
proj4string:    +proj=longlat +datum=NAD27 +no_defs
> nb_q <- spdep::poly2nb(boston_506)
> lw_q <- spdep::nb2listw(nb_q, style="W")
> boston_489 <- boston_506[!is.na(boston_506$median),]
> nb_q_489 <- spdep::poly2nb(boston_489)
> lw_q_489 <- spdep::nb2listw(nb_q_489, style="W", zero.policy=TRUE)
> form <- formula(log(median) ~ CRIM + ZN + INDUS + CHAS +
+                 I((NOX*10)^2) + I(RM^2) + AGE + log(DIS) +
+                 log(RAD) + TAX + PTRATIO + I(BB/100) +
+                 log(I(LSTAT/100)))
> suppressPackageStartupMessages(library(spatialreg))
>
> eigs_489 <- eigenw(lw_q_489)
>
> SLM_489 <- lagsarlm(form, data=boston_489,
+           listw=lw_q_489, zero.policy=TRUE,
+           control=list(pre_eig=eigs_489))
>
> nd <- boston_506[is.na(boston_506$median),]
> t0 <- exp(predict(SLM_489, newdata=nd, listw=lw_q,
+                   pred.type="TS", zero.policy=TRUE))
> str(attr(lw_q, "region.id"))
  chr [1:506] "1" "2" "3" "4" "5" "6" "7" "8" "9" "10" "11" "12" "13" "14"
"15" "16" ...
> str(row.names(nd))
  chr [1:17] "13" "14" "15" "17" "43" "50" "312" "313" "314" "317" "337"
"346" "355" ...
> all(row.names(nd) %in% attr(lw_q, "region.id"))
[1] TRUE
# introduce a wrong row.name
> row.names(nd)[1] <- "0"
> all(row.names(nd) %in% attr(lw_q, "region.id"))
[1] FALSE
> t0 <- exp(predict(SLM_489, newdata=nd, listw=lw_q,
+                   pred.type="TS", zero.policy=TRUE))
Error in predict.sarlm(SLM_489, newdata = nd, listw = lw_q,
   pred.type = "TS",  :
   mismatch between newdata and spatial weights. newdata should have
   region.id as row.names

In this case, the row.names of the input object to spdep::poly2nb() and
the region.id matched, as the newdata were subsetted from the same object.
We don't know the values for your data, but you should be able to check
them. It is important that the row names align the data with the weights
correctly, for obvious reasons.
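
As a minimal sketch of that check, using the object names from the original
post (weightmatrix, missed_data, fit.lag); overwriting the row names as below
is only safe if newdata really is in the same order as the observations
behind the weights, so treat it as an illustration only.

library(spatialreg)

ids <- attr(weightmatrix, "region.id")
all(row.names(missed_data) %in% ids)   # must be TRUE for predict() to match them

# If (and only if) missed_data is in the same order as the weights were built:
row.names(missed_data) <- ids
predicted <- predict(fit.lag, listw = weightmatrix, newdata = missed_data,
                     pred.type = "TS", zero.policy = TRUE)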

Hope this helps,

Roger

>
> I have obtained the weight matrix from the function below
>
> weightMat <- function(shp){
>
>  dnb <- knearneigh(coordinates(shp), k=4)
>
>  dnb <- knn2nb(dnb) #create nb
>
>  lw <- nb2listw(dnb, style="W",zero.policy=TRUE) #create lw
>
>  return(lw)
>
> }
>
> To cross check and make sure there are no discrepancies, I have run the
> following lines
>
> length(weightmatrix$weights)
>
> nrow(missed_data)
>
> nrow(coordinates(shape))
>
> For all the calls above, the result is 182, which is the sample size of the
> data.
>
> Can anyone offer me some guidance in solving this problem? Thanks for your
> help.
>
>
>       Thanks & regards,
>
> *Amitha Puranik*
>
> Assistant Professor,
>
> Department of Statistics, PSPH
>
> Phone: 0820-2922407
> Address: Department of Statistics,
>
> Health Sciences Library, Level 6,
>
> Manipal Academy of Higher Education, Manipal, Karnataka, India
>
> An Institute of Eminence (Status Accorded by MHRD)
>
> [[alternative HTML version deleted]]
>
> _______________________________________________
> R-sig-Geo mailing list
> [hidden email]
> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
> --
Roger Bivand
Department of Economics, Norwegian School of Economics,
Helleveien 30, N-5045 Bergen, Norway.
voice: +47 55 95 93 55; e-mail: [hidden email]
https://orcid.org/0000-0003-2392-6140
https://scholar.google.no/citations?user=AWeghB0AAAAJ&hl=en
_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo
Roger Bivand
Department of Economics
Norwegian School of Economics
Helleveien 30
N-5045 Bergen, Norway

Landmap package - automated mapping using Ensemble Machine Learning

Fri, 05/24/2019 - 17:25

I've wrapped functions extending the #SuperLearner and #subsemble R packages
(Ensemble Machine Learning), which can now be used to generate spatial
predictions (a fully automated framework that requires no special
geostatistical assumptions or choices). You can install the working
package from:

https://github.com/Envirometrix/landmap
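
(A minimal installation sketch, assuming only that the remotes package is
available; nothing below is landmap-specific beyond the repository name above.)

# install the development version from GitHub (assumes the 'remotes' package)
install.packages("remotes")
remotes::install_github("Envirometrix/landmap")
library(landmap)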

The general method is explained in https://peerj.com/articles/5518/

The package can also be used to download some of the 2TB GeoTiffs
available via https://openlandmap.org.

Contributions and comments are welcome.

A few things I am still struggling with are:

1. Deriving prediction errors and prediction intervals
(https://github.com/Envirometrix/landmap/blob/master/R/train.spLearner.R#L252),
2. Generating spatial simulations using EML models,
3. Building models with larger data sets, e.g. >>1000 points.

Thanks in advance,

Tom Hengl
https://opengeohub.org/people/tom-hengl

_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

Error while using predict.sarlm

Fri, 05/24/2019 - 12:47
I am facing an error while using predict.sarlm to make predictions for spatial
lag model generated using lagsarlm. I used the following code:

predicted = predict(fit.lag, listw=weightmatrix, newdata=missed_data,
pred.type="TS", zero.policy = T)

For the argument newdata, I have passed the same data missed_data which I
used to fit the spatial lag model.

When I run the above code, I get the following error message: “Error in
predict.sarlm(fit.lag, listw = weightmatrix, newdata = missed_data,  :
mismatch between newdata and spatial weights. newdata should have region.id
as row.names”

I have obtained the weight matrix from the function below

weightMat <- function(shp){

  dnb <- knearneigh(coordinates(shp), k=4)

  dnb <- knn2nb(dnb) #create nb

  lw <- nb2listw(dnb, style="W",zero.policy=TRUE) #create lw

  return(lw)

}

To cross check and make sure there are no discrepancies, I have run the
following lines

length(weightmatrix$weights)

nrow(missed_data)

nrow(coordinates(shape))

For all the calls above, the result is 182, which is the sample size of the
data.

Can anyone offer me some guidance in solving this problem? Thanks for your
help.


       Thanks & regards,

*Amitha Puranik*

Assistant Professor,

Department of Statistics, PSPH

Phone: 0820-2922407
Address: Department of Statistics,

Health Sciences Library, Level 6,

Manipal Academy of Higher Education, Manipal, Karnataka, India

An Institute of Eminence (Status Accorded by MHRD)

        [[alternative HTML version deleted]]

_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

Re: GRID based calculations

Fri, 05/24/2019 - 10:20
Hi,

You probably haven't gotten any answers because this is an extremely
vague question, and a problem not very well suited for R.

You could do the preliminary calculations in R, but other GIS software
is more suited to the point and click model.
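
For what it is worth, a minimal sketch of that kind of preliminary
calculation, assuming a categorical land-cover RasterLayer lc and a grid of
polygons grid_polys (both names hypothetical):

library(raster)

# lc: categorical land-cover RasterLayer; grid_polys: SpatialPolygons* grid cells
vals <- raster::extract(lc, grid_polys)  # list of pixel values, one element per grid cell
pct <- lapply(vals, function(v) round(100 * prop.table(table(v)), 1))
pct[[1]]  # e.g. 20% forest, 30% water, 50% grass (as land-cover class codes)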

If you run into specific problems with R code, we can certainly help,
but with a question this general all I can really suggest are
a. Lots of time with google.
b. Looking at similar applications developed by other people.
c. Hiring a GIS programmer.

Best,
Sarah

On Thu, May 23, 2019 at 8:20 AM Fatih Kara <[hidden email]> wrote:
>
> Hi,
>
> We are developing a grid-based database which provides several types of data. One of the layers in this database will be land cover data with 30 land cover types. Our grid cells will cover about 90 pixels each, and our purpose is that if someone clicks on a grid cell, the percentage of each land cover type should be shown automatically, e.g. 20% forest, 30% water, and 50% grass.
>
> What should we do? Are there any related code samples?
>
> Thanks
> --
> Fatih Kara
>
> _______________________________________________
> R-sig-Geo mailing list
> [hidden email]
> https://stat.ethz.ch/mailman/listinfo/r-sig-geo


--
Sarah Goslee (she/her)
http://www.numberwright.com

_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

nested spatial model

Fri, 05/24/2019 - 05:24
Hello,

1. I am working with the coal ash data, following the nested spatial model in Statistical Methods for Spatial Data Analysis (Schabenberger and Gotway, page 150). I divided the data into 3 subgroups, fitted a variogram model to each of the 3 subgroups, and also calculated their covariance matrices. But these matrices do not have the same dimensions, and I could not calculate equation (4.23) on that page. What can I do?

2. I have regular lattice data (wheat yield, page 434 of Cressie 1993, Statistics for Spatial Data). When I calculated the kriging, unfortunately the covariance matrix is 0. I am trying to follow the method in the book for these data. What can I do? Thanks in advance for your help and time.

        [[alternative HTML version deleted]]

_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

GRID based calculations

Thu, 05/23/2019 - 07:20
Hi,

We are developing a grid-based database which provides several types of data. One of the layers in this database will be land cover data with 30 land cover types. Our grid cells will cover about 90 pixels each, and our purpose is that if someone clicks on a grid cell, the percentage of each land cover type should be shown automatically, e.g. 20% forest, 30% water, and 50% grass.

What should we do? Are there any related code samples?

Thanks
--
Fatih Kara

_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

Re: focal analysis on raster

Thu, 05/23/2019 - 02:24
Dear Jeff,
1. raster::extract(Raster*, SpatialPoints*)
2. raster::extract(Raster*, rgeos::gBuffer(SpatialPoints*, width, byid =
TRUE))
If your points are Simple Features, use sf::st_buffer(x, dist) instead
of rgeos::gBuffer().
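
A short sketch along those lines, assuming a land-use RasterLayer lu and
sample points pts as a SpatialPointsDataFrame in the same projected CRS
(names hypothetical):

library(raster)
library(rgeos)

buf200  <- rgeos::gBuffer(pts, width = 200,  byid = TRUE)  # 200 m circular buffers
buf1000 <- rgeos::gBuffer(pts, width = 1000, byid = TRUE)  # 1000 m circular buffers

# pixel counts per land-use class within each buffer
counts200  <- lapply(raster::extract(lu, buf200),  table)
counts1000 <- lapply(raster::extract(lu, buf1000), table)
counts200[[1]]  # counts around the first sampling point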

HTH,
Ákos Bede-Fazekas
Hungarian Academy of Sciences

2019.05.23. 3:56 keltezéssel, Stratford, Jeff via R-sig-Geo írta:
> Hi everyone,
>
> I am looking to estimate the number of pixels to quantify different
> landuses around sampling points for different projects (bird diversity,
> nests, predation rates on clay caterpillars, etc). Our base map is a
> Landsat 7 map.
>
> I can plot our sample points and plot the Landsat map (using raster) but
> I'm not sure
>
> 1. How to overlay the sample points onto the Landsat map
> 2. How to do the focal analysis using the sample points as centers in
> circular buffers of 200 and 1000 m radii
>
> Any help would be greatly appreciated.
>
> Many thanks,
>
> Jeff
>
> ********************************************************
> Jeffrey A. Stratford, PhD
> Department of Biology and Health Sciences &
> Director of Study Abroad
> Office: Cohen Science Center 210
> Address: 84 W South Street
> Wilkes University, PA 18766 USA
> https://sites.google.com/a/wilkes.edu/stratford/home
> Blog https://wordpress.com/posts/concreteornithology.blog
> ********************************************************
>
> [[alternative HTML version deleted]]
>
> _______________________________________________
> R-sig-Geo mailing list
> [hidden email]
> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
>
_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

focal analysis on raster

Wed, 05/22/2019 - 20:56
Hi everyone,

I am looking to estimate the number of pixels to quantify different
landuses around sampling points for different projects (bird diversity,
nests, predation rates on clay caterpillars, etc). Our base map is a
Landsat 7 map.

I can plot our sample points and plot the Landsat map (using raster) but
I'm not sure

1. How to overlay the sample points onto the Landsat map
2. How to do the focal analysis using the sample points as centers in
circular buffers of 200 and 1000 m radii

Any help would be greatly appreciated.

Many thanks,

Jeff

********************************************************
Jeffrey A. Stratford, PhD
Department of Biology and Health Sciences &
Director of Study Abroad
Office: Cohen Science Center 210
Address: 84 W South Street
Wilkes University, PA 18766 USA
https://sites.google.com/a/wilkes.edu/stratford/home
Blog https://wordpress.com/posts/concreteornithology.blog
********************************************************

        [[alternative HTML version deleted]]

_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

Placing a world map on r plot

Tue, 05/21/2019 - 13:28
I am trying to place a world map on my current plots made by level plot. This plot was made as follows: 
    library(raster)
    library(ncdf4)
    library(maps)
    library(maptools)
    library(rasterVis)
    library(ggplot2)
    library(rgdal)
    library(sp)
    library(gridExtra)    
MFplot4<-levelplot(MFMeaner3,margin=F, at=Fcutpoints4,cuts=11,
    pretty=TRUE,par.settings=mapTheme, main="Historical five-day maximum  
    precipitation (mm/day) model mean")
The object "MFMeaner3" has the following attributes:    class       : RasterLayer
    dimensions  : 64, 128, 8192  (nrow, ncol, ncell)
    resolution  : 2.8125, 2.789327  (x, y)
    extent      : -181.4062, 178.5938, -89.25846, 89.25846  (xmin, xmax, ymin,
    ymax)
    coord. ref. : +proj=longlat +datum=WGS84 +ellps=WGS84 +towgs84=0,0,0
    data source : in memory
    names       : layer
    values      : 0.1583802, 164.2064  (min, max)
Here was my attempt to place a world map overlay on the above plot:
    world.outlines<-map("world", plot=FALSE)
    world.outlines.sp<-map2SpatialLines(world.outlines,proj4string =  
    CRS("+proj=longlat"))

    MFplot4 + layer(sp.lines(world.outlines.sp,col="black",lwd=0.5))

However, this leads to the following error:
    Error: Attempted to create layer with no stat.
I also tried placing a simple world map using this:
    MFplot4 + plot(wrld_simpl)
But I receive this error:
    Error in UseMethod("as.layer") :
    no applicable method for 'as.layer' applied to an object of class "NULL"
Why would these errors occur?

Any assistance with this would be extremely appreciated!
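
No reply is archived here, but one plausible cause, given that ggplot2 is
attached alongside rasterVis, is that ggplot2::layer() masks
latticeExtra::layer(), which the levelplot overlay idiom relies on; the
masking version then complains about a missing stat. A hedged sketch using
the namespace-qualified call:

    # assumes MFplot4 and world.outlines.sp as built above; the only change is
    # calling latticeExtra::layer() explicitly so ggplot2::layer() cannot shadow it
    library(latticeExtra)
    MFplot4 + latticeExtra::layer(sp.lines(world.outlines.sp, col="black", lwd=0.5))
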
        [[alternative HTML version deleted]]

_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

Effective number of parameters (enp)

Tue, 05/21/2019 - 11:52
Hello all,

I am trying to interpret the model output from the geographically weighted regression R function ‘gwr' in the package ‘spgwr’.

There are two “effective number of parameters”:
Effective number of parameters (residual: 2traceS - traceS’S)
Effective number of parameters (model: traceS)

How do I interpret them? Do you know which one is for the calculation of adjusted p values addressed here <https://github.com/cran/GWmodel/blob/master/R/gwr.t.adjust.r>?

Thank you,
Dongmei
        [[alternative HTML version deleted]]

_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

Dissolve only overlapping polygons in R using sf

Sun, 05/19/2019 - 15:18
Hi,

I extracted information from open street map that contains many polygons, some of them overlapping. I want to union/dissolve all polygons that have overlap with other polygons. I am looking for an approach that uses the sf package. I posted this question on StackExchange here: https://gis.stackexchange.com/questions/323038/dissolve-only-overlapping-polygons-in-r-using-sf/323067#323067 with a reproducible simple example. With help of a reader (Spacedman), I managed to solve it for the simple case. Unfortunately, it seems the osm data is somewhat messy and the solution does not seem to work (or I do not understand it completely). For this reason I am posting it here in the hope somebody has an idea.

The reproducible examples is as follows - three clusters of rectangles of which two have overlap:

library(sf)
library(dplyr)  # for %>%, group_by(), summarize()

sq = function(pt, sz = 1) st_polygon(list(rbind(c(pt - sz), c(pt[1] + sz, pt[2] - sz),
                                                c(pt + sz), c(pt[1] - sz, pt[2] + sz), c(pt - sz))))
x = st_sf(box = 1:6, st_sfc(sq(c(4.2, 4.2)), sq(c(0, 0)), sq(c(1, -0.8)),
                            sq(c(0.5, 1.7)), sq(c(3, 3)), sq(c(-3, -3))))
plot(x)

parts <- st_cast(st_union(x),"POLYGON")
plot(parts)

clust <- unlist(st_intersects(x, parts))

diss <- cbind(x, clust) %>%
  group_by(clust) %>%
  summarize(box = paste(box, collapse = ", ")) # I would like to preserve the names of the polygons before the dissolve hence the paste command.
plot(diss[1])

However, when I run this on my dataset from Open Street Map, which can be found here:  https://www.dropbox.com/s/4n259yhh5swqgf2/osm_data.rds?dl=0, it does not work. When I use st_intersects(x, parts) to determine the links between the features and the clusters it appears that features can be related to two (or more) clusters. I do not understand how this is possible as then these two clusters should be regarded as one cluster. Right? How can I adjust the code above so all the overlapping polygons in my dataset will be dissolved?
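
No solution is archived here, but one robust alternative to intersecting
against the unioned parts is to build a graph of pairwise intersections among
the original polygons and take its connected components; a sketch assuming
the igraph package (swap st_intersects() for st_overlaps() if polygons that
merely touch should stay separate):

library(sf)
library(dplyr)
library(igraph)

adj <- st_intersects(x, sparse = FALSE)  # logical adjacency matrix of the original polygons
g <- graph_from_adjacency_matrix(adj * 1, mode = "undirected", diag = FALSE)
x$clust <- components(g)$membership      # connected components = overlap clusters

diss <- x %>%
  group_by(clust) %>%
  summarize(box = paste(box, collapse = ", "))  # summarize() unions the geometries
plot(diss[1])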

Thanks,
Michiel

M. (Michiel) van Dijk, PhD
Research scholar | Ecosystems Services and Management (ESM) | International Institute for Applied Systems Analysis (IIASA)
Senior researcher (out of office) | International Policy Division (IB) | Wageningen Economic Research

Schlossplatz 1 - A-2361 Laxenburg, Austria, room S-138
Phone: +43 2236 807 537
Skype: michiel.van.dijk
https://www.researchgate.net/profile/Michiel_Van_Dijk/
http://nl.linkedin.com/pub/michiel-van-dijk/1a/805/346

_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

Re: Question spgwr package - running time gwr()

Sat, 05/18/2019 - 16:05
On Fri, 17 May 2019, Raphael Mesaric wrote:

> Dear Roger,
>
> Thank you very much for your reply.
>
> My thesis supervisor told me to use GWR to explore small-scale
> differences and test the consistency of the SAR models.

It happens. From your problem description, I'm unsure whether GWR or SAR
are appropriate methods, and would suggest the mixed-model and multilevel
literatures. Mappable IID and spatially structured random effects may be
more appropriate.

Roger

>
> So, I tried to fit the GWR model with all my observations. However, if I
> understood you correctly, I would have to choose smaller sections of the
> grid and try to fit a GWR there. What I am not sure about is how to do this
> in order to get informative results.
>
> As for the SAR model, the dependent variable consists of n blocks where the
> m-th entry of each block corresponds to the m-th cell of my grid. So, if
> I have to reduce the grid to about 5K entries, I would need to take a
> subset of roughly 250 cells and then take all the 20 observations for
> each of these cells? And then try to fit a GWR there?
>
> Best,
>
> Raphael
>
>
>> On 17.05.2019 at 14:20, Roger Bivand <[hidden email]> wrote:
>>
>> On Fri, 17 May 2019, Raphael Mesaric via R-sig-Geo wrote:
>>
>>> Dear all,
>>>
>>> Is there an option to shorten the running time for the gwr() function, similar to the ‚LU‘ method for lagsarlm() in the spdep package? Because I have a model with roughly 500’000 observations, and the running time at the moment is quite long, respectively it has not yet terminated.
>>
>> What are you actually doing? Why did you choose GWR? Are you fitting a GWR with 500K observations, or have you fitted a GWR with many fewer observations, and are now rendering that fitted model with 500K fit points? GWR is only for detecting possible non-stationarity or similar mis-specification in moderately sized data sets. Trying to fit with 500K gives a dense hat matrix of 500K x 500K, which is impossible (or were it possible would be uninformative). Think of 5K as a sensible maximum if GWR is considered sensible at all. I would think that finding a bandwidth is impossible too.
>>
>> Roger
>>
>>>
>>> Thank you for your help in advance.
>>>
>>> Best regards,
>>>
>>> Raphael Mesaric
>>> _______________________________________________
>>> R-sig-Geo mailing list
>>> [hidden email]
>>> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
>>>
>>
>> --
>> Roger Bivand
>> Department of Economics, Norwegian School of Economics,
>> Helleveien 30, N-5045 Bergen, Norway.
>> voice: +47 55 95 93 55; e-mail: [hidden email]
>> https://orcid.org/0000-0003-2392-6140
>> https://scholar.google.no/citations?user=AWeghB0AAAAJ&hl=en
>
> --
Roger Bivand
Department of Economics, Norwegian School of Economics,
Helleveien 30, N-5045 Bergen, Norway.
voice: +47 55 95 93 55; e-mail: [hidden email]
https://orcid.org/0000-0003-2392-6140
https://scholar.google.no/citations?user=AWeghB0AAAAJ&hl=en
_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo
Roger Bivand
Department of Economics
Norwegian School of Economics
Helleveien 30
N-5045 Bergen, Norway

Re: Question spdep package - extending nb object

Sat, 05/18/2019 - 09:24
Look at multilevel models, fitting iid and/or mrf to the three regular dimensions. Look at the mixed models literature. Some references in https://doi.org/10.1016/j.spasta.2017.01.002, but there is much more.
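
A minimal sketch of one such specification, using mgcv rather than anything
mentioned above (my own illustration, not a prescription); all object names
are hypothetical, and the MRF basis assumes every cell has at least one
neighbour:

library(mgcv)
library(spdep)

# nb_cell: an spdep nb object for the grid cells; mgcv's "mrf" basis expects a
# *named* list of neighbour indices, with names matching the levels of the factor
nbl <- nb_cell
names(nbl) <- attr(nb_cell, "region.id")

# dat: the stacked data (20 blocks) with a cell identifier and a response y
dat$cell <- factor(dat$cell, levels = names(nbl))
dat$cell_iid <- dat$cell  # duplicate column so the iid and MRF terms are distinct

m <- gam(y ~ season + timeslot +
           s(cell_iid, bs = "re") +                   # iid (unstructured) cell effect
           s(cell, bs = "mrf", xt = list(nb = nbl)),  # spatially structured cell effect
         data = dat, method = "REML")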

Hope this helps,

Roger

Roger Bivand
Norwegian School of Economics
Bergen, Norway



From: Raphael Mesaric
Sent: Friday 17 May, 14:58
Subject: Re: [R-sig-Geo] Question spdep package - extending nb object
To: Roger Bivand
Cc: [hidden email]


Dear Roger,

Thank you very much for your reply.

It is kind of a spatial panel model, but I have only one time series. To describe it differently:

The dependent variable consists of twenty blocks with averaged data. Each block is related to one season and one out of five time slots (that is why I have 20 blocks - 4 seasons times 5 time slots). And each block of data refers to the very same grid, i.e. the first entry of the first block corresponds to the same cell as the first entry of all the other blocks, the second entries correspond to the second cell etc. The grid is of hectare resolution, i.e. 100 x 100m (one could see this as similar to post codes).

Therefore, nb2blocknb() sounds interesting, even though I do not really have „location-less observations“, but just observations with different attributes (season and time). Would it work if my ‚ID‘ argument is built in such a way that it repeats 19 times the identification tags of the first block? For example:

rep(seq(from = 1, to = m, by = 1),19) , where m is the number of cells in my grid?

Best,

Raphael


On 17.05.2019 at 14:13, Roger Bivand <[hidden email]> wrote:

On Thu, 16 May 2019, Raphael Mesaric via R-sig-Geo wrote:

Dear all,

I am working on a spatial regression model which is built in such a way that the dependent variable consists of n blocks of multiple entries, and each block is referring to the same spatial grid. In order to run the lagsarlm() function or the errsarlm() function, I now need to extend my initial nb object from one grid to n grids. In other words, I need the nb object to be repeated n times. However, so far I did not find a way to do this properly.

I tried to just replicate the nb object by repeating the entries of the neighbours list and the weights list, respectively.

dnn800_3w$neighbours <- rep(dnn800_3w$neighbours,20)
dnn800_3w$weights <- rep(dnn800_3w$weights,20)

When I do this, I get the following error message:

Error in is.symmetric.nb(listw$neighbours, FALSE) : Not neighbours list

Which makes sense, as the matrix is not symmetric any more due to the steps described above. However, I cannot think of a correct implementation at the moment.

Not even worth trying, as you found. Do you mean a Kronecker product, as found in spatial panel models? Or do you mean that you have blocks of observations without position that belong to upper-level objects with known position (postcodes or similar)? If the latter, look at ?spdep::nb2blocknb.

Hope this clarifies,

Roger


Can you help me with this issue? Any suggestions are highly appreciated.

Thank you in advance.

Best regards,

Raphael Mesaric
[[alternative HTML version deleted]]

_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


--
Roger Bivand
Department of Economics, Norwegian School of Economics,
Helleveien 30, N-5045 Bergen, Norway.
voice: +47 55 95 93 55; e-mail: [hidden email]
https://orcid.org/0000-0003-2392-6140
https://scholar.google.no/citations?user=AWeghB0AAAAJ&hl=en




        [[alternative HTML version deleted]]

_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo
Roger Bivand
Department of Economics
Norwegian School of Economics
Helleveien 30
N-5045 Bergen, Norway

Re: Question spgwr package - running time gwr()

Fri, 05/17/2019 - 07:58
Dear Roger,

Thank you very much for your reply.

My thesis supervisor told me to use GWR to explore small-scale differences and test the consistency of the SAR models.

So, I tried to fit the GWR model with all my observations. However, if I understood you correctly, I would have to choose smaller sections of the grid and try to fit a GWR there. What I am not sure about is how to do this in order to get informative results.

As for the SAR model, the dependent variable consists of n blocks where the m-th entry of each block corresponds to the m-th cell of my grid. So, if I have to reduce the grid to about 5K entries, I would need to take a subset of roughly 250 cells and then take all the 20 observations for each of these cells? And then try to fit a GWR there?

Best,

Raphael


> On 17.05.2019 at 14:20, Roger Bivand <[hidden email]> wrote:
>
> On Fri, 17 May 2019, Raphael Mesaric via R-sig-Geo wrote:
>
>> Dear all,
>>
>> Is there an option to shorten the running time for the gwr() function, similar to the ‚LU‘ method for lagsarlm() in the spdep package? Because I have a model with roughly 500’000 observations, and the running time at the moment is quite long, respectively it has not yet terminated.
>
> What are you actually doing? Why did you choose GWR? Are you fitting a GWR with 500K observations, or have you fitted a GWR with many fewer observations, and are now rendering that fitted model with 500K fit points? GWR is only for detecting possible non-stationarity or similar mis-specification in moderately sized data sets. Trying to fit with 500K gives a dense hat matrix of 500K x 500K, which is impossible (or were it possible would be uninformative). Think of 5K as a sensible maximum if GWR is considered sensible at all. I would think that finding a bandwidth is impossible too.
>
> Roger
>
>>
>> Thank you for your help in advance.
>>
>> Best regards,
>>
>> Raphael Mesaric
>> _______________________________________________
>> R-sig-Geo mailing list
>> [hidden email]
>> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
>>
>
> --
> Roger Bivand
> Department of Economics, Norwegian School of Economics,
> Helleveien 30, N-5045 Bergen, Norway.
> voice: +47 55 95 93 55; e-mail: [hidden email]
> https://orcid.org/0000-0003-2392-6140
> https://scholar.google.no/citations?user=AWeghB0AAAAJ&hl=en
_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

Re: Question spdep package - extending nb object

Fri, 05/17/2019 - 07:58
Dear Roger,

Thank you very much for your reply.

It is kind of a spatial panel model, but I have only one time series. To describe it differently:

The dependent variable consists of twenty blocks with averaged data. Each block is related to one season and one out of five time slots (that is why I have 20 blocks - 4 seasons times 5 time slots). And each block of data refers to the very same grid, i.e. the first entry of the first block corresponds to the same cell as the first entry of all the other blocks, the second entries correspond to the second cell etc. The grid is of hectare resolution, i.e. 100 x 100m (one could see this as similar to post codes).

Therefore, nb2blocknb() sounds interesting, even though I do not really have „location-less observations“, but just observations with different attributes (season and time). Would it work if my ‚ID‘ argument is built in such a way that it repeats 19 times the identification tags of the first block? For example:

rep(seq(from = 1, to = m, by = 1),19) , where m is the number of cells in my grid?

Best,

Raphael


> On 17.05.2019 at 14:13, Roger Bivand <[hidden email]> wrote:
>
> On Thu, 16 May 2019, Raphael Mesaric via R-sig-Geo wrote:
>
>> Dear all,
>>
>> I am working on a spatial regression model which is built in such a way that the dependent variable consists of n blocks of multiple entries, and each block is referring to the same spatial grid. In order to run the lagsarlm() function or the errsarlm() function, I now need to extend my initial nb object from one grid to n grids. In other words, I need the nb object to be repeated n times. However, so far I did not find a way to do this properly.
>>
>> I tried to just replicate the nb object by repeating the entries of the neighbours list and the weights list, respectively.
>>
>> dnn800_3w$neighbours <- rep(dnn800_3w$neighbours,20)
>> dnn800_3w$weights <- rep(dnn800_3w$weights,20)
>>
>> When I do this, I get the following error message:
>>
>> Error in is.symmetric.nb(listw$neighbours, FALSE) : Not neighbours list
>>
>> Which makes sense, as the matrix is not symmetric any more due to the steps described above. However, I cannot think of a correct implementation at the moment.
>
> Not even worth trying, as you found. Do you mean a Kronecker product, as found in spatial panel models? Or do you mean that you have blocks of observations without position that belong to upper-level objects with known position (postcodes or similar)? If the latter, look at ?spdep::nb2blocknb.
>
> Hope this clarifies,
>
> Roger
>
>>
>> Can you help me with this issue? Any suggestions are highly appreciated.
>>
>> Thank you in advance.
>>
>> Best regards,
>>
>> Raphael Mesaric
>> [[alternative HTML version deleted]]
>>
>> _______________________________________________
>> R-sig-Geo mailing list
>> [hidden email]
>> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
>>
>
> --
> Roger Bivand
> Department of Economics, Norwegian School of Economics,
> Helleveien 30, N-5045 Bergen, Norway.
>> voice: +47 55 95 93 55; e-mail: [hidden email]
>> https://orcid.org/0000-0003-2392-6140
>> https://scholar.google.no/citations?user=AWeghB0AAAAJ&hl=en
        [[alternative HTML version deleted]]

_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

Re: sp

Fri, 05/17/2019 - 07:26
On Fri, 17 May 2019, Francesco Perugini via R-sig-Geo wrote:

> Hello, thanks for the suggestions. I've tried to switch to the new function
> readOGR() instead of readShapePoly, as follows:
>
> ### Read polygon shape files into SpatialPolygonsDataFrame objects
>
> #a <- readShapePoly("C:/Users/Francesco pc/ Prov2011_WGS84", IDvar="COD_PRO")
>
> a <- readOGR(dsn="C:/Users/Francesco pc/", layer="Prov2011_WGS84")

> The readOGR function seems to work. However, I now have some difficulties
> in matching the spatial data frame and the data frame.

See my previous reply. The sf package provides better support for merging
sf objects and data.frames than sp. Do remember to eliminate your use of
shapefile functions from maptools, moving to sf instead. You'll find sf
covered well in Lovelace et al. (2019), also at:

https://geocompr.robinlovelace.net/

Also look there for left_join() and other *_join() methods as well as
merge().
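
A minimal sketch of that merge step with sf, using the column names that
appear in this thread (COD_PRO in the shapefile, codpro and stand in the
Stata file); treat it as an illustration:

library(sf)
library(foreign)  # read.dta()

a <- st_read("C:/Users/Francesco pc/Prov2011_WGS84.shp")
quars <- read.dta("dataset_resilience_perR.dta")

aa <- merge(a, quars, by.x = "COD_PRO", by.y = "codpro")  # sf keeps the geometry
sample_aa <- aa[!is.na(aa$stand), ]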

Roger


> My code is:
> quars <- read.dta("dataset_resilience_perR.dta")  # read file from Stata (requires library(foreign))
>
> save(quars,file = "quars.rda")
>
> load("quars.rda")
>
> names(quars)
>
> attach(quars)
>
> data.quars  <- quars
>
> dim(data.quars)
>
> MATCH <-match(a$COD_PRO, data.quars$codpro); MATCH
>
> ### OBJECTID
>
> data.quars1<- data.quars[MATCH,]; dim(data.quars1)
>
> row.names(data.quars1)<- a$COD_PRO; dim(data.quars1)
>
> aa <-spCbind(a, data.quars1)
>
> sample<- aa[!is.na(aa$stand),]
>
> #
>
> writePolyShape(sample,"sample",factor2char= TRUE)
>
> sample<- readShapePoly("sample",IDvar="COD_PRO")
>
> The last two coding lines must also be changed. But I get an error
> before that, when using spCbind:
>
> Error in spCbind(a, data.quars1) : row names not identical
>
> which I was not getting before when using the readShapePoly function.
> I don't know why. Thanks a lot for the suggestions.
> Francesco
>
>
>
>
>
>    On Thursday 16 May 2019, 21:24:22 CEST, Ben Tupper <[hidden email]> wrote:
>
> Hi,
> This seems like a different question than your original question. I have a couple of suggestions...
> Check out ogrInfo() from the rgdal package. It may help you understand the contents of your shape file.
> Be sure that you are using the readOGR() function properly - I don't see IDvar in the help page for readOGR.
> Keeping in mind that only you have the data, post this new question to the R-sig-geo list in a way that helps your readers understand what you have done and what the problem is.
> Best wishes, Ben
>
> On May 16, 2019, at 1:13 PM, Francesco Perugini <[hidden email]> wrote:
> Dear Ben Tupper, sorry for the inconvenience caused.
> I've now switched to plain text. Hopefully it reads fine now.
> Here again is my previous email to the forum:
> ___________
> Prof. Bivand, sorry again. I'm trying to update the coding I used. I'm now changing, as suggested,
>
> a <- readShapePoly("C:/Users/Francesco pc/Prov2011_WGS84", IDvar="COD_PRO")
>
> into
>
> a <- readOGR("C:/Users/Francesco pc/Prov2011_WGS84", IDvar="COD_PRO")
>
> I guess IDvar is not correct. Should that be layer="COD_PRO"? Thanks a lot. Francesco
> ___________
>
>
>    On Thursday 16 May 2019, 18:54:55 CEST, Ben Tupper <[hidden email]> wrote:
>
> Hi Francesco,
> It really would help the rest of us if you would configure your email client to send plain text, rather than rich-formatted or HTML-formatted text. The latter two simply mess up what others, like me, see when we read your messages. Most email clients, including Yahoo, allow you to switch. Attached is a snapshot of my Yahoo mail - you can access the option through the "..." menu at the bottom.
> Best wishes, Ben
>
> <yahoo-plain-text.png>
>
> On May 16, 2019, at 12:07 PM, Francesco Perugini via R-sig-Geo <[hidden email]> wrote:
> Prof. Bivand, sorry again. I'm trying to update the coding I used. I'm now changing, as suggested,
>
> a <- readShapePoly("C:/Users/Francesco pc/Prov2011_WGS84", IDvar="COD_PRO")
>
> into
>
> a <- readOGR("C:/Users/Francesco pc/Prov2011_WGS84", IDvar="COD_PRO")
>
> I guess IDvar is not correct. Should that be layer="COD_PRO"? Thanks a lot. Francesco
>
>
>
>    On Thursday 16 May 2019, 15:47:36 CEST, Roger Bivand <[hidden email]> wrote:
>
> On Thu, 16 May 2019, Francesco Perugini wrote:
>
>
> Thanks a lot for the reply Prof. Bivand. It works fine now. Yes, I have
> not used R for a few months. I was wondering why this remaining piece of
> code is not working:
>
> neighbors.knn1 <- knn2nb(knearneigh(coord, 1, longlat=F), sym=F)
> ## Global G
> dlwknn1.B <- nb2listw(neighbors.knn1, style="B", zero.policy=TRUE)
> globalG.test(CRIME, dlwknn1.B, zero.policy=F)
>
>
> Note that HTML always mangles code. You have not shown any errors apart 
> from:
>
>
>
> Error: object 'CRIME' not found
>
>
> which is self-evident, as ls() in your workspace will show. You would 
> always have needed columbus$CRIME to access CRIME.
>
> Roger
>
>
> Thanks again for your help. Francesco
>
>     On Thursday 16 May 2019, 15:07:37 CEST, Roger Bivand <[hidden email]> wrote:
>
> Please post in plain text to avoid code mangling. You have not noticed
> that a lot has been happening. First, data sets from spdep have mostly
> been moved to spData. Next, spData mostly uses sf to read and format data.
> Finally you may also see changes as spdep model fitting functions are in
> spatialreg and will shortly be dropped from spdep. In your case:
>
>
> library(sp)
> library(spdep)
>
> Loading required package: spData
> Loading required package: sf
> Linking to GEOS 3.7.2, GDAL 3.0.0, PROJ 6.1.0
>
> example(columbus)
>
>
> colmbs> columbus <- st_read(system.file("shapes/columbus.shp",
> package="spData")[1], quiet=TRUE)
>
> colmbs> col.gal.nb <- read.gal(system.file("weights/columbus.gal",
> package="spData")[1])
>
> coord <- coordinates(columbus)
>
> Error in (function (classes, fdef, mtable)  :
>   unable to find an inherited method for function ‘coordinates’ for
> signature ‘"sf"’
>
> columbus <- as(columbus, "Spatial")
> coord <- coordinates(columbus)
>
>
> Giving the full output, you can see that example(columbus) reads in the
> data and neighbours from spData, using sf. Consequently, you'd need to
> coerce columbus to an sp class if you do not want to upgrade your
> workflow to sf compatability.
>
> Hope this clarifies,
>
> Roger
>
> On Thu, 16 May 2019, Francesco Perugini via R-sig-Geo wrote:
>
>
> Dear all, I'm not very familiar with R.
> I'm trying to use this code I wrote months ago. At that time it was
> working, but now it is not.
>
> library(sp)
> library(spdep)
> example(columbus)
> coord <- coordinates(columbus)
>
> and I get this message:
>
> Error in (function (classes, fdef, mtable)  : unable to find an
> inherited method for function ‘coordinates’ for signature ‘"sf"’
>
> I've also tried calling library(sf) but got the same error. I was wondering
> why?
> Thanks a lot. Francesco
>
>
>
>     [[alternative HTML version deleted]]
>
> _______________________________________________
> R-sig-Geo mailing list
> [hidden email]
> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
>
>
>
>
>
>
> -- 
> Roger Bivand
> Department of Economics, Norwegian School of Economics,
> Helleveien 30, N-5045 Bergen, Norway.
> voice: +47 55 95 93 55; e-mail: [hidden email]
> https://orcid.org/0000-0003-2392-6140
> https://scholar.google.no/citations?user=AWeghB0AAAAJ&hl=en  
> [[alternative HTML version deleted]]
>
> _______________________________________________
> R-sig-Geo mailing list
> [hidden email]
> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
>
> Ben Tupper
> Bigelow Laboratory for Ocean Sciences
> 60 Bigelow Drive, P.O. Box 380
> East Boothbay, Maine 04544
> http://www.bigelow.org
> Ecological Forecasting: https://eco.bigelow.org/
>
>
>
>
>  <yahoo-plain-text.png>
>
> Ben Tupper
> Bigelow Laboratory for Ocean Sciences
> 60 Bigelow Drive, P.O. Box 380
> East Boothbay, Maine 04544
> http://www.bigelow.org
> Ecological Forecasting: https://eco.bigelow.org/
>
>
>
>
>
> [[alternative HTML version deleted]]
>
> _______________________________________________
> R-sig-Geo mailing list
> [hidden email]
> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
> --
Roger Bivand
Department of Economics, Norwegian School of Economics,
Helleveien 30, N-5045 Bergen, Norway.
voice: +47 55 95 93 55; e-mail: [hidden email]
https://orcid.org/0000-0003-2392-6140
https://scholar.google.no/citations?user=AWeghB0AAAAJ&hl=en
_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo
Roger Bivand
Department of Economics
Norwegian School of Economics
Helleveien 30
N-5045 Bergen, Norway

Re: Question spgwr package - running time gwr()

Fri, 05/17/2019 - 07:20
On Fri, 17 May 2019, Raphael Mesaric via R-sig-Geo wrote:

> Dear all,
>
> Is there an option to shorten the running time for the gwr() function,
> similar to the ‚LU‘ method for lagsarlm() in the spdep package? Because
> I have a model with roughly 500’000 observations, and the running time
> at the moment is quite long, respectively it has not yet terminated.

What are you actually doing? Why did you choose GWR? Are you fitting a GWR
with 500K observations, or have you fitted a GWR with many fewer
observations, and are now rendering that fitted model with 500K fit
points? GWR is only for detecting possible non-stationarity or similar
mis-specification in moderately sized data sets. Trying to fit with 500K
gives a dense hat matrix of 500K x 500K, which is impossible (or were it
possible would be uninformative). Think of 5K as a sensible maximum if GWR
is considered sensible at all. I would think that finding a bandwidth is
impossible too.
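
For a sense of scale of that hat matrix:

n <- 5e5
n * n * 8 / 2^40  # ~1.8 TiB for one dense double-precision 500K x 500K matrix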

Roger

>
> Thank you for your help in advance.
>
> Best regards,
>
> Raphael Mesaric
> _______________________________________________
> R-sig-Geo mailing list
> [hidden email]
> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
> --
Roger Bivand
Department of Economics, Norwegian School of Economics,
Helleveien 30, N-5045 Bergen, Norway.
voice: +47 55 95 93 55; e-mail: [hidden email]
https://orcid.org/0000-0003-2392-6140
https://scholar.google.no/citations?user=AWeghB0AAAAJ&hl=en
_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo
Roger Bivand
Department of Economics
Norwegian School of Economics
Helleveien 30
N-5045 Bergen, Norway

Re: Question spdep package - extending nb object

Fri, 05/17/2019 - 07:13
On Thu, 16 May 2019, Raphael Mesaric via R-sig-Geo wrote:

> Dear all,
>
> I am working on a spatial regression model which is built in such a way
> that the dependent variable consists of n blocks of multiple entries,
> and each block is referring to the same spatial grid. In order to run
> the lagsarlm() function or the errsarlm() function, I now need to extend
> my initial nb object from one grid to n grids. In other words, I need
> the nb object to be repeated n times. However, so far I did not find a
> way to do this properly.
>
> I tried to just replicate the nb object by repeating the entries of the
> neighbours list and the weights list, respectively.
>
> dnn800_3w$neighbours <- rep(dnn800_3w$neighbours,20)
> dnn800_3w$weights <- rep(dnn800_3w$weights,20)
>
> When I do this, I get the following error message:
>
> Error in is.symmetric.nb(listw$neighbours, FALSE) : Not neighbours list
>
> Which makes sense, as the matrix is not symmetric any more due to the
> steps described above. However, I cannot think of a correct
> implementation at the moment.
Not even worth trying, as you found. Do you mean a Kronecker product, as
found in spatial panel models? Or do you mean that you have blocks of
observations without position that belong to upper-level objects with
known position (postcodes or similar)? If the latter, look at
?spdep::nb2blocknb.
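
If the second case applies, a minimal sketch of the nb2blocknb() route for
the layout described in the question (m grid cells, 20 stacked blocks in cell
order); the names are hypothetical and ?spdep::nb2blocknb has the exact
requirements, so treat this as an illustration rather than a prescription:

library(spdep)

# nb_cells: nb object for the m grid cells (e.g. from poly2nb() or dnearneigh())
ids <- attr(nb_cells, "region.id")
block_id <- rep(ids, times = 20)  # upper-level cell id for every stacked observation
nb_all <- nb2blocknb(nb_cells, ID = as.character(block_id))
lw_all <- nb2listw(nb_all, style = "W", zero.policy = TRUE)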

Hope this clarifies,

Roger

>
> Can you help me with this issue? Any suggestions are highly appreciated.
>
> Thank you in advance.
>
> Best regards,
>
> Raphael Mesaric
> [[alternative HTML version deleted]]
>
> _______________________________________________
> R-sig-Geo mailing list
> [hidden email]
> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
>
--
Roger Bivand
Department of Economics, Norwegian School of Economics,
Helleveien 30, N-5045 Bergen, Norway.
voice: +47 55 95 93 55; e-mail: [hidden email]
https://orcid.org/0000-0003-2392-6140
https://scholar.google.no/citations?user=AWeghB0AAAAJ&hl=en

_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo
Roger Bivand
Department of Economics
Norwegian School of Economics
Helleveien 30
N-5045 Bergen, Norway

Re: sp

Fri, 05/17/2019 - 07:09
On Thu, 16 May 2019, Francesco Perugini wrote:

> Prof. Bivand, sorry again. I'm trying to update the coding I used. I'm now
> changing, as suggested,
>
> a <- readShapePoly("C:/Users/Francesco pc/Prov2011_WGS84", IDvar="COD_PRO")
>
> into

maptools::readShapePoly() has been deprecated for many years, I should
make it defunct soon (remove shapefile support completely). The IDvar=
argument was used to set the row.names of the sp object returned.

> a <- readOGR("C:/Users/Francesco pc/Prov2011_WGS84", IDvar="COD_PRO")

rgdal::readOGR cannot itself set the row.names and as you would see from
the help page, there is no such argument. So:

row.names(a) <- as.character(a$COD_PRO)

should work. In any case, all should now be moving to sf::st_read(), so:

a <- sf::st_read("C:/Users/Francesco pc/Prov2011_WGS84.shp")

should work. From there on, merge() is better supported.

Roger

> I guess IDvar is not correct. Should that be layer="COD_PRO"? Thanks a lot.
> Francesco
>
>
>
>    On Thursday 16 May 2019, 15:47:36 CEST, Roger Bivand <[hidden email]> wrote:
>
> On Thu, 16 May 2019, Francesco Perugini wrote:
>
>> Thanks a lot for the reply Prof. Bivand. It works fine now. Yes, I have
>> not used R for a few months. I was wondering why this remaining piece of
>> code is not working:
>>
>> neighbors.knn1 <- knn2nb(knearneigh(coord, 1, longlat=F), sym=F)
>> ## Global G
>> dlwknn1.B <- nb2listw(neighbors.knn1, style="B", zero.policy=TRUE)
>> globalG.test(CRIME, dlwknn1.B, zero.policy=F)
>
> Note that HTML always mangles code. You have not shown any errors apart
> from:
>
>>
>> Error: object 'CRIME' not found
>
> which is self-evident, as ls() in your workspace will show. You would
> always have needed columbus$CRIME to access CRIME.
>
> Roger
>
>> Thanks again for your help. Francesco
>>
>>     On Thursday 16 May 2019, 15:07:37 CEST, Roger Bivand <[hidden email]> wrote:
>>
>> Please post in plain text to avoid code mangling. You have not noticed
>> that a lot has been happening. First, data sets from spdep have mostly
>> been moved to spData. Next, spData mostly uses sf to read and format data.
>> Finally you may also see changes as spdep model fitting functions are in
>> spatialreg and will shortly be dropped from spdep. In your case:
>>
>>> library(sp)
>>> library(spdep)
>> Loading required package: spData
>> Loading required package: sf
>> Linking to GEOS 3.7.2, GDAL 3.0.0, PROJ 6.1.0
>>> example(columbus)
>>
>> colmbs> columbus <- st_read(system.file("shapes/columbus.shp",
>> package="spData")[1], quiet=TRUE)
>>
>> colmbs> col.gal.nb <- read.gal(system.file("weights/columbus.gal",
>> package="spData")[1])
>>> coord <- coordinates(columbus)
>> Error in (function (classes, fdef, mtable)  :
>>   unable to find an inherited method for function ‘coordinates’ for
>> signature ‘"sf"’
>>> columbus <- as(columbus, "Spatial")
>>> coord <- coordinates(columbus)
>>
>> Giving the full output, you can see that example(columbus) reads in the
>> data and neighbours from spData, using sf. Consequently, you'd need to
>> coerce columbus to an sp class if you do not want to upgrade your
>> workflow to sf compatability.
>>
>> Hope this clarifies,
>>
>> Roger
>>
>> On Thu, 16 May 2019, Francesco Perugini via R-sig-Geo wrote:
>>
>>> Dear all, I'm not very familiar with R.
>>> I'm trying to use this code I wrote months ago. At that time it was
>>> working, but now it is not.
>>>> library(sp)
>>>> library(spdep)
>>>> example(columbus)
>>>> coord <- coordinates(columbus)
>>>
>>> and I get this message:
>>>
>>> Error in (function (classes, fdef, mtable)  : unable to find an
>>> inherited method for function ‘coordinates’ for signature ‘"sf"’
>>>
>>> I've also tried calling library(sf) but got the same error. I was wondering
>>> why?
>>> Thanks a lot. Francesco
>>>
>>>
>>>
>>>     [[alternative HTML version deleted]]
>>>
>>> _______________________________________________
>>> R-sig-Geo mailing list
>>> [hidden email]
>>> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
>>>
>>
>>
>
> --
Roger Bivand
Department of Economics, Norwegian School of Economics,
Helleveien 30, N-5045 Bergen, Norway.
voice: +47 55 95 93 55; e-mail: [hidden email]
https://orcid.org/0000-0003-2392-6140
https://scholar.google.no/citations?user=AWeghB0AAAAJ&hl=en
_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo
Roger Bivand
Department of Economics
Norwegian School of Economics
Helleveien 30
N-5045 Bergen, Norway

Use raster::getData to access long term bioclimatic RFE CHIRPS ERA5?

Fri, 05/17/2019 - 04:03

Dear List,
I am a long-term user of raster::getData to easily retrieve historical time series from WorldClim. Is there a plan (or are there separate R packages) to extend this feature to other bioclimatic datasets, in particular:

- USGS RFE v2.0 https://edcintl.cr.usgs.gov/downloads/sciweb1/shared/fews/web/africa/daily/rfe/downloads/daily/
- CHIRPS v2 ftp://chg-ftpout.geog.ucsb.edu/pub/org/chg/products/CHIRPS-2.0/africa_daily/tifs/p05/
- eModis NDVI https://edcintl.cr.usgs.gov/downloads/sciweb1/shared/fews/web/africa/west/dekadal/emodis/ndvi_c6/temporallysmoothedndvi/downloads/dekadal/
- SSEBop ETa https://edcintl.cr.usgs.gov/downloads/sciweb1/shared/fews/web/africa/dekadal/eta/downloads/
- etc..

Thanks for any pointer, --Mel.

-- Melanie BACOU
tel: +1 (202) 492-7978
[hidden email]
skype: mbacou
https://linkedin.com/in/mbacou
_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo
