Subscribe to R-sig-geo feed
This is an archive for R-sig-geo, not a forum. It is not possible to post through Nabble - you may not start a new thread nor follow up an existing thread. If you wish to post, but are not subscribed to the list through the list homepage, subscribe first (through the list homepage, not via Nabble) and post from your subscribed email address. Until 2015-06-20, subscribers could post through Nabble, but policy has been changed as too many non-subscribers misunderstood the interface.
Updated: 1 hour 13 min ago

Re: Sf find number of parts in multigeometry

14 hours 24 min ago
I see perfectly good answers to this, but can't resist sharing my own
approach.

I use gibble::gibble() for a summary of parts. See how the multi-part
object is number 4 and has 3 subobjects (subobject repeats within an
object for holes).

library(sf)
x <- read_sf(system.file("gpkg/nc.gpkg", package = "sf"))
gibble::gibble(x)
#    A tibble: 108 x 5
#     nrow  ncol type         subobject object
#    <int> <int> <chr>            <int>  <int>
#1    27     2 MULTIPOLYGON         1      1
#2    26     2 MULTIPOLYGON         1      2
#3    28     2 MULTIPOLYGON         1      3
#4    26     2 MULTIPOLYGON         1      4
#5     7     2 MULTIPOLYGON         2      4
#6     5     2 MULTIPOLYGON         3      4
#7    34     2 MULTIPOLYGON         1      5
#...

(I use that for mapping out set-based operations on geometry data; it
doesn't make a huge amount of sense on its own. I suppose an rapply scheme
could be constructed to pluck things out, but you'd also want path extents
and sizes and so forth for greater control.)

But, if the biggest area of each multipolygon is what you want, give each a
unique ID, use st_cast to POLYGON, group by parent, and slice out the
largest.

library(dplyr)

x %>% mutate(ID = row_number()) %>% st_cast("POLYGON") %>%
group_by(ID) %>% arrange(desc(st_area(.))) %>% slice(1) %>% ungroup()

Note that holes within a part might complicate the area logic, as will
details of the map projection in use and so on. It's helpful to learn the
structure of the underlying geometry lists to craft your own rogue
solutions.

HTH


On Tue, Oct 23, 2018, 13:32 Martin Tomko <[hidden email]> wrote:

> I am looking for an equivalent to Postgis ST_NumGeometries
> https://postgis.net/docs/ST_NumGeometries.html
> I have multipolygons in an sf df, where most of them are likely to be
> single part. I want to identify those that are not single part, and
> possibly retain only the largest polygon part, and cast all into Polygon.
> Any advice is appreciated, thanks!
>
> Martin
>
>
>         [[alternative HTML version deleted]]
>
> _______________________________________________
> R-sig-Geo mailing list
> [hidden email]
> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
> --
Dr. Michael Sumner
Software and Database Engineer
Australian Antarctic Division
203 Channel Highway
Kingston Tasmania 7050 Australia

        [[alternative HTML version deleted]]

_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

Re: Sf find number of parts in multigeometry

16 hours 2 min ago
As an addition to obrl soil's answer, I have come to like

lengths(st_geometry(x)) - mind the s!
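For example, a toy sketch (made-up coordinates) showing what lengths() counts:

```r
library(sf)
ring1 <- rbind(c(0,0), c(1,0), c(1,1), c(0,0))
ring2 <- rbind(c(2,2), c(3,2), c(3,3), c(2,2))
mp1 <- st_multipolygon(list(list(ring1)))               # one part
mp2 <- st_multipolygon(list(list(ring1), list(ring2)))  # two parts
g <- st_sfc(mp1, mp2)
lengths(g)  # parts per feature: 1 2
```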

Best
Tim


On 10/23/2018 06:39 AM, obrl soil wrote:
> Hi Martin,
>
> for a multipolygon geometry column, you're working with a list of
> lists of lists of matrices, so you can iterate over the geometry
> column like
>
>      sapply(st_geometry(x), length)
>
> (or purrr::map_int(), perhaps) to report number of polygon components.
> To subset the first polygon component and drop the rest,
>
>      st_geometry(x) <- st_sfc(lapply(st_geometry(x), function(y)
> st_polygon(y[[1]])), crs = 4326) # or whatever crs you're using
>
> I would first run st_buffer(x, dist = 0L) and/or
> lwgeom::st_make_valid() to ensure the geometries are properly
> structured, although this can't solve every problem. Note that AFAIK
> there's no inherent sorting to multipolygon components, although you
> could probably reorder them by area or number of vertices before
> subsetting if you were really keen.
>
> cheers
> L
> On Tue, Oct 23, 2018 at 12:32 PM Martin Tomko <[hidden email]> wrote:
>> I am looking for an equivalent to Postgis ST_NumGeometries https://postgis.net/docs/ST_NumGeometries.html
>> I have multipolygons in an sf df, where most of them are likely to be single part. I want to identify those that are not single part, and possibly retain only the largest polygon part, and cast all into Polygon.
>> Any advice is appreciated, thanks!
>>
>> Martin
>>
>>
>>          [[alternative HTML version deleted]]
>>
>> _______________________________________________
>> R-sig-Geo mailing list
>> [hidden email]
>> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
> _______________________________________________
> R-sig-Geo mailing list
> [hidden email]
> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

Re: Sf find number of parts in multigeometry

Mon, 10/22/2018 - 23:39
Hi Martin,

for a multipolygon geometry column, you're working with a list of
lists of lists of matrices, so you can iterate over the geometry
column like

    sapply(st_geometry(x), length)

(or purrr::map_int(), perhaps) to report number of polygon components.
To subset the first polygon component and drop the rest,

    st_geometry(x) <- st_sfc(lapply(st_geometry(x), function(y)
st_polygon(y[[1]])), crs = 4326) # or whatever crs you're using

I would first run st_buffer(x, dist = 0L) and/or
lwgeom::st_make_valid() to ensure the geometries are properly
structured, although this can't solve every problem. Note that AFAIK
there's no inherent sorting to multipolygon components, although you
could probably reorder them by area or number of vertices before
subsetting if you were really keen.
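The reorder-by-area idea could be sketched like this (an untested sketch, assuming x is an sf object whose geometry column is all MULTIPOLYGON; each part is cast to POLYGON to measure its area):

```r
library(sf)

# keep only the largest part of each multipolygon feature
largest_part <- function(mp) {
  areas <- vapply(mp, function(p) st_area(st_polygon(p)), numeric(1))
  st_polygon(mp[[which.max(areas)]])
}
st_geometry(x) <- st_sfc(lapply(st_geometry(x), largest_part),
                         crs = st_crs(x))
```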

cheers
L
On Tue, Oct 23, 2018 at 12:32 PM Martin Tomko <[hidden email]> wrote:
>
> I am looking for an equivalent to Postgis ST_NumGeometries https://postgis.net/docs/ST_NumGeometries.html
> I have multipolygons in an sf df, where most of them are likely to be single part. I want to identify those that are not single part, and possibly retain only the largest polygon part, and cast all into Polygon.
> Any advice is appreciated, thanks!
>
> Martin
>
>
>         [[alternative HTML version deleted]]
>
> _______________________________________________
> R-sig-Geo mailing list
> [hidden email]
> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

Sf find number of parts in multigeometry

Mon, 10/22/2018 - 21:32
I am looking for an equivalent to Postgis ST_NumGeometries https://postgis.net/docs/ST_NumGeometries.html
I have multipolygons in an sf df, where most of them are likely to be single part. I want to identify those that are not single part, and possibly retain only the largest polygon part, and cast all into Polygon.
Any advice is appreciated, thanks!

Martin


        [[alternative HTML version deleted]]

_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

logLik value of the spml spatial panel model

Mon, 10/22/2018 - 10:37
Dear List:

In a recent attempt to compare spatial panel models using splm package,
I intend to use the logLik value that was produced by the spml()
routine. When I ran the model, I realize that the logLik value is not
always produced, especially when the spatial.error is set to "b", a NULL
value is returned. I know I must have missed something here, but can
anyone point out why the logLik value is not returned when the
spatial.error value is set to "b" (a spatial panel lag model always
return a logLik value). I tried a bit to dig into the codes, but didn't
find much that can answer my question.

Thank you.

Best,

Danlin

--
___________________________________________
Danlin Yu, Ph.D.
Professor of GIS and Urban Geography
Department of Earth & Environmental Studies
Montclair State University
Montclair, NJ, 07043
Tel: 973-655-4313
Fax: 973-655-4072
email: [hidden email]
webpage: csam.montclair.edu/~yu

_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

Re: spgm

Mon, 10/22/2018 - 09:31
Thanks for the help, I appreciate it :). It works quite well.

One last question: I tried some diagnostics on the spgm models, but they always returned an error. Basically, what I would like is a test (e.g. an LM test or an information criterion) that would help me select the proper weight matrix for the model.


E.g.:

library(splm)
data(Produc, package="plm")
data(usaww)
fm <- log(gsp)~log(pcap)+log(pc)+log(emp)+unemp
matrix1 <- kronecker(diag(length(unique(Produc$year))), usaww)
listw1 <- mat2listw(matrix1, style="W")
tr <- trW(as(listw1, "CsparseMatrix"), m=100)
GM <- spgm(fm, data=Produc, listw = usaww, moments="within", spatial.error = FALSE, lag = TRUE)
GMi <- spgm(update(fm, . ~ . - unemp), data=Produc, listw = usaww, moments="within", spatial.error = FALSE, lag = TRUE, endog = ~ unemp,
            instruments = ~ hwy + water + util)
slmtest(GM)
Error in UseMethod("slmtest") :
  no applicable method for 'slmtest' applied to an object of class "stsls"


Is it possible to maybe compute the Pseudo-R2?

Best,

Martin

-----Original Message-----
From: Roger Bivand [mailto:[hidden email]]
Sent: Thursday, October 11, 2018 10:00 AM
To: Hulényi Martin <[hidden email]>
Cc: [hidden email]; Giovanni Millo <[hidden email]>
Subject: Re: [R-sig-Geo] spgm

On Wed, 3 Oct 2018, Roger Bivand wrote:

> Please provide a reproducible example using the Produc dataset, so
> that it is clearer what the problem is. I do not know that there is
> any literature on such impacts, please provide references.

My cut at an example:

library(splm)
data(Produc, package="plm")
data(usaww)
matrix1 <- kronecker(diag(length(unique(Produc$year))), usaww)
listw1 <- mat2listw(matrix1, style="W")
tr <- trW(as(listw1, "CsparseMatrix"), m=100)
fm <- log(gsp) ~ log(pcap) + log(pc) + log(emp) + unemp # formula as defined earlier in the thread
GM <- spgm(fm, data=Produc, listw = usaww, moments="within", spatial.error = FALSE, lag = TRUE)
impacts(GM, listw=listw1)
impacts(GM, tr=tr)
GMi <- spgm(update(fm, . ~ . - unemp), data=Produc, listw = usaww, moments="within",
            spatial.error = FALSE, lag = TRUE, endog = ~ unemp,
            instruments = ~ hwy + water + util)
summary(GMi)
GMii <- spgm(update(fm, . ~ . - unemp - log(emp)), data=Produc, listw = usaww, moments="within", spatial.error = FALSE, lag = TRUE, endog = ~ unemp + log(emp), instruments = ~ hwy + water + util)
summary(GMii)

The summary objects show the coefficients, etc. for endogenous variables first, before the spatial coefficient. spdep::impacts.stsls() expects the spatial coefficient listed before the variables (dropping the intercept).
If you use:

put_endog_last <- function(stsls) {
   if (is.null(stsls$endog)) stop("no endogenous variables in fitted model")
   n_endog <- length(all.vars(stsls$endog))
   coefs <- stsls$coefficients
   n_coefs <- length(coefs)
   flip <- c((n_endog+1):n_coefs, 1:n_endog)
   stsls$coefficients <- coefs[flip]
   stsls$var <- stsls$var[flip, flip]
   stsls
}

to flip the orderings in the splm::spgm() output object, impacts can be calculated using the spdep::impacts.stsls() method. I've added the splm maintainer to CC - I'm unsure whether other functions in splm (or other argument settings when lag = TRUE) would be affected by patching
splm::spgm() itself.
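A hedged usage sketch, assuming GMi and tr as constructed in the example above:

```r
GMi_flipped <- put_endog_last(GMi)
impacts(GMi_flipped, tr = tr)  # spdep::impacts.stsls() now sees the expected ordering
```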

Hope this helps,

Roger


>
> Roger
>
> Roger Bivand
> Norwegian School of Economics
> Bergen, Norway
>
>
>
>
> On Wed, Oct 3, 2018 at 11:21 AM +0200, "Hulényi Martin" <[hidden email]<mailto:[hidden email]>> wrote:
>
>
> Thank you very much !
> I have one more question regarding the output. I have also one
> endogenous variable in the model. Your code worked, but it did not show me the indirect and direct effects for the endogenous variable. Here is my reprex:
> spd_01 <- spgm(gdppcgr~lefpayr+lpopgr+linvr+lagwgipca + laglgdppc,
>             data=esifpdata, listw=dm1.lw,
>             model="within", lag=TRUE, spatial.error= FALSE, endog = ~ lefpayr,
>             instruments=~areaprop,
>             method="w2sls")
> matrix1 <- kronecker(diag(length(unique(esifpdata$years))), dm1)
> listw1 <- mat2listw(matrix1, style="W")
> tr <- trW(as(listw1, "CsparseMatrix"), m=100)
> impacts(spd_01, listw=listw1)
> impacts(spd_01, tr=tr)
> summary(impacts(spd_01, tr=tr, R=1000), zstats=TRUE, short=TRUE)
>
> Best,
>
> Martin Hulényi
>
>
> ________________________________________
> From: Roger Bivand
> Sent: 29 September 2018 14:52
> To: Hulényi Martin
> Cc: [hidden email]
> Subject: Re: [R-sig-Geo] spgm
>
> On Sat, 29 Sep 2018, Hulényi Martin wrote:
>
>> Dear all,
>>
>>
>> I would like to ask if there is a possibility to apply something
>> similiar to the "impacts" from spdep package for SAR regressions
>> using the spgm function from the splm package.
>>
>
> A reprex would have helped. Here is mine:
>
> data(Produc, package = "plm")
> data(usaww) # dense row-standardised weights matrix
> GM <- spgm(log(gsp)~log(pcap)+log(pc)+log(emp)+unemp, data=Produc,
>   listw = usaww, moments="fullweights", lag=TRUE, spatial.error = FALSE)
> class(GM)
> ?impacts.stsls # spdep method for stsls objects
> head(Produc)
> length(unique(Produc$year)) # T
> big <- kronecker(diag(length(unique(Produc$year))), usaww)
> listw <- mat2listw(big, style="W")
> tr <- trW(as(listw, "CsparseMatrix"), m=100)
> impacts(GM, listw=listw)
> impacts(GM, tr=tr)
> summary(impacts(GM, tr=tr, R=1000), zstats=TRUE, short=TRUE)
>
> The splm:::impacts.splm() method cannot dispatch on stsls objects, so
> they try to use the spdep:::impacts.stsls() method, but there the data
> rows are n x T but listw is only of n rows. Looking inside
> splm:::impacts.splm(), you see that a sparse kronecker product matrix
> is constructed - either do the same if your n x T is large, or follow
> the above using a dense kronecker product and cast back to listw
> representation to create the trace vector.
>
> Hope this clarifies,
>
> Roger
>
>>
>> Best regards,
>>
>>
>> Martin Hulényi
>>
>>
>> Please consider the environment before printing this e-mail. Thanks
>>
>>       [[alternative HTML version deleted]]
>>
>>
>
> --
> Roger Bivand
> Department of Economics, Norwegian School of Economics, Helleveien 30,
> N-5045 Bergen, Norway.
> voice: +47 55 95 93 55; e-mail: [hidden email]
> http://orcid.org/0000-0003-2392-6140
> https://scholar.google.no/citations?user=AWeghB0AAAAJ&hl=en
>
>
> [[alternative HTML version deleted]]
>
> _______________________________________________
> R-sig-Geo mailing list
> [hidden email]
> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
--
Roger Bivand
Department of Economics, Norwegian School of Economics, Helleveien 30, N-5045 Bergen, Norway.
voice: +47 55 95 93 55; e-mail: [hidden email]
http://orcid.org/0000-0003-2392-6140
https://scholar.google.no/citations?user=AWeghB0AAAAJ&hl=en
_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

Re: Unexpected configure error following recent rgeos release?

Mon, 10/22/2018 - 02:29
Dear Carl,

Thanks for reporting this issue. Note that I did post early in September
on https://stat.ethz.ch/pipermail/r-sig-geo/2018-September/026842.html
about the possibly breaking changes in this release (not breaking
configure, but changing exception handling to try to free memory properly
on exception, topology exceptions being common).

I think that the underlying issue is as seen in:

https://r-forge.r-project.org/scm/viewvc.php/pkg/configure.ac?root=rgeos&r1=553&r2=574

that is, one change in mid-August caused by chaos in the GEOS release
candidates (the version string started including the RC? tag -

https://trac.osgeo.org/geos/ticket/917

) which also led to the second change to rgeos/configure.ac: using
geos-config --cclibs instead of --libs (to avoid the broken version
numbers). There was no problem on CRAN's Debian (testing) on submission. I
assume that this is a GEOS versioning issue, so I've committed a patch to
R-Forge (r580) to continue to use geos-config --libs for GEOS < 3.7.0, and
only use --cclibs for GEOS >= 3.7.0. Either wait for R-Forge to generate a new
tarball (tomorrow Monday for you if in California, watching
https://r-forge.r-project.org/R/?group_id=602 for rev: 580, then
install.packages("rgeos", repos="http://R-Forge.R-project.org")), or
download configure, configure.ac and insert into the 0.4-1 tarball if
you can test this immediately. I don't have an installed GEOS 3.5.1 (the
latest release for stretch, I think), so I can't check the output of
geos-config --cclibs.

sf has slightly different configure.ac code for setting up GEOS shared
objects, and uses --clibs and --static-clibs, not --libs or --cclibs.

The underlying problem was GEOS releasing RC bundles with broken
versioning strings that added RC? to version strings and shared object
version strings. Released GEOS seems to be OK, but checking against betas
and RCs is normal practice to secure forward compatibility.

Hope this helps,

Roger

On Mon, 22 Oct 2018, Carl Boettiger wrote:

> Dear Roger, R-Sig-Geo list,
>
> Since the release of rgeos 0.4.1 on Friday, I am now unable to install
> the package on debian:stretch where previously the package has always
> installed fine with the standard libgeos-dev libraries.  Installation
> now throws the following error:
>
>    configure: error: initGEOS_r not found in libgeos_c
>
> A more complete trace from the installation is pasted below.  I'm
> building with R 3.5.1; the previous version of rgeos still installs
> fine on the same system.  Thanks for your help.
>
> Sincerely,
>
> Carl
>
> package ‘rgeos’ successfully unpacked and MD5 sums checked
> configure: CC: gcc
> configure: CXX: g++
> configure: rgeos: 0.4-1
> checking for /usr/bin/svnversion... no
> configure: svn revision: 579
> checking for geos-config... /usr/bin/geos-config
> checking geos-config usability... yes
> configure: GEOS version: 3.5.1
> checking geos version at least 3.2.0... yes
> checking geos-config clibs... yes
> checking geos_c.h presence and usability... yes
> checking geos: linking with libgeos_c... no
> /usr/bin/ld: cannot find -lgeos
> collect2: error: ld returned 1 exit status
> configure: Install failure: compilation and/or linkage problems.
> configure: error: initGEOS_r not found in libgeos_c.
>
>
> ---
> Carl Boettiger
> http://carlboettiger.info/
> --
Roger Bivand
Department of Economics, Norwegian School of Economics,
Helleveien 30, N-5045 Bergen, Norway.
voice: +47 55 95 93 55; e-mail: [hidden email]
http://orcid.org/0000-0003-2392-6140
https://scholar.google.no/citations?user=AWeghB0AAAAJ&hl=en
_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo
Roger Bivand
Department of Economics
Norwegian School of Economics
Helleveien 30
N-5045 Bergen, Norway

Re: Problems with gstat gaussian variogram and cross validation when fix nugget at 0

Tue, 10/16/2018 - 05:06
Stefano,

Fixing the nugget at zero for the Gaussian variogram can often cause numerical problems in kriging. If you have two observations that are close in space, their modelled semivariance will be so small that you get numerical issues with the covariance matrix. In some cases it will be singular; in your case it could be inverted, but the absolute values of the weights will be very high for some of the cross-validation locations. It seems you have two stations that are very close to each other; if you remove one of these, you can see how the results become more similar (still not identical) to the model with a small nugget effect.
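A toy illustration of the conditioning point (made-up distances, using the sill and range from the model above rather than the actual data):

```r
# with a pure Gaussian model and no nugget, two nearly coincident stations
# produce almost identical covariance rows, so the kriging system is
# ill-conditioned; a small nugget on the diagonal stabilises it
gau_cov <- function(h, sill = 216, range = 3640) sill * exp(-(h / range)^2)
d <- abs(outer(c(0, 1, 5000), c(0, 1, 5000), "-"))  # two stations 1 m apart
kappa(gau_cov(d))                   # very large condition number
kappa(gau_cov(d) + diag(1e-3, 3))   # a tiny nugget improves it markedly
```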

Best,
Jon

--
Jon Olav Skøien

European Commission
Joint Research Centre – JRC.E.1
Disaster Risk Management Unit
Building 26b 1/144 | Via E.Fermi 2749, I-21027 Ispra (VA) Italy, TP 267
[hidden email]
Tel:  +39 0332 789205
Disclaimer: Views expressed in this email are those of the individual and do not necessarily represent official views of the European Commission.


________________________________________
From: R-sig-Geo [[hidden email]] on behalf of Stefano Menichetti [[hidden email]]
Sent: 12 October 2018 16:36
To: [hidden email]
Subject: [R-sig-Geo] Problems with gstat gaussian variogram and cross validation when fix nugget at 0

Dear all, I have a question about fixing nugget at 0 in a Gaussian
variogram using *gstat *package

I have encountered a serious problem on a variogram of piezometric data
that give "strange" results if I want to fix nugget to 0.

This is the file and how to import:

library(gstat)
# file: https://drive.google.com/open?id=1AOilEhR8G8YANVuFIBUcmEg-6UG1Ajrr

lay1 = read.delim(paste(path, file, sep=""), sep="\t", dec=",")
SpData = na.exclude(subset(lay1, select = c("X","Y","Observed")))
colnames(SpData) = c("X","Y","Z")

The initial variogram:

coordinates(SpData) = ~X+Y
var = variogram(Z~1, SpData)

shows a slow initial rise, suggesting a Gaussian model.

After a first variogram with:

m = vgm(200, "Gau", 3000, 0)

plot(var, model=m,
     main = paste(m$model[2],
                  " Nug=", m$psill[1], ", Sill=", round(m$psill[2]),
                  ", Range=", round(m$range[2]), sep="")
)

I obtained a fitted variogram like this:

m = fit.variogram(var, m)

with reasonable cross-validation results:

krg_cv = krige.cv(Z~1, SpData, model = m)
plot(krg_cv$observed, krg_cv$var1.pred, xlab="Z observed", ylab="Z predicted")
abline(a=0, b=1)
paste("media residui =", mean(krg_cv$residual))
paste("RMS residui =", mean(krg_cv$residual^2)^0.5)

The resulting map is very smooth despite the seemingly small nugget, so I
tried to fix the nugget at 0 (exact interpolation).

But the cross-validation results with

m = vgm(216, "Gau", 3640, 0)

are dramatically wrong, and also very different from the cv results with

m = vgm(216, "Gau", 3640, 1e-3)

What is happening? Am I doing something wrong? Why does setting the nugget
parameter to 0 rather than a small value affect the results so strongly?

Thank you very much in advance,


Stefano


Dott. Geol. Stefano Menichetti
=============================================================
ARPAT - Agenzia per la Protezione dell'Ambiente della Toscana
Settore Tecnico SIRA - Sistema Informativo Regionale Ambientale via Nicola Porpora 22 - 50144 Firenze
tel. 0553206333 cell. 3668217978 (3383550147) fax 0553206410
[hidden email]  http://www.arpat.toscana.it  http://sira.arpat.toscana.it


        [[alternative HTML version deleted]]

_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

Problems with gstat gaussian variogram and cross validation when fix nugget at 0

Fri, 10/12/2018 - 09:36
Dear all, I have a question about fixing nugget at 0 in a Gaussian
variogram using *gstat *package

I have encountered a serious problem on a variogram of piezometric data
that give "strange" results if I want to fix nugget to 0.

This is the file and how to import:

library(gstat)
# file: https://drive.google.com/open?id=1AOilEhR8G8YANVuFIBUcmEg-6UG1Ajrr

lay1 = read.delim(paste(path, file, sep=""), sep="\t", dec=",")
SpData = na.exclude(subset(lay1, select = c("X","Y","Observed")))
colnames(SpData) = c("X","Y","Z")

The initial variogram:

coordinates(SpData) = ~X+Y
var = variogram(Z~1, SpData)

shows a slow initial rise, suggesting a Gaussian model.

After a first variogram with:

m = vgm(200, "Gau", 3000, 0)

plot(var, model=m,
     main = paste(m$model[2],
                  " Nug=", m$psill[1], ", Sill=", round(m$psill[2]),
                  ", Range=", round(m$range[2]), sep="")
)

I obtained a fitted variogram like this:

m = fit.variogram(var, m)

with reasonable cross-validation results:

krg_cv = krige.cv(Z~1, SpData, model = m)
plot(krg_cv$observed, krg_cv$var1.pred, xlab="Z observed", ylab="Z predicted")
abline(a=0, b=1)
paste("media residui =", mean(krg_cv$residual))
paste("RMS residui =", mean(krg_cv$residual^2)^0.5)

The resulting map is very smooth despite the seemingly small nugget, so I
tried to fix the nugget at 0 (exact interpolation).

But the cross-validation results with

m = vgm(216, "Gau", 3640, 0)

are dramatically wrong, and also very different from the cv results with

m = vgm(216, "Gau", 3640, 1e-3)

What is happening? Am I doing something wrong? Why does setting the nugget
parameter to 0 rather than a small value affect the results so strongly?

Thank you very much in advance,


Stefano


Dott. Geol. Stefano Menichetti
=============================================================
ARPAT - Agenzia per la Protezione dell'Ambiente della Toscana
Settore Tecnico SIRA - Sistema Informativo Regionale Ambientale via Nicola Porpora 22 - 50144 Firenze
tel. 0553206333 cell. 3668217978 (3383550147) fax 0553206410
[hidden email]  http://www.arpat.toscana.it  http://sira.arpat.toscana.it


        [[alternative HTML version deleted]]


_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

addImageQuery() result stops working in shiny app

Thu, 10/11/2018 - 20:49
Hi all

I've put together a shiny app whose ultimate use would be to share
MODIS-Aqua based estimates of chlorophyll-a concentration in RasterLayer
form.

I would like the user to be able to select from a list of layers that are
in a stack and once that layer is displayed I'd like them to be able to see
the RasterLayer value wherever the cursor is located.

I have no problem making the shiny app do this consistently for a single
RasterLayer.  However, once the user switches to a RasterLayer other than
the default that is shown when you open the app, the value at the cursor is
no longer displayed.

Does anyone know if there is a way to change this behavior by changing my
code?  I am guessing that, given where I am on the learning curve, my code
is the issue.  Somehow I think I have failed to correctly relay the
change to addImageQuery().

The code below should be a reproducible example.  It may not be an ideal
example, but it is a self-contained example that will only require that the
user has the required packages.  And it is the best I know how to do.

Thanks in advance to anyone who reads this!

#The example....
library(raster)
library(leaflet)
library(shiny)
library(mapview)
# Create raster data
# Each raster represents the average of multiple rasters
# during a weekly period.
# In this example, there are five weeks represented
# create an extent object
myext <- extent(707900, 980000,540000,1100000)
mycrs <- paste("+proj=aea +lat_1=42.122774 +lat_2=49.01518 +lat_0=45.568977",
               "+lon_0=-84.455955 +x_0=1000000 +y_0=1000000 +ellps=GRS80",
               "+towgs84=0,0,0,0,0,0,0 +units=m +no_defs")

r1 <- raster(ncol=50, nrow=75, ext=myext, crs=mycrs)
values(r1) <-rnorm(3750, 0, 2)
r2 <- raster(ncol=50, nrow=75, ext=myext, crs=mycrs)
values(r2) <-rnorm(3750, 0, 2)
r3 <- raster(ncol=50, nrow=75, ext=myext, crs=mycrs)
values(r3) <-rnorm(3750, 0, 2)
r4 <- raster(ncol=50, nrow=75, ext=myext, crs=mycrs)
values(r4) <-rnorm(3750, 0, 2)
r5 <- raster(ncol=50, nrow=75, ext=myext, crs=mycrs)
values(r5) <-rnorm(3750, 0, 2)

# create list of rasters that the user can choose from in the shiny app
myras <- list(r1, r2, r3, r4, r5)
modis.rasters <- stack(myras)

# set up color display
# this sets up the color palette: the reversed Spectral palette with 10 levels
my.max <- 10
x <- -10:my.max # this is the observed range for chlorophyll in the data
names(modis.rasters) <- c("Week of 2016-04-01", "Week of 2016-04-08",
                          "Week of 2016-04-15", "Week of 2016-04-22",
                          "Week of 2016-04-29")
pal1 <- colorNumeric(palette = c("#5E4FA2", "#3288BD", "#66C2A5", "#ABDDA4",
                                 "#E6F598", "#FEE08B", "#FDAE61", "#F46D43",
                                 "#D53E4F", "#9E0142"),
                     domain = x, na.color = "transparent")

# Create a map for use in shiny app
map <- leaflet() %>% addTiles() %>%
  setView(lng = -86.0589, lat = 43, zoom =7) %>%
  addLegend(pal=pal1, values = values(modis.rasters),
            title ='Random normal variate (mean=0, SD=2)',
position="bottomleft",
            opacity=1)%>%
  addMouseCoordinates(style = "basic")


# Now set up the UI
ui <- shinyUI(fluidPage(

  titlePanel("Stuff"),

  # Generate a row with a sidebar
  sidebarLayout(

    # Define the sidebar with one input
    # Here "period" is a weekly time period/raster
    sidebarPanel(
      selectInput("period", "Choose a time period:",
                  choices=names(modis.rasters)),
      hr(),
      helpText("Some raster data that I will replace.",
               br(),
               width=8)
    ),
    # Create a spot for the map
    mainPanel(leafletOutput('raster_map', width=800,height=900))
  )
)
)

# Define a server for the Shiny app
server <- shinyServer(function(input, output){

  # Fill in the spot we created for a map
  output$raster_map = renderLeaflet({
    map %>%
      addRasterImage(reactiveRaster(), colors=pal1, layerId =input$period,
                     opacity=0.5)%>%
      addImageQuery(reactiveRaster(), type="mousemove", digits=2,
                    position="topright", layerId=input$period)
  })

  reactiveRaster <- reactive({modis.rasters[[input$period]]})
  # add the selected raster to the map
  observe({
    leafletProxy("raster_map") %>%
      clearImages() %>%
      addRasterImage(reactiveRaster(), colors=pal1, layerId =input$period,
                     opacity=0.5)
  })
})
shinyApp(ui = ui, server = server)



--
David Warner
Research Fisheries Biologist
U.S.G.S. Great Lakes Science Center
1451 Green Road
Ann Arbor, MI 48105
734-214-9392
orcid.org/0000-0003-4939-5368

        [[alternative HTML version deleted]]

_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

Re: Clustering Sd Error in Spatial Models.

Thu, 10/11/2018 - 03:10
On Tue, 9 Oct 2018, Amir Neto wrote:

> Hello,
>
> I am currently working on a project estimating a spatial panel model.
> Because I also estimate non-spatial models I am computing the clustered
> standard errors following Stock and Watson (2008).
>
> I tried to do the same for my spatial models however I am running into the
> some errors (depending if I bootstrap or not my clustered
> variance-covariance matrix). Below is a reproducible example.

Although it would be desirable for spatial models to fit the
lmtest/sandwich framework better, are Conley standard errors an
alternative? See these interesting blog posts correcting an implementation:

https://darinchristensen.com/post/2017-08-21-correction/
https://darinchristensen.com/post/2015-08-30-conley/

https://github.com/darinchristensen/conley-se

Hope this helps,

Roger

>
>
> #### Example
>> data(Produc, package = "plm")
>> data(usaww)
>>
>> fm <- log(gsp) ~ log(pcap) + log(pc) + log(emp) + unemp
>>
>> fespaterr <- spml(fm, data = Produc, listw = mat2listw(usaww),
> +                   model="within", spatial.error="kkp")
>>
>> library(lmtest)
>> library(sandwich)
>>
>> vcov_test <- vcovCL(fespaterr, cluster = Produc$state)
> Error in UseMethod("estfun") :
>  no applicable method for 'estfun' applied to an object of class "splm"
>
>> vcov_boot <- vcovBS(fespaterr, cluster = Produc$state, R=250)
> Error in terms.default(x) : no terms component nor attribute
> Error in terms.default(x) : no terms component nor attribute
> Error in terms.default(x) : no terms component nor attribute
>
> #############
> ### Non spatial
> library(lfe)
>
> fe_cluster <- felm(log(gsp) ~ log(pcap) + log(pc) + log(emp) +
> unemp|year+state|0|state,data=Produc)
>
>> summary(fe_cluster)
>
> Call:
>   felm(formula = log(gsp) ~ log(pcap) + log(pc) + log(emp) + unemp |
> year + state | 0 | state, data = Produc)
>
> Residuals:
>      Min        1Q    Median        3Q       Max
> -0.160369 -0.018026 -0.000859  0.016745  0.170752
>
> Coefficients:
>           Estimate Cluster s.e. t value Pr(>|t|)
> log(pcap) -0.030176     0.060042  -0.503   0.6154
> log(pc)    0.168828     0.088331   1.911   0.0563 .
> log(emp)   0.769306     0.087700   8.772   <2e-16 ***
> unemp     -0.004221     0.003294  -1.281   0.2005
> ---
> Signif. codes:  0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1
> ##########
>
> My questions are:
> 1) Is it possible to cluster the variance-covariance matrix on spatial
> models?
> 2) If so, what is the correct procedure?
>
>
> Thank you for your help,
> Amir
> --
> Amir B Ferreira Neto
>
> --
Roger Bivand
Department of Economics, Norwegian School of Economics,
Helleveien 30, N-5045 Bergen, Norway.
voice: +47 55 95 93 55; e-mail: [hidden email]
http://orcid.org/0000-0003-2392-6140
https://scholar.google.no/citations?user=AWeghB0AAAAJ&hl=en

Re: spgm

Thu, 10/11/2018 - 02:59
On Wed, 3 Oct 2018, Roger Bivand wrote:

> Please provide a reproducible example using the Produc dataset, so that
> it is clearer what the problem is. I do not know that there is any
> literature on such impacts, please provide references.

My cut at an example:

library(splm)
data(Produc, package="plm")
data(usaww)
fm <- log(gsp) ~ log(pcap) + log(pc) + log(emp) + unemp
matrix1 <- kronecker(diag(length(unique(Produc$year))), usaww)
listw1 <- mat2listw(matrix1, style="W")
tr <- trW(as(listw1, "CsparseMatrix"), m=100)
GM <- spgm(fm, data=Produc, listw = usaww, moments="within",
           spatial.error = FALSE, lag = TRUE)
impacts(GM, listw=listw1)
impacts(GM, tr=tr)
GMi <- spgm(update(fm, . ~ . - unemp), data=Produc, listw = usaww,
moments="within", spatial.error = FALSE, lag = TRUE, endog = ~ unemp,
instruments = ~ hwy + water + util)
summary(GMi)
GMii <- spgm(update(fm, . ~ . - unemp - log(emp)), data=Produc, listw =
usaww, moments="within", spatial.error = FALSE, lag = TRUE, endog = ~
unemp + log(emp), instruments = ~ hwy + water + util)
summary(GMii)

The summary objects show the coefficients, etc. for endogenous variables
first, before the spatial coefficient. spdep::impacts.stsls() expects the
spatial coefficient listed before the variables (dropping the intercept).
If you use:

put_endog_last <- function(stsls) {
   if (is.null(stsls$endog)) stop("no endogenous variables in fitted model")
   n_endog <- length(all.vars(stsls$endog))
   coefs <- stsls$coefficients
   n_coefs <- length(coefs)
   flip <- c((n_endog+1):n_coefs, 1:n_endog)
   stsls$coefficients <- coefs[flip]
   stsls$var <- stsls$var[flip, flip]
   stsls
}

to flip the ordering in the splm::spgm() output object, impacts can be
calculated using the spdep::impacts.stsls() method. I've added the splm
maintainer to CC - I'm unsure whether other functions in splm (or other
argument settings when lag = TRUE) would be affected by patching
splm::spgm() itself.
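For instance (a sketch, not run here, assuming the GMi object and trace vector tr computed above):

```r
# Reorder the coefficients so the spatial coefficient comes first, as
# spdep::impacts.stsls() expects, then compute the impacts:
GMi_flipped <- put_endog_last(GMi)
impacts(GMi_flipped, tr = tr)
```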

Hope this helps,

Roger


>
> Roger
>
> Roger Bivand
> Norwegian School of Economics
> Bergen, Norway
>
>
>
>
> On Wed, Oct 3, 2018 at 11:21 AM +0200, "Hulényi Martin" <[hidden email]<mailto:[hidden email]>> wrote:
>
>
> Thank you very much !
> I have one more question regarding the output. I also have one endogenous variable in the model. Your code worked, but it did not show me the indirect and direct effects for the
> endogenous variable. Here is my regression:
> spd_01 <- spgm(gdppcgr~lefpayr+lpopgr+linvr+lagwgipca + laglgdppc,
>             data=esifpdata, listw=dm1.lw,
>             model="within", lag=TRUE, spatial.error= FALSE, endog = ~ lefpayr,
>             instruments=~areaprop,
>             method="w2sls")
> matrix1 <- kronecker(diag(length(unique(esifpdata$years))), dm1)
> listw1 <- mat2listw(matrix1, style="W")
> tr <- trW(as(listw1, "CsparseMatrix"), m=100)
> impacts(spd_01, listw=listw1)
> impacts(spd_01, tr=tr)
> summary(impacts(spd_01, tr=tr, R=1000), zstats=TRUE, short=TRUE)
>
> Best,
>
> Martin Hulényi
>
>
> ________________________________________
> From: Roger Bivand
> Sent: 29 September 2018 14:52
> To: Hulényi Martin
> Cc: [hidden email]
> Subject: Re: [R-sig-Geo] spgm
>
> On Sat, 29 Sep 2018, Hulényi Martin wrote:
>
>> Dear all,
>>
>>
>> I would like to ask if there is a possibility to apply something
>> similar to the "impacts" from spdep package for SAR regressions using
>> the spgm function from the splm package.
>>
>
> A reprex would have helped. Here is mine:
>
> data(Produc, package = "plm")
> data(usaww) # dense row-standardised weights matrix
> GM <- spgm(log(gsp)~log(pcap)+log(pc)+log(emp)+unemp, data=Produc,
>   listw = usaww, moments="fullweights", lag=TRUE, spatial.error = FALSE)
> class(GM)
> ?impacts.stsls # spdep method for stsls objects
> head(Produc)
> length(unique(Produc$year)) # T
> big <- kronecker(diag(length(unique(Produc$year))), usaww)
> listw <- mat2listw(big, style="W")
> tr <- trW(as(listw, "CsparseMatrix"), m=100)
> impacts(GM, listw=listw)
> impacts(GM, tr=tr)
> summary(impacts(GM, tr=tr, R=1000), zstats=TRUE, short=TRUE)
>
> The splm:::impacts.splm() method cannot dispatch on stsls objects, so they
> fall back on the spdep:::impacts.stsls() method, but there the data have
> n x T rows while listw has only n. Looking inside splm:::impacts.splm(),
> you see that a sparse kronecker product matrix is constructed - either do
> the same if your n x T is large, or follow the above using a dense
> kronecker product and cast it back to listw representation to create the
> trace vector.
>
> Hope this clarifies,
>
> Roger
>
>>
>> Best regards,
>>
>>
>> Martin Hulényi
>>
>>
>>
>>
>
> --
> Roger Bivand
> Department of Economics, Norwegian School of Economics,
> Helleveien 30, N-5045 Bergen, Norway.
> voice: +47 55 95 93 55; e-mail: [hidden email]
> http://orcid.org/0000-0003-2392-6140
> https://scholar.google.no/citations?user=AWeghB0AAAAJ&hl=en
>
>
--
Roger Bivand
Department of Economics, Norwegian School of Economics,
Helleveien 30, N-5045 Bergen, Norway.
voice: +47 55 95 93 55; e-mail: [hidden email]
http://orcid.org/0000-0003-2392-6140
https://scholar.google.no/citations?user=AWeghB0AAAAJ&hl=en

Re: Error when performing a spatial eigenvector selection in R

Wed, 10/10/2018 - 14:49
On Wed, 10 Oct 2018, DIEGO CEPEDA GOMEZ wrote:

> Dear all,
>
> I am currently trying to do a Gaussian linear regression in R with data
> that may be spatially autocorrelated. My dataset contains geographic
> coordinates (value of longitude, value of latitude), species, independent
> variables (BS and LTS) and some explanatory variables; it looks like:
>
> head(dataset)
>
> coordinates  SPECIES                BS    LTS   DEPTH  OCEAN
> (155, 47)    Cristaphyes abyssorum  8.66  28.3  5373   WPac
> (150, 41)    Cristaphyes abyssorum  8.66  28.3  5250   WPac
> (-72, -41)   Cristaphyes anomalus   8.69  NA    35     EPac
> (-74, -44)   Cristaphyes anomalus   8.69  NA    35     EPac
> (-57, -46)   Cristaphyes anomalus   8.69  NA    NA     WAtl
> (29, 80)     Cristaphyes arctous    8.32  27.0  393    EAtl
>
> tail(dataset)
>
> coordinates  SPECIES                   BS    LTS   DEPTH   OCEAN
> (-80, 27)    Zelinkaderes brightae     NA    20.1  13.04   WAtl
> (-80, 27)    Zelinkaderes floridensis  7.10  12.4  140.00  WAtl
> (35, 25)     Zelinkaderes klepali      NA    25.0  1.00    WInd
> (9, 57)      Zelinkaderes submersus    7.99  21.4  30.00   EAtl
> (130, 36)    Zelinkaderes yong         NA    12.7  4.50    WAtl
>
> The dataset also includes the values of latitude and longitude in separate
> columns.
>
> I extracted positive eigenvector-based spatial filters from a truncated
> matrix of geographic distances among sampling sites. I would like to treat
> spatial filters as candidate explanatory variables in my linear regression
> model. I did this as follows:
>
> First of all, I created a neighbor list object (nb). In my case of
> irregular samplings, I used the function knearneigh of the R package spdep:
>
> knea8 <- knearneigh(coordinates(dataset), longlat=TRUE, k=8)
>
> neib8 <-knn2nb(knea8)
>
> Then, I created a spatial weighting matrix with the function nb2listw of
> the R package spdep:
>
> nb2listw(neib8)
>
> distgab8 <- nbdists(neib8, coordinates(dataset))
>
> str(distgab8)
>
> fdist8 <- lapply(distgab8, function(x) 1-x/max(dist(coordinates(dataset))))
>
> listwgab8 <- nb2listw(neib8, glist = fdist8, style = "B")
>
> Then, I built spatial predictors to incorporate them in the Gaussian linear
> regression. I did this with the mem function of the R package adespatial,
> as follows:
If you are choosing adespatial::mem(), it will probably work differently
from spdep::SpatialFiltering(), based on

      Tiefelsdorf M, Griffith DA. (2007) Semiparametric Filtering of
      Spatial Autocorrelation: The Eigenvector Approach. Environment and
      Planning A, 39 (5) 1193 - 1221.

In that approach, the eigenvectors are chosen to reduce the residual
spatial autocorrelation.

>
> mem.gab8 <- mem(listwgab8)
>
> Additionally, Moran's I was computed and tested for each eigenvector with
> the moran.randtest function, as following:
>
> moranI8 <-moran.randtest(mem.gab8, listwgab8, 99)
>
> I obtained some eigenvectors with significant positive spatial
> autocorrelation. Now, I would like to include them in the Gaussian linear
> regression. I tried to do this with the function ME of spdep, as following:
>
Use spdep::SpatialFiltering() rather than spdep::ME() for the Gaussian
case. Probably your listwgab8 refers to a different number of observations
than dataset (any missing values in your variables?).
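One way to check this mismatch (a sketch using the object and column names from the question; a listw object stores one neighbours entry per observation):

```r
# Compare the number of spatial units in the weights with the number of
# complete cases the model will actually use after NA handling:
length(listwgab8$neighbours)                    # observations in the weights
nrow(na.omit(dataset[, c("BS", "LATITUDE")]))   # complete cases in the model
# If these two numbers differ, ME()/SpatialFiltering() fails with the
# Cholmod "wrong dimensions" error shown below.
```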

Roger

> GLM1 <- ME(BS~LATITUDE, data=dataset, listw=listwgab8,
> family=gaussian, nsim=99, alpha=0.05)
>
> Unfortunately, I receive this error:
>
> Error in sW %*% var : Cholmod error 'X and/or Y have wrong dimensions' at
> file ../MatrixOps/cholmod_sdmult.c, line 90
>
>
> Would anybody know how I could solve this error? Or, is there another
> way to perform a spatial eigenvector selection in a Gaussian linear
> regression?
>
> Thank you in advance
>
> Best wishes,
>
> Diego
>
>
--
Roger Bivand
Department of Economics, Norwegian School of Economics,
Helleveien 30, N-5045 Bergen, Norway.
voice: +47 55 95 93 55; e-mail: [hidden email]
http://orcid.org/0000-0003-2392-6140
https://scholar.google.no/citations?user=AWeghB0AAAAJ&hl=en


Error when performing a spatial eigenvector selection in R

Wed, 10/10/2018 - 02:31
Dear all,

I am currently trying to do a Gaussian linear regression in R with data
that may be spatially autocorrelated. My dataset contains geographic
coordinates (value of longitude, value of latitude), species, independent
variables (BS and LTS) and some explanatory variables; it looks like:

head(dataset)

coordinates  SPECIES                BS    LTS   DEPTH  OCEAN
(155, 47)    Cristaphyes abyssorum  8.66  28.3  5373   WPac
(150, 41)    Cristaphyes abyssorum  8.66  28.3  5250   WPac
(-72, -41)   Cristaphyes anomalus   8.69  NA    35     EPac
(-74, -44)   Cristaphyes anomalus   8.69  NA    35     EPac
(-57, -46)   Cristaphyes anomalus   8.69  NA    NA     WAtl
(29, 80)     Cristaphyes arctous    8.32  27.0  393    EAtl

tail(dataset)

coordinates  SPECIES                   BS    LTS   DEPTH   OCEAN
(-80, 27)    Zelinkaderes brightae     NA    20.1  13.04   WAtl
(-80, 27)    Zelinkaderes floridensis  7.10  12.4  140.00  WAtl
(35, 25)     Zelinkaderes klepali      NA    25.0  1.00    WInd
(9, 57)      Zelinkaderes submersus    7.99  21.4  30.00   EAtl
(130, 36)    Zelinkaderes yong         NA    12.7  4.50    WAtl

The dataset also includes the values of latitude and longitude in separate
columns.

I extracted positive eigenvector-based spatial filters from a truncated
matrix of geographic distances among sampling sites. I would like to treat
spatial filters as candidate explanatory variables in my linear regression
model. I did this as follows:

First of all, I created a neighbor list object (nb). In my case of
irregular samplings, I used the function knearneigh of the R package spdep:

knea8 <- knearneigh(coordinates(dataset), longlat=TRUE, k=8)

neib8 <-knn2nb(knea8)

Then, I created a spatial weighting matrix with the function nb2listw of
the R package spdep:

nb2listw(neib8)

distgab8 <- nbdists(neib8, coordinates(dataset))

str(distgab8)

fdist8 <- lapply(distgab8, function(x) 1-x/max(dist(coordinates(dataset))))

listwgab8 <- nb2listw(neib8, glist = fdist8, style = "B")

Then, I built spatial predictors to incorporate them in the Gaussian linear
regression. I did this with the mem function of the R package adespatial,
as follows:

mem.gab8 <- mem(listwgab8)

Additionally, Moran's I was computed and tested for each eigenvector with
the moran.randtest function, as following:

moranI8 <-moran.randtest(mem.gab8, listwgab8, 99)

I obtained some eigenvectors with significant positive spatial
autocorrelation. Now, I would like to include them in the Gaussian linear
regression. I tried to do this with the function ME of spdep, as following:

GLM1 <- ME(BS~LATITUDE, data=dataset, listw=listwgab8,
family=gaussian, nsim=99, alpha=0.05)

Unfortunately, I receive this error:

Error in sW %*% var : Cholmod error 'X and/or Y have wrong dimensions' at
file ../MatrixOps/cholmod_sdmult.c, line 90


Would anybody know how I could solve this error? Or, is there another
way to perform a spatial eigenvector selection in a Gaussian linear
regression?

Thank you in advance

Best wishes,

Diego


Clustering Sd Error in Spatial Models.

Tue, 10/09/2018 - 07:46
Hello,

I am currently working on a project estimating a spatial panel model.
Because I also estimate non-spatial models I am computing the clustered
standard errors following Stock and Watson (2008).

I tried to do the same for my spatial models; however, I am running into
some errors (depending on whether or not I bootstrap my clustered
variance-covariance matrix). Below is a reproducible example.


#### Example
> data(Produc, package = "plm")
> data(usaww)
>
> fm <- log(gsp) ~ log(pcap) + log(pc) + log(emp) + unemp
>
> fespaterr <- spml(fm, data = Produc, listw = mat2listw(usaww),
+                   model="within", spatial.error="kkp")
>
> library(lmtest)
> library(sandwich)
>
> vcov_test <- vcovCL(fespaterr, cluster = Produc$state)
Error in UseMethod("estfun") :
  no applicable method for 'estfun' applied to an object of class "splm"

> vcov_boot <- vcovBS(fespaterr, cluster = Produc$state, R=250)
Error in terms.default(x) : no terms component nor attribute
Error in terms.default(x) : no terms component nor attribute
Error in terms.default(x) : no terms component nor attribute

#############
### Non spatial
library(lfe)

fe_cluster <- felm(log(gsp) ~ log(pcap) + log(pc) + log(emp) +
unemp|year+state|0|state,data=Produc)

> summary(fe_cluster)

Call:
   felm(formula = log(gsp) ~ log(pcap) + log(pc) + log(emp) + unemp |
year + state | 0 | state, data = Produc)

Residuals:
      Min        1Q    Median        3Q       Max
-0.160369 -0.018026 -0.000859  0.016745  0.170752

Coefficients:
           Estimate Cluster s.e. t value Pr(>|t|)
log(pcap) -0.030176     0.060042  -0.503   0.6154
log(pc)    0.168828     0.088331   1.911   0.0563 .
log(emp)   0.769306     0.087700   8.772   <2e-16 ***
unemp     -0.004221     0.003294  -1.281   0.2005
---
Signif. codes:  0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1
##########

My questions are:
1) Is it possible to cluster the variance-covariance matrix on spatial
models?
2) If so, what is the correct procedure?


Thank you for your help,
Amir
--
Amir B Ferreira Neto


Re: spgm

Wed, 10/03/2018 - 04:26
Please provide a reproducible example using the Produc dataset, so that it is clearer what the problem is. I do not know that there is any literature on such impacts, please provide references.

Roger

Roger Bivand
Norwegian School of Economics
Bergen, Norway




On Wed, Oct 3, 2018 at 11:21 AM +0200, "Hulényi Martin" <[hidden email]<mailto:[hidden email]>> wrote:


Thank you very much !
I have one more question regarding the output. I also have one endogenous variable in the model. Your code worked, but it did not show me the indirect and direct effects for the
endogenous variable. Here is my regression:
spd_01 <- spgm(gdppcgr~lefpayr+lpopgr+linvr+lagwgipca + laglgdppc,
             data=esifpdata, listw=dm1.lw,
             model="within", lag=TRUE, spatial.error= FALSE, endog = ~ lefpayr,
             instruments=~areaprop,
             method="w2sls")
matrix1 <- kronecker(diag(length(unique(esifpdata$years))), dm1)
listw1 <- mat2listw(matrix1, style="W")
tr <- trW(as(listw1, "CsparseMatrix"), m=100)
impacts(spd_01, listw=listw1)
impacts(spd_01, tr=tr)
summary(impacts(spd_01, tr=tr, R=1000), zstats=TRUE, short=TRUE)

Best,

Martin Hulényi


________________________________________
From: Roger Bivand
Sent: 29 September 2018 14:52
To: Hulényi Martin
Cc: [hidden email]
Subject: Re: [R-sig-Geo] spgm

On Sat, 29 Sep 2018, Hulényi Martin wrote:

> Dear all,
>
>
> I would like to ask if there is a possibility to apply something
> similar to the "impacts" from spdep package for SAR regressions using
> the spgm function from the splm package.
>

A reprex would have helped. Here is mine:

data(Produc, package = "plm")
data(usaww) # dense row-standardised weights matrix
GM <- spgm(log(gsp)~log(pcap)+log(pc)+log(emp)+unemp, data=Produc,
   listw = usaww, moments="fullweights", lag=TRUE, spatial.error = FALSE)
class(GM)
?impacts.stsls # spdep method for stsls objects
head(Produc)
length(unique(Produc$year)) # T
big <- kronecker(diag(length(unique(Produc$year))), usaww)
listw <- mat2listw(big, style="W")
tr <- trW(as(listw, "CsparseMatrix"), m=100)
impacts(GM, listw=listw)
impacts(GM, tr=tr)
summary(impacts(GM, tr=tr, R=1000), zstats=TRUE, short=TRUE)

The splm:::impacts.splm() method cannot dispatch on stsls objects, so they
fall back on the spdep:::impacts.stsls() method, but there the data have
n x T rows while listw has only n. Looking inside splm:::impacts.splm(),
you see that a sparse kronecker product matrix is constructed - either do
the same if your n x T is large, or follow the above using a dense
kronecker product and cast it back to listw representation to create the
trace vector.
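A sketch of the sparse variant (assuming the Produc/usaww objects loaded above; kronecker() on Matrix classes keeps the result sparse, avoiding the dense nT x nT matrix):

```r
library(Matrix)
# Sparse T x T identity kronecker W, as splm:::impacts.splm() builds it:
W <- as(mat2listw(usaww, style = "W"), "CsparseMatrix")
bigW <- kronecker(Diagonal(length(unique(Produc$year))), W)
tr <- trW(bigW, m = 100)  # trace vector for impacts(GM, tr = tr)
```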

Hope this clarifies,

Roger

>
> Best regards,
>
>
> Martin Hulényi
>
>
>
>
--
Roger Bivand
Department of Economics, Norwegian School of Economics,
Helleveien 30, N-5045 Bergen, Norway.
voice: +47 55 95 93 55; e-mail: [hidden email]
http://orcid.org/0000-0003-2392-6140
https://scholar.google.no/citations?user=AWeghB0AAAAJ&hl=en

Re: spgm

Wed, 10/03/2018 - 04:21
Thank you very much !
I have one more question regarding the output. I also have one endogenous variable in the model. Your code worked, but it did not show me the indirect and direct effects for the
endogenous variable. Here is my regression:
spd_01 <- spgm(gdppcgr~lefpayr+lpopgr+linvr+lagwgipca + laglgdppc,
             data=esifpdata, listw=dm1.lw,
             model="within", lag=TRUE, spatial.error= FALSE, endog = ~ lefpayr,
             instruments=~areaprop,
             method="w2sls")
matrix1 <- kronecker(diag(length(unique(esifpdata$years))), dm1)
listw1 <- mat2listw(matrix1, style="W")
tr <- trW(as(listw1, "CsparseMatrix"), m=100)
impacts(spd_01, listw=listw1)
impacts(spd_01, tr=tr)
summary(impacts(spd_01, tr=tr, R=1000), zstats=TRUE, short=TRUE)

Best,

Martin Hulényi


________________________________________
From: Roger Bivand <[hidden email]>
Sent: 29 September 2018 14:52
To: Hulényi Martin
Cc: [hidden email]
Subject: Re: [R-sig-Geo] spgm

On Sat, 29 Sep 2018, Hulényi Martin wrote:

> Dear all,
>
>
> I would like to ask if there is a possibility to apply something
> similar to the "impacts" from spdep package for SAR regressions using
> the spgm function from the splm package.
>

A reprex would have helped. Here is mine:

data(Produc, package = "plm")
data(usaww) # dense row-standardised weights matrix
GM <- spgm(log(gsp)~log(pcap)+log(pc)+log(emp)+unemp, data=Produc,
   listw = usaww, moments="fullweights", lag=TRUE, spatial.error = FALSE)
class(GM)
?impacts.stsls # spdep method for stsls objects
head(Produc)
length(unique(Produc$year)) # T
big <- kronecker(diag(length(unique(Produc$year))), usaww)
listw <- mat2listw(big, style="W")
tr <- trW(as(listw, "CsparseMatrix"), m=100)
impacts(GM, listw=listw)
impacts(GM, tr=tr)
summary(impacts(GM, tr=tr, R=1000), zstats=TRUE, short=TRUE)

The splm:::impacts.splm() method cannot dispatch on stsls objects, so they
fall back on the spdep:::impacts.stsls() method, but there the data have
n x T rows while listw has only n. Looking inside splm:::impacts.splm(),
you see that a sparse kronecker product matrix is constructed - either do
the same if your n x T is large, or follow the above using a dense
kronecker product and cast it back to listw representation to create the
trace vector.

Hope this clarifies,

Roger

>
> Best regards,
>
>
> Martin Hulényi
>
>
>
>
--
Roger Bivand
Department of Economics, Norwegian School of Economics,
Helleveien 30, N-5045 Bergen, Norway.
voice: +47 55 95 93 55; e-mail: [hidden email]
http://orcid.org/0000-0003-2392-6140
https://scholar.google.no/citations?user=AWeghB0AAAAJ&hl=en

Re: Linear referencing in R

Mon, 10/01/2018 - 09:25
Thank you very much, Kent! This is pretty much what I was looking for. The
only problem is that the new shapefile didn't come with the attributes from
the topoESQ data frame. I made some adjustments, which are probably not
ideal, but I got it to work.

topoESQ_comp <- read.csv("Files/Topo_estacas_E.csv") %>%
  select(-1) %>%
  rename(height = Altura,
         side = Lado) %>%
  filter(to_km <= 871)

Estacas_1m <- st_read("Files/ESTACAS_1m_EFC_0-871_Policonic_SIRGAS.shp")

topoESQ_comp %>%
  purrr::pmap(function(from_km, to_km, ...) {
    st_linestring(do.call(rbind,
Estacas_1m$geometry[((1000*from_km):(1000*to_km))+1]))
  }) %>%
  st_sfc(., crs = 5880) %>%
  cbind(topoESQ_comp, .) %>%
  st_as_sf(.) %>%
  st_write(., dsn = "Files/Topo_esq.shp", delete_dsn = T)

Best regards,

Rubem Dornas.


Em sáb, 29 de set de 2018 às 22:17, Kent Johnson <[hidden email]>
escreveu:

> You need to snip out the sections of the railroad file rather than
> connecting the endpoints. Maybe this is closer to what you want?
>
> segments = topoESQ_comp %>%
>   filter(to_km <= 871) %>%
>   purrr::pmap(function(from_km, to_km, ...) {
>     st_linestring(do.call(rbind,
> Estacas_1m$geometry[((1000*from_km):(1000*to_km))+1]))
> }) %>%
>   st_sfc
>
> Kent Johnson
>
>
>> Date: Fri, 28 Sep 2018 19:52:03 -0300
>> From: Rubem Dornas <[hidden email]>
>> To: [hidden email]
>> Subject: [R-sig-Geo] Linear referencing in R
>> Hi, people! I hope I'm not being too speculative in my question, and
>> that you can understand my problem.
>>
>> Well, I have a railroad shapefile and I have two csv files corresponding
>> to
>> the topography in each side of the railroad. The csv are organized in this
>> way:
>>
>> ID_topo, from_km, to_km, height
>> 1, 0, 1.91, 15
>> 2, 1.91, 2.23, -3
>>
>> I created a point shapefile for each meter of the railroad and then I
>> proceeded with a join (not spatial) between the from_km of the topography
>> data frame and the km mark (km_calc) from the point railroad.
>>
>> My goal is to create shapefiles for each of the railroadside topography.
>> The issue is that when I try to make a new line shapefile from topography
>> based on the points of the railroad, what I get is a line that doesn't
>> follow the curvature of the railroad. Using the example of the csv above,
>> what I get are several straight lines linked by the from_km to to_km.
>> (Here
>> is a link to a print screen from QGIS:
>>
>> https://www.dropbox.com/s/erfsst8pasoqj64/Captura%20de%20tela%202018-09-28%2019.49.47.png?dl=0
>> )
>>
>>

Video recordings from the OpenGeoHub Summer School IBOT, Prague August 19-25, 2018

Mon, 10/01/2018 - 03:55

The video recordings from the OpenGeoHub Summer School IBOT, Prague
August 19-25, 2018 are now available via our youtube channel:

https://www.youtube.com/channel/UC6HFFFYiV4zEYJlQMIXemWA

I am trying to organize all current and past materials into a structured
course/archive so that everyone can more easily find her/his topics of
interest:

https://opengeohub.org/course

*this might take me a few weeks because we have collected many materials
over the years.

Many many thanks to the lecturers: Edzer Pebesma (IfGI University of
Muenster, Muenster Germany), Roger Bivand (Department of Economics,
Norwegian School of Economics, Bergen, Norway), Markus Neteler
(Mundialis, Bonn, Germany), Tim Appelhans (GfK Geomarketing, Germany),
Robin Lovelace (University of Leeds, UK), Jannes Münchow (Geographic
Information Science, Friedrich Schiller University Jena, Jena, Germany),
Jakub Nowosad (Space Informatics Lab, University of Cincinnati, Ohio
USA), Veronica Andreo (National Institute of Tropical Medicine INMeT,
Argentina) and Hanna Meyer & Chris Reudenbach (Environmental
Informatics, Philipps University Marburg, Germany) for coming to Prague
and for sharing their materials!

And many many thanks to IBOT's department of GIS and remote sensing
Matej Man and Jan Wild for being such excellent hosts!

yours,

Tom Hengl
The OpenGeoHub Foundation
http://www.opengeohub.org/about


Re: Linear referencing in R

Sat, 09/29/2018 - 20:17
You need to snip out the sections of the railroad file rather than
connecting the endpoints. Maybe this is closer to what you want?

segments = topoESQ_comp %>%
  filter(to_km <= 871) %>%
  purrr::pmap(function(from_km, to_km, ...) {
    st_linestring(do.call(rbind,
Estacas_1m$geometry[((1000*from_km):(1000*to_km))+1]))
}) %>%
  st_sfc
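The indexing works because the railroad was densified to one point per metre, so kilometre k maps to point index 1000*k + 1 (R vectors are 1-based). A quick check with the example rows from the csv in the question:

```r
# Segment from_km = 1.91, to_km = 2.23 selects metre points 1911..2231
# of the railroad point layer (`:` binds tighter than `+` in R):
idx <- (1000 * 1.91):(1000 * 2.23) + 1
c(min(idx), max(idx))  # 1911 2231
```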

Kent Johnson


> Date: Fri, 28 Sep 2018 19:52:03 -0300
> From: Rubem Dornas <[hidden email]>
> To: [hidden email]
> Subject: [R-sig-Geo] Linear referencing in R
> Hi, people! I hope I'm not being too speculative in my question, and
> that you can understand my problem.
>
> Well, I have a railroad shapefile and I have two csv files corresponding to
> the topography in each side of the railroad. The csv are organized in this
> way:
>
> ID_topo, from_km, to_km, height
> 1, 0, 1.91, 15
> 2, 1.91, 2.23, -3
>
> I created a point shapefile for each meter of the railroad and then I
> proceeded with a join (not spatial) between the from_km of the topography
> data frame and the km mark (km_calc) from the point railroad.
>
> My goal is to create shapefiles for each of the railroadside topography.
> The issue is that when I try to make a new line shapefile from topography
> based on the points of the railroad, what I get is a line that doesn't
> follow the curvature of the railroad. Using the example of the csv above,
> what I get are several straight lines linked by the from_km to to_km. (Here
> is a link to a print screen from QGIS:
>
> https://www.dropbox.com/s/erfsst8pasoqj64/Captura%20de%20tela%202018-09-28%2019.49.47.png?dl=0
> )
>
>

Pages