Subscribe to R-sig-geo feed
This is an archive for R-sig-geo, not a forum. It is not possible to post through Nabble - you may not start a new thread nor follow up an existing thread. If you wish to post, but are not subscribed to the list through the list homepage, subscribe first (through the list homepage, not via Nabble) and post from your subscribed email address. Until 2015-06-20, subscribers could post through Nabble, but policy has been changed as too many non-subscribers misunderstood the interface.

Re: Question spdep package - lagsarlm not terminating

Tue, 05/07/2019 - 00:35
Dear Roger,

Thank you again for your fast response and the helpful answers and clarifications. I will try to resolve all the issues with your recommendations in the coming days.

Just a few comments here:

I created the listw object with nb2listw() (without any error messages) and set „zero.policy=TRUE“ when calling the function. However, I used the function subset() instead of subset.nb(). I will play around with it a bit and see what works best for me. But it is good to know that observations with missing values are dropped by default and the weights matrix is adjusted to match.
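
For reference, a minimal sketch of the difference (the grid size and the dropped observations below are made up):

library(spdep)
nb <- cell2nb(4, 5, type = "rook")          # rook neighbours on a small 4 x 5 grid
keep <- rep(TRUE, length(nb)); keep[c(3, 7)] <- FALSE
nb_sub <- subset(nb, keep)                  # dispatches to subset.nb(); neighbour ids are renumbered consistently
class(nb[keep])                             # plain "[" returns a bare list with stale neighbour ids
lw <- nb2listw(nb_sub, zero.policy = TRUE)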

With regard to your question about GWR: yes, the error occurs while printing a fitted object. I also thought that it might be linked to collinearity in the covariates, just as is the case for the error with lagsarlm(). However, if you do not recommend GWR for anything else, I will have to discuss its use with my thesis supervisor again.
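
If it helps, a quick (and only indicative) way to gauge scaling and collinearity among the covariates, using base R (the formula and data names are placeholders):

X <- model.matrix(y ~ x1 + x2 + x3, data = mydata)   # same right-hand side as the model
kappa(X)          # a very large condition number points to scaling/collinearity trouble
cor(X[, -1])      # pairwise correlations between the covariates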


Have a nice day!

Best regards,

Raphael


> Am 06.05.2019 um 17:20 schrieb Roger Bivand <[hidden email]>:
>
> On Mon, 6 May 2019, Raphael Mesaric via R-sig-Geo wrote:
>
>>
>>> Dear Roger,
>>>
>>> Thank you very much for your fast response.
>>>
>>> My weight matrix stems from a rectangular (but not square) grid. So, I think I will have to use the „LU“ method.
>
> No, this is a misunderstanding. All weights matrices used for fitting models are square nxn matrices by definition. If your grid had r rows and c columns, n = r * c, and the weights matrix has n rows and n columns.
>
>>>
>>> However, by doing that several other error messages popped up. The first one was the following:
>>>
>>> Error in spatialreg::errorsarlm(formula = formula, data = data, listw = listw, :
>>>  NAs in lagged dependent variable
>>> In addition: Warning messages:
>>> 1: Function errorsarlm moved to the spatialreg package
>>> 2: In lag.listw(listw, y, zero.policy = zero.policy) :
>>>  NAs in lagged values
>>>
>>> And this happened even though I actually do not have NaN values in my dataset (I removed them in advance). When I completely reload all variables, it works sometimes for a few tries, but after a while the error message pops up again. Do you know why this might happen? And could I theoretically use a dataset with NaN values and have the function omit them?
>>>
>
> By default, observations with missing values are dropped and the weights object is subsetted to match. This may produce no-neighbour observations. You can choose to set their spatially lagged values to zero by using zero.policy=TRUE, which is the probable cause of your error. It could very well be that your weights object itself contains no-neighbour observations if not created using spdep::nb2listw() - which will alert you to the problem.
>
>>>
>>> The second error message was the following:
>>>
>>> Error in solve.default(-(mat), tol.solve = tol.solve) :
>>>  system is computationally singular: reciprocal condition number =
>>>  2.71727e-26
>>>
>>> I found a reference online (related to glm) that this might be linked to explanatory variables which have a high correlation. Would eliminating some of the variables do the job here as well?
>>>
>
> Please do not use references online (especially if you do not link to them). The information you need is in the help page, referring to the tol.solve= argument. Indeed, your covariates are scaled such that they are either collinear (unlikely) or the numerical values of the coefficient variances are very different from the spatial coefficient. You may need to re-scale the response or covariates in order to invert the matrix needed to yield the variances of the coefficients.
>
>>>
>>> Then I have two other questions, where I couldn’t find any answers online:
>>>
>>> Is there an option to eliminate specific rows of a nb object? I tried the following command:
>>>
>>> rook234x259_2 <- rook234x259[index],
>
> spdep::subset.nb() only. Your suggestion only creates a big mess, as subsetting reduces the number of rows, and neighbours are indexed between 1 and n. Consequently, at least some of the remaining vectors will point to neighbours with ids > n, and many of the others will point to the wrong neighbours. You can convert to a sparse matrix, subset using "[", and then convert back, but the subset method should work.
>
>>>
>>> where index is a vector with the row numbers I would like to keep. But this converts the nb to a list object, and there is no list2nb function (as much as I am aware of). And the option „subset“ does not only remove the rows, but also the corresponding cells which leads to a different neighborhood matrix.
>
> I do not know what you mean, of course it has to remove links in both directions.
>
>>>
>>>
>>> The last question refers to the package spgwr. I would like to run this model as well. The gwr function runs successfully, but when calling the variable where I stored the result, I get the following error message instead of my coefficients:
>>>
>>> Error in abs(coef.se <http://coef.se/> <- xm[, cs.ind, drop = FALSE]) :
>>>  non-numeric argument to mathematical function
>>> In addition: Warning messages:
>>> 1: In print.gwr(x) : NAs in coefficients dropped
>>> 2: In cbind(CM, coefficients(x$lm)) :
>>>  number of rows of result is not a multiple of vector length (arg 2)
>>>
>
> I always advise against using GWR for anything other than studies exploring its weaknesses - do not use it in research or production. Just showing the error message without a small reproducible example using a built-in data set gives nothing to go on. Does the error occur when using print() on a fitted object? Maybe NAs in the coefficients suggest collinearity in your covariates, possibly after weighting.
>
>>> Again, there aren’t any NaN values in my dataset, so I can’t really imagine where the NAs in the coefficients come from.
>
> But it is terribly easy to introduce them numerically, so they are coming from what you are doing.
>
> Roger
>
>>>
>>>
>>> Thank you very much for your help!
>>>
>>> Best regards,
>>>
>>> Raphael
>>>
>>>
>>>> Am 05.05.2019 um 15:21 schrieb Roger Bivand <[hidden email]>:
>>>>
>>>> On Sat, 4 May 2019, Raphael Mesaric via R-sig-Geo wrote:
>>>>
>>>>> Dear all,
>>>>>
>>>>> I have a question with regards to the function „lagsarlm" from the package spdep. My problem is that the function is not terminating. Of course, I have quite a big grid (depending on the selection either 34000 or 60600 entries) and I also have a lot of explanatory variables (about 40). But I am still wondering whether there is something wrong.
>>>>
>>>> If you read the help page (the function is in the spatialreg package and will be dropped from spdep shortly), and look at the references (Bivand et al. 2013), you will see that the default value of the method= argument is "eigen". For small numbers of observations, solving the eigenproblem of a dense weights object is not a problem, but it becomes demanding on memory as n increases. You are using virtual memory (and may run out of that too), which makes your machine unresponsive. Choose an alternative method instead, typically "Matrix" for symmetric or similar-to-symmetric sparse weights, or "LU" for asymmetric sparse weights.
>>>>
>>>>> data(house, package="spData")
>>>>> dim(house)
>>>> [1] 25357    24
>>>>> LO_nb
>>>> Neighbour list object:
>>>> Number of regions: 25357
>>>> Number of nonzero links: 74874
>>>> Percentage nonzero weights: 0.01164489
>>>> Average number of links: 2.952794
>>>>> lw <- spdep::nb2listw(LO_nb)
>>>>> system.time(res <- spatialreg::lagsarlm(log(price) ~ TLA + frontage +
>>>> + rooms + yrbuilt, data=house, listw=lw, method="Matrix"))
>>>>  user  system elapsed
>>>> 0.606   0.011   0.631
>>>>
>>>> so less than 1 second on a standard laptop for similar to symmetric very sparse weights and ~ 25000 observations. Less sparse weights take somewhat longer. The function does not (maybe yet) prevent users trying to do things that are not advisable, because maybe they have 128GB RAM or more, and want to use eigenvalues rather than sparse matrix methods.
>>>>
>>>> Hope this clarifies,
>>>>
>>>> Roger
>>>>
>>>>>
>>>>> I tried to run a model based on the dataset „columbus“, and there I did not have any problems (but there are way fewer entries and variables). I also compared the format of the required inputs, but everything seemed to be equivalent to the inputs used for the „columbus“ model.
>>>>>
>>>>> Do you have any idea what might be the reason for the extremely long (or rather infinite; it has not terminated yet) computation time? Any suggestions are greatly appreciated.
>>>>>
>>>>> If you would like to, I can also upload the corresponding code. However, the code includes some MAT-Files as I got the data in MATLAB. I do not yet attach them here because I read that attachments in another format than PDF are not desired as they may contain malicious software.
>>>>>
>>>>> Thank you for your help in advance!
>>>>>
>>>>> Best regards,
>>>>>
>>>>> Raphael Mesaric
>>>>> _______________________________________________
>>>>> R-sig-Geo mailing list
>>>>> [hidden email]
>>>>> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
>>>>>
>>>>
>>>> --
>>>> Roger Bivand
>>>> Department of Economics, Norwegian School of Economics,
>>>> Helleveien 30, N-5045 Bergen, Norway.
>>>> voice: +47 55 95 93 55; e-mail: [hidden email]
>>>> https://orcid.org/0000-0003-2392-6140
>>>> https://scholar.google.no/citations?user=AWeghB0AAAAJ&hl=en
>>>
>>
>> [[alternative HTML version deleted]]
>>
>> _______________________________________________
>> R-sig-Geo mailing list
>> [hidden email] <mailto:[hidden email]>
>> https://stat.ethz.ch/mailman/listinfo/r-sig-geo <https://stat.ethz.ch/mailman/listinfo/r-sig-geo>
>>
>
> --
> Roger Bivand
> Department of Economics, Norwegian School of Economics,
> Helleveien 30, N-5045 Bergen, Norway.
> voice: +47 55 95 93 55 <tel:+47%2055%2095%2093%2055>; e-mail: [hidden email] <mailto:[hidden email]>
> https://orcid.org/0000-0003-2392-6140 <https://orcid.org/0000-0003-2392-6140>
> https://scholar.google.no/citations?user=AWeghB0AAAAJ&hl=en <https://scholar.google.no/citations?user=AWeghB0AAAAJ&hl=en>
        [[alternative HTML version deleted]]

_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

Re: error to extract soil grids data

Mon, 05/06/2019 - 14:49

Hi Eliza,

The gsif.isric.org tutorials have been migrated to:
https://envirometrix.github.io/PredictiveSoilMapping/

You can now download soil texture fractions and bulk density directly
from, e.g.:

https://landgis.opengeohub.org/#/?base=Stamen%20(OpenStreetMap)&center=-12.8118,-54.3164&zoom=5&opacity=80&layer=sol_bulkdens.fineearth_usda.4a1h_m&depth=30

If you need only Brazil, I recommend using the LandGIS Web Coverage
Service (https://github.com/Envirometrix/LandGISmaps#accessing-data) e.g.:

download.file("https://geoserver.opengeohub.org/landgisgeoserver/ows?service=WCS&version=2.0.1&request=GetCoverage&coverageId=predicted250m:sol_bulkdens.fineearth_usda.4a1h_m_250m_b30..30cm_1950..2017_v0.2&subset=Lat(-34,6)&subset=Long(-74,-28)&scalefactor=0.25",
"brazil_sol_bulkdens.fineearth_usda.4a1h_m_250m_b30..30cm_1950..2017_v0.2.tif")
#  % Total    % Received % Xferd  Average Speed   Time    Time     Time
#Current
#                                 Dload  Upload   Total   Spent    Left
#Speed
#100 25.8M    0 25.8M    0     0  1098k      0 --:--:--  0:00:24
#--:--:-- 2095k
library(raster)
#Loading required package: sp
plot(raster("brazil_sol_bulkdens.fineearth_usda.4a1h_m_1km_b30..30cm_1950..2017_v0.2.tif"))

Clay content
(https://maps.opengeohub.org/layers/predicted250m:sol_clay.wfraction_usda.3a1a1a_m_250m_b30..30cm_1950..2017_v0.2)
and bulk density
(https://maps.opengeohub.org/layers/predicted250m:sol_bulkdens.fineearth_usda.4a1h_m_250m_b30..30cm_1950..2017_v0.2)
are also available via our Geonode.

Having said all this, we have recently derived soil hydraulic parameters
(water content at various suction pressures) for the whole world directly
from lab data, hence without using any PTF (read more at:
https://twitter.com/opengeohub/status/1115317556343181312), including
the available water capacity (https://doi.org/10.5281/zenodo.2629148).
Which soil hydraulic parameters are you aiming at? I will be curious to
learn from you whether any of these layers / instructions help you (please
send a private message).

HTH,

Tom Hengl
https://opengeohub.org/about


On 5/6/19 6:15 PM, Mª Eliza Turek wrote:
> Dear all,
>
>   I need to extract texture and bulk density data for Brazil, in order to
> convert it to soil hydraulic parameters using PTF functions. I read on the
> SoilGrids website that for this kind of evaluation it is better to use "tile"
> data. How can I do that? I couldn't find a manual for it. (I can't
> access http://gsif.isric.org/doku.php?id=wiki:tutorial_soilgrid).
> Also, I am starting to use SoilGrids and found an error that seems to be
> very simple, but it stopped my work. It seems to be something related to the
> TIFF image. I am trying to use the following code, but after the last
> command I get the error: Error in .local(.Object, ...) :
>   MissingRequired:TIFF directory is missing required "StripByteCounts" field
>
> Can anyone help me? Thank you!
>
>> # SoilGrids tutorial
>> # Reference: [http://gsif.isric.org/doku.php?id=wiki:tutorial_soilgrids]
>> # [hidden email]
>> rm(list=ls())
>> library(RCurl)
> Loading required package: bitops
>> library(rgdal)
> Loading required package: sp
> rgdal: version: 1.4-3, (SVN revision 828)
>   Geospatial Data Abstraction Library extensions to R successfully loaded
>   Loaded GDAL runtime: GDAL 2.2.3, released 2017/11/20
>   Path to GDAL shared files:
> C:/Users/User/Documents/R/win-library/3.5/rgdal/gdal
>   GDAL binary built with GEOS: TRUE
>   Loaded PROJ.4 runtime: Rel. 4.9.3, 15 August 2016, [PJ_VERSION: 493]
>   Path to PROJ.4 shared files:
> C:/Users/User/Documents/R/win-library/3.5/rgdal/proj
>   Linking to sp version: 1.3-1
>> library(GSIF)
> GSIF version 0.5-5 (2019-01-04)
> URL: http://gsif.r-forge.r-project.org/
>> library(raster)
>> library(plotKML)
> plotKML version 0.5-9 (2019-01-04)
> URL: http://plotkml.r-forge.r-project.org/
>> library(XML)
>> library(lattice)
>> library(aqp)
> This is aqp 1.17
>
> Attaching package: ‘aqp’
>
> The following objects are masked from ‘package:raster’:
>
>      metadata, metadata<-, union
>
> The following object is masked from ‘package:base’:
>
>      union
>
>> library(soiltexture)
> soiltexture 1.5.1 (git revision: 4b25ba2). For help type:
> help(pack='soiltexture')
>> ## GDAL paths:
>> if(.Platform$OS.type == "windows"){
> +   gdal.dir <- shortPathName("C:/Program files/GDAL")
> +   gdal_translate <- paste0(gdal.dir, "/gdal_translate.exe")
> +   gdalwarp <- paste0(gdal.dir, "/gdalwarp.exe")
> +   gdalinfo <- paste0(gdal.dir, "/gdalinfo.exe")
> + } else {
> +   gdal_translate = "gdal_translate"
> +   gdalwarp = "gdalwarp"
> +   gdalinfo = "gdalinfo"
> + }
>> ## (a) FTP download:
>> ## location of soilgrids:
>> sg.ftp <- "ftp://ftp.soilgrids.org/data/recent/"
>> filenames = getURL(sg.ftp, ftp.use.epsv = FALSE, dirlistonly = TRUE)
>> filenames = strsplit(filenames, "\r*\n")[[1]]
>> filenames[1:5]
> [1] "ACDWRB_M_ss_250m.tif" "ACDWRB_M_ss_250m.xml" "AWCh1_M_sl1_250m.tif"
> "AWCh1_M_sl1_250m.xml" "AWCh1_M_sl2_250m.tif"
>> ## download to a local directory:
>> ORC.name <- filenames[grep(filenames,
> pattern=glob2rx("AWCh2_M_sl2_250m.tif$"))]
>> ORC.name
> [1] "AWCh2_M_sl2_250m.tif"
>> try(download.file(paste(sg.ftp, ORC.name, sep=""), ORC.name))
> trying URL 'ftp://ftp.soilgrids.org/data/recent/AWCh2_M_sl2_250m.tif'
> downloaded 919.3 MB
>
>> ## check that everything is OK:
>> GDALinfo(ORC.name)
> Error in .local(.Object, ...) :
>    MissingRequired:TIFF directory is missing required "StripByteCounts" field
> ---------------------------------------------------------------------
>
>
>
_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

error to extract soil grids data

Mon, 05/06/2019 - 11:15
Dear all,

 I need to extract texture and bulk density data for Brazil, in order to
convert it to soil hydraulic parameters using PTF functions. I read on the
SoilGrids website that for this kind of evaluation it is better to use "tile"
data. How can I do that? I couldn't find a manual for it. (I can't
access http://gsif.isric.org/doku.php?id=wiki:tutorial_soilgrid).
Also, I am starting to use SoilGrids and found an error that seems to be
very simple, but it stopped my work. It seems to be something related to the
TIFF image. I am trying to use the following code, but after the last
command I get the error: Error in .local(.Object, ...) :
 MissingRequired:TIFF directory is missing required "StripByteCounts" field

Can anyone help me? Thank you!

> # SoilGrids tutorial
> # Reference: [http://gsif.isric.org/doku.php?id=wiki:tutorial_soilgrids]
> # [hidden email]
> rm(list=ls())
> library(RCurl)
Loading required package: bitops
> library(rgdal)
Loading required package: sp
rgdal: version: 1.4-3, (SVN revision 828)
 Geospatial Data Abstraction Library extensions to R successfully loaded
 Loaded GDAL runtime: GDAL 2.2.3, released 2017/11/20
 Path to GDAL shared files:
C:/Users/User/Documents/R/win-library/3.5/rgdal/gdal
 GDAL binary built with GEOS: TRUE
 Loaded PROJ.4 runtime: Rel. 4.9.3, 15 August 2016, [PJ_VERSION: 493]
 Path to PROJ.4 shared files:
C:/Users/User/Documents/R/win-library/3.5/rgdal/proj
 Linking to sp version: 1.3-1
> library(GSIF)
GSIF version 0.5-5 (2019-01-04)
URL: http://gsif.r-forge.r-project.org/
> library(raster)
> library(plotKML)
plotKML version 0.5-9 (2019-01-04)
URL: http://plotkml.r-forge.r-project.org/
> library(XML)
> library(lattice)
> library(aqp)
This is aqp 1.17

Attaching package: ‘aqp’

The following objects are masked from ‘package:raster’:

    metadata, metadata<-, union

The following object is masked from ‘package:base’:

    union

> library(soiltexture)
soiltexture 1.5.1 (git revision: 4b25ba2). For help type:
help(pack='soiltexture')
> ## GDAL paths:
> if(.Platform$OS.type == "windows"){
+   gdal.dir <- shortPathName("C:/Program files/GDAL")
+   gdal_translate <- paste0(gdal.dir, "/gdal_translate.exe")
+   gdalwarp <- paste0(gdal.dir, "/gdalwarp.exe")
+   gdalinfo <- paste0(gdal.dir, "/gdalinfo.exe")
+ } else {
+   gdal_translate = "gdal_translate"
+   gdalwarp = "gdalwarp"
+   gdalinfo = "gdalinfo"
+ }
> ## (a) FTP download:
> ## location of soilgrids:
> sg.ftp <- "ftp://ftp.soilgrids.org/data/recent/"
> filenames = getURL(sg.ftp, ftp.use.epsv = FALSE, dirlistonly = TRUE)
> filenames = strsplit(filenames, "\r*\n")[[1]]
> filenames[1:5]
[1] "ACDWRB_M_ss_250m.tif" "ACDWRB_M_ss_250m.xml" "AWCh1_M_sl1_250m.tif"
"AWCh1_M_sl1_250m.xml" "AWCh1_M_sl2_250m.tif"
> ## download to a local directory:
> ORC.name <- filenames[grep(filenames,
pattern=glob2rx("AWCh2_M_sl2_250m.tif$"))]
> ORC.name
[1] "AWCh2_M_sl2_250m.tif"
> try(download.file(paste(sg.ftp, ORC.name, sep=""), ORC.name))
trying URL 'ftp://ftp.soilgrids.org/data/recent/AWCh2_M_sl2_250m.tif'
downloaded 919.3 MB

> ## check that everything is OK:
> GDALinfo(ORC.name)
Error in .local(.Object, ...) :
  MissingRequired:TIFF directory is missing required "StripByteCounts" field
---------------------------------------------------------------------



--
Maria Eliza Turek
PhD Student - Graduate Program of Environmental Engineering  - PPGEA
Federal University of Paraná - UFPR

        [[alternative HTML version deleted]]

_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

Re: Fwd: Question spdep package - lagsarlm not terminating

Mon, 05/06/2019 - 10:20
On Mon, 6 May 2019, Raphael Mesaric via R-sig-Geo wrote:

>
>> Dear Roger,
>>
>> Thank you very much for your fast response.
>>
>> My weight matrix stems from a rectangular (but not square) grid. So, I
>> think I will have to use the „LU“ method.

No, this is a misunderstanding. All weights matrices used for fitting
models are square nxn matrices by definition. If your grid had r rows and
c columns, n = r * c, and the weights matrix has n rows and n columns.
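
For instance (a sketch; the grid dimensions are arbitrary):

library(spdep)
nr <- 4; nc <- 6                     # any rectangular grid
nb <- cell2nb(nr, nc, type = "rook")
length(nb)                           # n = nr * nc = 24 regions
dim(nb2mat(nb, style = "B"))         # 24 x 24: the weights matrix itself is square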

>>
>> However, by doing that several other error messages popped up. The
>> first one was the following:
>>
>> Error in spatialreg::errorsarlm(formula = formula, data = data, listw =
>> listw, :
>>   NAs in lagged dependent variable
>> In addition: Warning messages:
>> 1: Function errorsarlm moved to the spatialreg package
>> 2: In lag.listw(listw, y, zero.policy = zero.policy) :
>>   NAs in lagged values
>>
>> And this happened even though I actually do not have NaN values in my
>> dataset (I removed them in advance). When I completely reload all
>> variables, it works sometimes for a few tries, but after a while the
>> error message pops up again. Do you know why this might happen? And
>> could I theoretically use a dataset with NaN values and have the
>> function omit them?

By default, observations with missing values are dropped and the weights
object is subsetted to match. This may produce no-neighbour observations.
You can choose to set their spatially lagged values to zero by using
zero.policy=TRUE, which is the probable cause of your error. It could very
well be that your weights object itself contains no-neighbour observations
if not created using spdep::nb2listw() - which will alert you to the
problem.
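
A quick check for such observations (a sketch; "nb" stands for your neighbours object):

library(spdep)
table(card(nb))              # card() counts the neighbours of each region
which(card(nb) == 0)         # ids of any no-neighbour observations
lw <- nb2listw(nb, style = "W", zero.policy = TRUE)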

>>
>> The second error message was the following:
>>
>> Error in solve.default(-(mat), tol.solve = tol.solve) :
>>   system is computationally singular: reciprocal condition number =
>>   2.71727e-26
>>
>> I found a reference online (related to glm) that this might be linked
>> to explanatory variables which have a high correlation. Would
>> eliminating some of the variables do the job here as well?

Please do not use references online (especially if you do not link to
them). The information you need is in the help page, referring to the
tol.solve= argument. Indeed, your covariates are scaled such that they
are either collinear (unlikely) or the numerical values of the
coefficient variances are very different from the spatial coefficient. You
may need to re-scale the response or covariates in order to invert the
matrix needed to yield the variances of the coefficients.
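
As an illustration only (object and variable names here are placeholders), re-scaling before the fit could look like:

dat_sc <- dat
dat_sc[c("x1", "x2")] <- scale(dat[c("x1", "x2")])        # centre and scale the covariates
fit <- spatialreg::errorsarlm(y ~ x1 + x2, data = dat_sc,
                              listw = lw, method = "LU",
                              tol.solve = 1e-30)           # lowering tol.solve only as a last resort; see the help page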

>>
>> Then I have two other questions, where I couldn’t find any answers
>> online:
>>
>> Is there an option to eliminate specific rows of a nb object? I tried
>> the following command:
>>
>> rook234x259_2 <- rook234x259[index],

spdep::subset.nb() only. Your suggestion only creates a big mess, as
subsetting reduces the number of rows, and neighbours are indexed between
1 and n. Consequently, at least some of the remaining vectors will point
to neighbours with ids > n, and many of the others will point to the wrong
neighbours. You can convert to a sparse matrix, subset using "[", and then
convert back, but the subset method should work.
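
In code, with "nb" your neighbours object and "keep" a logical index of the observations to retain (a sketch only):

library(spdep)
nb2 <- subset(nb, keep)                              # subset.nb(): drops links in both directions, renumbers ids
## or via a matrix round trip:
B   <- nb2mat(nb, style = "B", zero.policy = TRUE)   # binary neighbour matrix
nb2 <- mat2listw(B[keep, keep])$neighbours           # back to an nb object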

>>
>> where index is a vector with the row numbers I would like to keep. But
>> this converts the nb to a list object, and there is no list2nb function
>> (as much as I am aware of). And the option „subset“ does not only
>> remove the rows, but also the corresponding cells which leads to a
>> different neighborhood matrix.

I do not know what you mean, of course it has to remove links in both
directions.

>>
>>
>> The last question refers to the package spgwr. I would like to run this
>> model as well. The gwr function runs successfully, but when calling the
>> variable where I stored the result, I get the following error message
>> instead of my coefficients:
>>
>> Error in abs(coef.se <- xm[, cs.ind, drop = FALSE]) :
>>   non-numeric argument to mathematical function
>> In addition: Warning messages:
>> 1: In print.gwr(x) : NAs in coefficients dropped
>> 2: In cbind(CM, coefficients(x$lm)) :
>>   number of rows of result is not a multiple of vector length (arg 2)

I always advise against using GWR for anything other than studies
exploring its weaknesses - do not use it in research or production. Just
showing the error message without a small reproducible example using a
built-in data set gives nothing to go on. Does the error occur when using
print() on a fitted object? Maybe NAs in the coefficients
suggest collinearity in your covariates, possibly after weighting.
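
For what it is worth, a reproducible example of the kind meant here can be as small as this (assuming the columbus data frame shipped with spData, with its usual CRIME, INC, HOVAL, X and Y columns):

library(spgwr)
data(columbus, package = "spData")
bw  <- gwr.sel(CRIME ~ INC + HOVAL, data = columbus,
               coords = cbind(columbus$X, columbus$Y))
fit <- gwr(CRIME ~ INC + HOVAL, data = columbus,
           coords = cbind(columbus$X, columbus$Y), bandwidth = bw)
fit    # printing the fitted object is where the error above would appear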

>> Again, there aren’t any NaN values in my dataset, so I can’t really
>> imagine where the NAs in the coefficients come from.

But it is terribly easy to introduce them numerically, so they are coming
from what you are doing.

Roger

>>
>>
>> Thank you very much for your help!
>>
>> Best regards,
>>
>> Raphael
>>
>>
>>> Am 05.05.2019 um 15:21 schrieb Roger Bivand <[hidden email]>:
>>>
>>> On Sat, 4 May 2019, Raphael Mesaric via R-sig-Geo wrote:
>>>
>>>> Dear all,
>>>>
>>>> I have a question with regards to the function „lagsarlm" from the package spdep. My problem is that the function is not terminating. Of course, I have quite a big grid (depending on the selection either 34000 or 60600 entries) and I also have a lot of explanatory variables (about 40). But I am still wondering whether there is something wrong.
>>>
>>> If you read the help page (the function is in the spatialreg package and will be dropped from spdep shortly), and look at the references (Bivand et al. 2013), you will see that the default value of the method= argument is "eigen". For small numbers of observations, solving the eigenproblem of a dense weights object is not a problem, but it becomes demanding on memory as n increases. You are using virtual memory (and may run out of that too), which makes your machine unresponsive. Choose an alternative method instead, typically "Matrix" for symmetric or similar-to-symmetric sparse weights, or "LU" for asymmetric sparse weights.
>>>
>>>> data(house, package="spData")
>>>> dim(house)
>>> [1] 25357    24
>>>> LO_nb
>>> Neighbour list object:
>>> Number of regions: 25357
>>> Number of nonzero links: 74874
>>> Percentage nonzero weights: 0.01164489
>>> Average number of links: 2.952794
>>>> lw <- spdep::nb2listw(LO_nb)
>>>> system.time(res <- spatialreg::lagsarlm(log(price) ~ TLA + frontage +
>>> + rooms + yrbuilt, data=house, listw=lw, method="Matrix"))
>>>   user  system elapsed
>>>  0.606   0.011   0.631
>>>
>>> so less than 1 second on a standard laptop for similar to symmetric very sparse weights and ~ 25000 observations. Less sparse weights take somewhat longer. The function does not (maybe yet) prevent users trying to do things that are not advisable, because maybe they have 128GB RAM or more, and want to use eigenvalues rather than sparse matrix methods.
>>>
>>> Hope this clarifies,
>>>
>>> Roger
>>>
>>>>
>>>> I tried to run a model based on the dataset „columbus“, and there I did not have any problems (but there are way fewer entries and variables). I also compared the format of the required inputs, but everything seemed to be equivalent to the inputs used for the „columbus“ model.
>>>>
>>>> Do you have any idea what might be the reason for the extremely long (or rather infinite; it has not terminated yet) computation time? Any suggestions are greatly appreciated.
>>>>
>>>> If you would like to, I can also upload the corresponding code. However, the code includes some MAT-Files as I got the data in MATLAB. I do not yet attach them here because I read that attachments in another format than PDF are not desired as they may contain malicious software.
>>>>
>>>> Thank you for your help in advance!
>>>>
>>>> Best regards,
>>>>
>>>> Raphael Mesaric
>>>> _______________________________________________
>>>> R-sig-Geo mailing list
>>>> [hidden email]
>>>> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
>>>>
>>>
>>> --
>>> Roger Bivand
>>> Department of Economics, Norwegian School of Economics,
>>> Helleveien 30, N-5045 Bergen, Norway.
>>> voice: +47 55 95 93 55; e-mail: [hidden email]
>>> https://orcid.org/0000-0003-2392-6140
>>> https://scholar.google.no/citations?user=AWeghB0AAAAJ&hl=en
>>
>
> [[alternative HTML version deleted]]
>
> _______________________________________________
> R-sig-Geo mailing list
> [hidden email]
> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
> --
Roger Bivand
Department of Economics, Norwegian School of Economics,
Helleveien 30, N-5045 Bergen, Norway.
voice: +47 55 95 93 55; e-mail: [hidden email]
https://orcid.org/0000-0003-2392-6140
https://scholar.google.no/citations?user=AWeghB0AAAAJ&hl=en
_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo
Roger Bivand
Department of Economics
Norwegian School of Economics
Helleveien 30
N-5045 Bergen, Norway

Fwd: Question spdep package - lagsarlm not terminating

Mon, 05/06/2019 - 09:35

> Dear Roger,
>
> Thank you very much for your fast response.
>
> My weight matrix stems from a rectangular (but not square) grid. So, I think I will have to use the „LU“ method.
>
> However, by doing that several other error messages popped up. The first one was the following:
>
> Error in spatialreg::errorsarlm(formula = formula, data = data, listw = listw,  :
>   NAs in lagged dependent variable
> In addition: Warning messages:
> 1: Function errorsarlm moved to the spatialreg package
> 2: In lag.listw(listw, y, zero.policy = zero.policy) :
>   NAs in lagged values
>
> And this happened even though I actually do not have NaN values in my dataset (I removed them in advance). When I completely reload all variables, it works sometimes for a few tries, but after a while the error message pops up again. Do you know why this might happen? And could I theoretically use a dataset with NaN values and have the function omit them?
>
>
> The second error message was the following:
>
> Error in solve.default(-(mat), tol.solve = tol.solve) :
>   system is computationally singular: reciprocal condition number = 2.71727e-26
>
> I found a reference online (related to glm) that this might be linked to explanatory variables which have a high correlation. Would eliminating some of the variables do the job here as well?
>
>
> Then I have two other questions, where I couldn’t find any answers online:
>
> Is there an option to eliminate specific rows of a nb object? I tried the following command:
>
> rook234x259_2 <- rook234x259[index],
>
> where index is a vector with the row numbers I would like to keep. But this converts the nb to a list object, and there is no list2nb function (as much as I am aware of). And the option „subset“ does not only remove the rows, but also the corresponding cells which leads to a different neighborhood matrix.
>
>
> The last question refers to the package spgwr. I would like to run this model as well. The gwr function runs successfully, but when calling the variable where I stored the result, I get the following error message instead of my coefficients:
>
> Error in abs(coef.se <- xm[, cs.ind, drop = FALSE]) :
>   non-numeric argument to mathematical function
> In addition: Warning messages:
> 1: In print.gwr(x) : NAs in coefficients dropped
> 2: In cbind(CM, coefficients(x$lm)) :
>   number of rows of result is not a multiple of vector length (arg 2)
>
> Again, there aren’t any NaN values in my dataset, so I can’t really imagine where the NAs in the coefficients come from.
>
>
> Thank you very much for your help!
>
> Best regards,
>
> Raphael
>
>
>> Am 05.05.2019 um 15:21 schrieb Roger Bivand <[hidden email]>:
>>
>> On Sat, 4 May 2019, Raphael Mesaric via R-sig-Geo wrote:
>>
>>> Dear all,
>>>
>>> I have a question with regards to the function „lagsarlm" from the package spdep. My problem is that the function is not terminating. Of course, I have quite a big grid (depending on the selection either 34000 or 60600 entries) and I also have a lot of explanatory variables (about 40). But I am still wondering whether there is something wrong.
>>
>> If you read the help page (the function is in the spatialreg package and will be dropped from spdep shortly), and look at the references (Bivand et al. 2013), you will see that the default value of the method= argument is "eigen". For small numbers of observations, solving the eigenproblem of a dense weights object is not a problem, but it becomes demanding on memory as n increases. You are using virtual memory (and may run out of that too), which makes your machine unresponsive. Choose an alternative method instead, typically "Matrix" for symmetric or similar-to-symmetric sparse weights, or "LU" for asymmetric sparse weights.
>>
>>> data(house, package="spData")
>>> dim(house)
>> [1] 25357    24
>>> LO_nb
>> Neighbour list object:
>> Number of regions: 25357
>> Number of nonzero links: 74874
>> Percentage nonzero weights: 0.01164489
>> Average number of links: 2.952794
>>> lw <- spdep::nb2listw(LO_nb)
>>> system.time(res <- spatialreg::lagsarlm(log(price) ~ TLA + frontage +
>> + rooms + yrbuilt, data=house, listw=lw, method="Matrix"))
>>   user  system elapsed
>>  0.606   0.011   0.631
>>
>> so less than 1 second on a standard laptop for similar to symmetric very sparse weights and ~ 25000 observations. Less sparse weights take somewhat longer. The function does not (maybe yet) prevent users trying to do things that are not advisable, because maybe they have 128GB RAM or more, and want to use eigenvalues rather than sparse matrix methods.
>>
>> Hope this clarifies,
>>
>> Roger
>>
>>>
>>> I tried to run a model based on the dataset „columbus“, and there I did not have any problems (but there are way fewer entries and variables). I also compared the format of the required inputs, but everything seemed to be equivalent to the inputs used for the „columbus“ model.
>>>
>>> Do you have any idea what might be the reason for the extremely long (or rather infinite; it has not terminated yet) computation time? Any suggestions are greatly appreciated.
>>>
>>> If you would like to, I can also upload the corresponding code. However, the code includes some MAT-Files as I got the data in MATLAB. I do not yet attach them here because I read that attachments in another format than PDF are not desired as they may contain malicious software.
>>>
>>> Thank you for your help in advance!
>>>
>>> Best regards,
>>>
>>> Raphael Mesaric
>>> _______________________________________________
>>> R-sig-Geo mailing list
>>> [hidden email]
>>> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
>>>
>>
>> --
>> Roger Bivand
>> Department of Economics, Norwegian School of Economics,
>> Helleveien 30, N-5045 Bergen, Norway.
>> voice: +47 55 95 93 55; e-mail: [hidden email]
>> https://orcid.org/0000-0003-2392-6140
>> https://scholar.google.no/citations?user=AWeghB0AAAAJ&hl=en
>
        [[alternative HTML version deleted]]

_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

Re: Spatial join/intersection in R

Mon, 05/06/2019 - 06:25


On 5/6/19 12:23 PM, Edzer Pebesma wrote:
> # raster does the same as stars:

That was too quick: raster::extract returns cell values,
stars::st_intersects returns cell indexes. If we fill the raster with
values 100:1 rather than 1:100, we see

library(stars)
# Loading required package: abind
# Loading required package: sf
# Linking to GEOS 3.7.0, GDAL 2.4.0, PROJ 5.2.0

# create 10 x 10 raster
st_as_stars(matrix(100:1, 10, 10)) %>%
        st_set_dimensions(names = c("x", "y")) %>%
        st_set_dimensions("x", values = 0:9) %>%
        st_set_dimensions("y", values = 9:0) -> r

# three points: at node, edge, and inside cell:
p3 = st_sfc(st_point(c(1,1)), st_point(c(1, 1.5)), st_point(c(1.1, 1.8)))

st_intersects(r, p3, transpose = TRUE)
# Sparse geometry binary predicate list of length 3, where the predicate
was `intersects'
#  1: 82
#  2: 72
#  3: 72
library(raster)
# Loading required package: sp
extract(as(r, "Raster"), as(p3, "Spatial"))
# [1] 19 29 29


--
Edzer Pebesma
Institute for Geoinformatics
Heisenbergstrasse 2, 48151 Muenster, Germany
Phone: +49 251 8333081

_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: Spatial join/intersection in R

Mon, 05/06/2019 - 05:23
I guess the problem is that if you represent a regular raster with
polygons, the polygons won't tell you, for points falling on polygon
edges or nodes, that they should intersect with only one polygon rather
than with two (edge) or four (node) polygons.

The reprex below shows the difference between intersecting three
points with a raster represented as polygons (at the end) and as a stars
object (at the beginning). The stars object considers only the left and
upper edge and the upper-left corner as part of the grid cell; the
resulting plot is attached.

library(stars)
# Loading required package: abind
# Loading required package: sf
# Linking to GEOS 3.7.0, GDAL 2.4.0, PROJ 5.2.0

# create 10 x 10 raster
st_as_stars(matrix(1:100, 10, 10)) %>%
        st_set_dimensions(names = c("x", "y")) %>%
        st_set_dimensions("x", values = 0:9) %>%
        st_set_dimensions("y", values = 9:0) -> r

# three points: at node, edge, and inside cell:
p3 = st_sfc(st_point(c(1,1)), st_point(c(1, 1.5)), st_point(c(1.1, 1.8)))

image(r, axes = TRUE, text_values = TRUE)
plot(p3, col = 'green', add = TRUE)

# query three points:
st_intersects(r, p3, transpose = TRUE)
# Sparse geometry binary predicate list of length 3, where the predicate
was `intersects'
#  1: 82
#  2: 72
#  3: 72

# query three points with same raster as polygons:
st_intersects(p3, st_as_sf(r))
# Sparse geometry binary predicate list of length 3, where the predicate
was `intersects'
#  1: 71, 72, 81, 82
#  2: 71, 72
#  3: 72

# raster does the same as stars:
library(raster)
# Loading required package: sp
extract(as(r, "Raster"), as(p3, "Spatial"))
# [1] 82 72 72


On 5/5/19 4:49 AM, Roozbeh Valavi wrote:
> Thank you for your explanation, Tom.
>
> Let me explain the case clearly. 
> The point layer is a very dense grid field collection across Great
> Britain. I wanted to do spatial cross-validation by specifying each
> point in one of the polygons (and assign a group of the polygons to a
> fold). Since the points have been sampled on a regular grid they
> sometimes fall exactly on the border of the polygons. If I use any sort
> of intersection or spatial join, the points on the borders cause a problem. 
>
> Currently, I use st_jitter() in sf package give a bit of randomness to
> the points. I wanted to see if there is another systematic way to avoid
> this.
>
> Thanks,
> Roozbeh
>
> PhD Candidate
> The Quantitative & Applied Ecology Group <http://qaeco.com/>
> School of BioSciences | Faculty of Science
> The University of Melbourne, VIC 3010, Australia
> M: +61 423 283 238  E: [hidden email]
> <mailto:[hidden email]>
>
>
> On Wed, May 1, 2019 at 3:16 PM Tom Philippi <[hidden email]
> <mailto:[hidden email]>> wrote:
>
>     Roozbeh--
>
>     What do you want to have happen with those points? Given the units
>     in your figure, you could unambiguously assign them to the
>     upper-right by adding a small value (0.000001) to both the X and Y
>     coordinates of your points.  Whether that is a sensible thing to do
>     depends on what your data are.
>
>     How or why do the points fall exactly on your polygon boundaries? 
>     What process puts them exactly on the polygon (grid cell)
>     boundaries?  It appears your polygon boundaries are a grid at
>     multiples of 0.05, although not all potential cells in a rectangle
>     are shown.  Are there many other points not on polygon boundaries
>     not shown in your figure, or are points only along grid lines?  If
>     the latter, does it make sense to assign them to _polygons_ rather
>     than line segments?
>
>     If you're doing a sample draw or measuring locations with only a few
>     decimal points of accuracy, and generating your polygon boundaries
>     at exact multiples of 0.05, then it might make sense to shift all of
>     your points + 1/10th of your point location resolution so that their
>     coordinates never fall on polygon boundaries.  There _might_ be
>     situations where instead of always adding (shifting up and right),
>     you should randomly add or subtract in each direction for each
>     point, but I'm stuck viewing this as points in sample units so I
>     can't think of such a case.
>
>     Tom
>     .  
>
>     On Tue, Apr 30, 2019 at 7:18 PM Roozbeh Valavi
>     <[hidden email]
>     <mailto:[hidden email]>> wrote:
>
>         Dear members,
>
>         I want to do a spatial join/intersection in R. The problem is
>         that some of my points are lying exactly at the border of the
>         adjacent polygons (see the figure attached). So these points are
>         either falling in both or none of the polygons! Is there any way
>         to avoid this?
>
>         Thanks in advance.
>         Roozbeh
>
>
>         image.png
>
>
>
>         --
>         *Roozbeh Valavi*
>         PhD Candidate
>         The Quantitative & Applied Ecology Group <http://qaeco.com/>
>         School of BioSciences | Faculty of Science
>         The University of Melbourne, VIC 3010, Australia
>         Mobile: +61 423 283 238
>         _______________________________________________
>         R-sig-Geo mailing list
>         [hidden email] <mailto:[hidden email]>
>         https://stat.ethz.ch/mailman/listinfo/r-sig-geo
>
>
> _______________________________________________
> R-sig-Geo mailing list
> [hidden email]
> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
> --
Edzer Pebesma
Institute for Geoinformatics
Heisenbergstrasse 2, 48151 Muenster, Germany
Phone: +49 251 8333081

_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Mark variograms: 3D(xyt) points + 1D marks

Mon, 05/06/2019 - 05:22
Dear all,

Are there packages that produce empirical *mark* variograms of 3D data (xyt) with 1D marks? I'm aware of the useful mvstpp package for creating spatial and temporal mark variograms but it doesn't completely fit my data structure.

Thanks, Tim.

        [[alternative HTML version deleted]]

_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

Re: Question spdep package - lagsarlm not terminating

Sun, 05/05/2019 - 08:21
On Sat, 4 May 2019, Raphael Mesaric via R-sig-Geo wrote:

> Dear all,
>
> I have a question with regards to the function „lagsarlm" from the
> package spdep. My problem is that the function is not terminating. Of
> course, I have quite a big grid (depending on the selection either 34000
> or 60600 entries) and I also have a lot of explanatory variables (about
> 40). But I am still wondering whether there is something wrong.

If you read the help page (the function is in the spatialreg package and
will be dropped from spdep shortly), and look at the references (Bivand et
al. 2013), you will see that the default value of the method= argument is
"eigen". For small numbers of observations, solving the eigenproblem of a
dense weights object is not a problem, but it becomes demanding on memory as
n increases. You are using virtual memory (and may run out of that too),
which makes your machine unresponsive. Choose an alternative method
instead, typically "Matrix" for symmetric or similar-to-symmetric sparse
weights, or "LU" for asymmetric sparse weights.

> data(house, package="spData")
> dim(house)
[1] 25357    24
> LO_nb
Neighbour list object:
Number of regions: 25357
Number of nonzero links: 74874
Percentage nonzero weights: 0.01164489
Average number of links: 2.952794
> lw <- spdep::nb2listw(LO_nb)
> system.time(res <- spatialreg::lagsarlm(log(price) ~ TLA + frontage +
+ rooms + yrbuilt, data=house, listw=lw, method="Matrix"))
    user  system elapsed
   0.606   0.011   0.631

so less than 1 second on a standard laptop for similar to symmetric very
sparse weights and ~ 25000 observations. Less sparse weights take somewhat
longer. The function does not (maybe yet) prevent users trying to do
things that are not advisable, because maybe they have 128GB RAM or more,
and want to use eigenvalues rather than sparse matrix methods.

Hope this clarifies,

Roger

>
> I tried to run a model based on the dataset „columbus“, and there I did
> not have any problems (but there are way fewer entries and variables). I
> also compared the format of the required inputs, but everything seemed
> to be equivalent to the inputs used for the „columbus“ model.
>
> Do you have any idea what might be the reason for the extremely long
> (or rather infinite; it has not terminated yet) computation time? Any
> suggestions are greatly appreciated.
>
> If you would like to, I can also upload the corresponding code. However,
> the code includes some MAT-Files as I got the data in MATLAB. I do not
> yet attach them here because I read that attachments in another format
> than PDF are not desired as they may contain malicious software.
>
> Thank you for your help in advance!
>
> Best regards,
>
> Raphael Mesaric
> _______________________________________________
> R-sig-Geo mailing list
> [hidden email]
> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
> --
Roger Bivand
Department of Economics, Norwegian School of Economics,
Helleveien 30, N-5045 Bergen, Norway.
voice: +47 55 95 93 55; e-mail: [hidden email]
https://orcid.org/0000-0003-2392-6140
https://scholar.google.no/citations?user=AWeghB0AAAAJ&hl=en
_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo
Roger Bivand
Department of Economics
Norwegian School of Economics
Helleveien 30
N-5045 Bergen, Norway

Re: Strange spatial reference system netCDF

Sun, 05/05/2019 - 08:12
Great, you were right. After adding the ubuntugis-unstable repository to my
system I got the same setup: Linking to GEOS 3.7.0, GDAL 2.4.0, PROJ 5.2.0.
And now everything is fine!
Thanks a lot.
Now I'm working on saving this new R object to a new netCDF file, but I will
sort that out on my own, no problem.
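
In case it is useful to others, one possible route (an untested sketch; the output file name is made up) is to warp the object onto a regular longitude-latitude grid first and then write it out with GDAL via write_stars(), r being the object read with read_ncdf() earlier in this thread:

library(stars)
r_reg <- st_warp(r, crs = st_crs(4326))    # regularise onto EPSG:4326
write_stars(r_reg[,,,1], "tas_4326.tif")   # GeoTIFF here; other GDAL-writable formats work too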

--
Maurizio Marchi,
PhD Forest Science - Ecological Mathematics
Skype ID: maurizioxyz
linux user 552742
#EUFGIS national Focal point for Italy
#Annals of Silvicultural Research Associated Editor
https://scholar.google.it/citations?hl=en&user=_2X6fu8AAAAJ

        [[alternative HTML version deleted]]

_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

Re: Spatial join/intersection in R

Sat, 05/04/2019 - 21:49
Thank you for your explanation, Tom.
Let me explain the case clearly. The point layer is a very dense grid field collection across Great Britain. I wanted to do spatial cross-validation by assigning each point to one of the polygons (and assigning a group of the polygons to a fold). Since the points have been sampled on a regular grid, they sometimes fall exactly on the border of the polygons. If I use any sort of intersection or spatial join, the points on the borders cause a problem.

Currently, I use st_jitter() in the sf package to give a bit of randomness to the points. I wanted to see if there is a more systematic way to avoid this.
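
For the record, a sketch of both options ("pts" and "polys" stand for the point and polygon layers):

library(sf)
joined <- st_join(st_jitter(pts, amount = 0.001), polys)   # tiny random displacement
## or a deterministic nudge up and to the right:
pts2 <- pts
st_geometry(pts2) <- st_geometry(pts) + c(1e-6, 1e-6)
st_crs(pts2) <- st_crs(pts)        # arithmetic on geometries drops the CRS
joined <- st_join(pts2, polys)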
Thanks,
Roozbeh

PhD Candidate
The Quantitative & Applied Ecology Group
School of BioSciences | Faculty of Science
The University of Melbourne, VIC 3010, Australia
M: +61 423 283 238  E: [hidden email]


On Wed, May 1, 2019 at 3:16 PM Tom Philippi <[hidden email]> wrote:
Roozbeh--
What do you want to have happen with those points? Given the units in your figure, you could unambiguously assign them to the upper-right by adding a small value (0.000001) to both the X and Y coordinates of your points.  Whether that is a sensible thing to do depends on what your data are.
How or why do the points fall exactly on your polygon boundaries?  What process puts them exactly on the polygon (grid cell) boundaries?  It appears your polygon boundaries are a grid at multiples of 0.05, although not all potential cells in a rectangle are shown.  Are there many other points not on polygon boundaries not shown in your figure, or are points only along grid lines?  If the latter, does it make sense to assign them to _polygons_ rather than line segments?
If you're doing a sample draw or measuring locations with only a few decimal points of accuracy, and generating your polygon boundaries at exact multiples of 0.05, then it might make sense to shift all of your points + 1/10th of your point location resolution so that their coordinates never fall on polygon boundaries.  There _might_ be situations where instead of always adding (shifting up and right), you should randomly add or subtract in each direction for each point, but I'm stuck viewing this as points in sample units so I can't think of such a case.
Tom.
On Tue, Apr 30, 2019 at 7:18 PM Roozbeh Valavi <[hidden email]> wrote:
Dear members,

I want to do a spatial join/intersection in R. The problem is that some of my points are lying exactly at the border of the adjacent polygons (see the figure attached). So these points are either falling in both or none of the polygons! Is there any way to avoid this?
Thanks in advance.
Roozbeh





--
Roozbeh Valavi
PhD Candidate
The Quantitative & Applied Ecology Group
School of BioSciences | Faculty of Science
The University of Melbourne, VIC 3010, Australia
Mobile: +61 423 283 238
_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

Question spdep package - lagsarlm not terminating

Sat, 05/04/2019 - 15:22
Dear all,

I have a question with regards to the function „lagsarlm" from the package spdep. My problem is that the function is not terminating. Of course, I have quite a big grid (depending on the selection either 34000 or 60600 entries) and I also have a lot of explanatory variables (about 40). But I am still wondering whether there is something wrong.

I tried to run a model based on the dataset „columbus“, and there I did not have any problems (but there are way fewer entries and variables). I also compared the format of the required inputs, but everything seemed to be equivalent to the inputs used for the „columbus“ model.

Do you have any idea what might be the reason for the extremely long (or rather infinite; it has not terminated yet) computation time? Any suggestions are greatly appreciated.

If you would like to, I can also upload the corresponding code. However, the code includes some MAT-Files as I got the data in MATLAB. I do not yet attach them here because I read that attachments in another format than PDF are not desired as they may contain malicious software.

Thank you for your help in advance!

Best regards,

Raphael Mesaric
_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

Re: R-sig-Geo Digest, Vol 189, Issue 3

Sat, 05/04/2019 - 12:24
Thanks, I'll check it out.

On Sat, 4 May 2019 at 18:40, Edzer Pebesma <[hidden email]> wrote:

> The problem might be in the PROJ version; PROJ has seen a lot of
> development the last few years. I'm on
>
> Linking to GEOS 3.7.0, GDAL 2.4.0, PROJ 5.2.0
>
> Since you are on ubuntu, it should be pretty easy to get that too; see
> sf github pages for installing from ppa:ubuntugis-unstable:
> https://github.com/r-spatial/sf
>
> On 5/4/19 5:19 PM, Maurizio Marchi wrote:
> > Dear Prof.
> > Many thanks for your quick reply. Anyway I still have some differences
> > with your output, as well as some errors in the R console. Actually,
> > practically speaking, in my .png plot (attached) the origin of the axes
> > is still in Germany and the left and right borders of the image are
> > still rectilinear.
> >
> > *Here my R output using your code with the sessionInfo() at the end. I
> > hope it helps*
> >
> > R version 3.6.0 (2019-04-26) -- "Planting of a Tree"
> > Copyright (C) 2019 The R Foundation for Statistical Computing
> > Platform: x86_64-pc-linux-gnu (64-bit)
> >
> > R is free software and comes with ABSOLUTELY NO WARRANTY. You are welcome
> > to redistribute it under certain conditions. Type 'license()' or
> > 'licence()' for distribution details. Natural language support but
> > running in an English locale. R is a collaborative project with many
> > contributors. Type 'contributors()' for more information and 'citation()'
> > on how to cite R or R packages in publications. Type 'demo()' for some
> > demos, 'help()' for on-line help, or 'help.start()' for an HTML browser
> > interface to help. Type 'q()' to quit R.
> >
> > > f = 'tas_rcp85_land-rcm_eur_12km_01_mon_198012-208011.nc'
> > > library(stars)
> > Loading required package: abind
> > Loading required package: sf
> > Linking to GEOS 3.6.2, GDAL 2.2.3, PROJ 4.9.3
> > > r = read_ncdf(f, ncsub = cbind(start = c(1, 1, 1, 1), count = c(418, 406, 3, 1)), eps=1e-3)
> > > rx = read_stars(f, proxy = TRUE) # only for the crs!
> > Warning messages:
> > 1: In CPL_read_gdal(as.character(x), as.character(options), as.character(driver), :
> >   GDAL Message 1: The dataset has several variables that could be identified
> >   as vector fields, but not all share the same primary dimension.
> >   Consequently they will be ignored.
> > 2: In CPL_read_gdal(as.character(x), as.character(options), as.character(driver), :
> >   GDAL Message 1: dimension #1 (time) is not a Time dimension.
> > 3: In CPL_read_gdal(as.character(x), as.character(options), as.character(driver), :
> >   GDAL Message 1: dimension #0 (ensemble_member) is not a Time dimension.
> > > st_crs(r) = st_crs(rx)
> > > r0 = stars:::st_transform_proj.stars(r, 4326)
> > > plot(r0[,,,1], border = NA, axes = TRUE, reset = FALSE)
> > Warning messages:
> > 1: In classInt::classIntervals(na.omit(values), min(nbreaks - 1, n.unq), :
> >   N is large, and some styles will run very slowly; sampling imposed
> > 2: In st_is_longlat(x) : bounding box has potentially an invalid value range for longlat data
> > 3: In st_is_longlat(x) : bounding box has potentially an invalid value range for longlat data
> > 4: In st_is_longlat(x) : bounding box has potentially an invalid value range for longlat data
> > 5: In st_is_longlat(x) : bounding box has potentially an invalid value range for longlat data
> > 6: In st_is_longlat(x) : bounding box has potentially an invalid value range for longlat data
> > > library(rnaturalearth)
> > > plot(ne_coastline(returnclass = "sf"), add = TRUE, col = 'orange')
> > Warning message:
> > In plot.sf(ne_coastline(returnclass = "sf"), add = TRUE, col = "orange") :
> >   ignoring all but the first attribute
> > > sessionInfo()
> > R version 3.6.0 (2019-04-26)
> > Platform: x86_64-pc-linux-gnu (64-bit)
> > Running under: Ubuntu 18.04.2 LTS
> >
> > Matrix products: default
> > BLAS:   /usr/lib/x86_64-linux-gnu/blas/libblas.so.3.7.1
> > LAPACK: /usr/lib/x86_64-linux-gnu/lapack/liblapack.so.3.7.1
> >
> > locale:
> >  [1] LC_CTYPE=en_GB.UTF-8 LC_NUMERIC=C LC_TIME=it_IT.UTF-8
> >      LC_COLLATE=en_GB.UTF-8 LC_MONETARY=it_IT.UTF-8 LC_MESSAGES=en_GB.UTF-8
> >  [7] LC_PAPER=it_IT.UTF-8 LC_NAME=C LC_ADDRESS=C LC_TELEPHONE=C
> >      LC_MEASUREMENT=it_IT.UTF-8 LC_IDENTIFICATION=C
> >
> > attached base packages:
> > [1] stats graphics grDevices utils datasets methods base
> >
> > other attached packages:
> > [1] rnaturalearth_0.1.0 stars_0.3-1 sf_0.7-4 abind_1.4-5
> >
> > loaded via a namespace (and not attached):
> >  [1] Rcpp_1.0.1 magrittr_1.5 units_0.6-3 tidyselect_0.2.5 lattice_0.20-38 R6_2.4.0 rlang_0.3.4
> >  [8] rnaturalearthdata_0.2.0 dplyr_0.8.0.1 tools_3.6.0 parallel_3.6.0 grid_3.6.0 KernSmooth_2.23-15 ncmeta_0.0.4
> > [15] e1071_1.7-1 DBI_1.0.0 class_7.3-15 assertthat_0.2.1 tibble_2.1.1 RNetCDF_1.9-1 crayon_1.3.4
> > [22] tidyr_0.8.3 purrr_0.3.2 PCICt_0.5-4.1 glue_1.3.1 sp_1.3-1 compiler_3.6.0 pillar_1.3.1
> > [29] classInt_0.3-3 pkgconfig_2.0.2
> >
> >
> > --
> > Maurizio Marchi,
> > PhD Forest Science - Ecological Mathematics
> > Skype ID: maurizioxyz
> > linux user 552742
> > #EUFGIS national Focal point for Italy
> > #Annals of Silvicultural Research Associated Editor
> > https://scholar.google.it/citations?hl=en&user=_2X6fu8AAAAJ
> > <https://scholar.google.it/citations?hl=en&user=_2X6fu8AAAAJ*>
> >
> >
> >
> >
> > Il giorno sab 4 mag 2019 alle ore 12:10 <[hidden email]
> > <mailto:[hidden email]>> ha scritto:
> >
> >     Send R-sig-Geo mailing list submissions to
> >             [hidden email] <mailto:[hidden email]>
> >
> >     To subscribe or unsubscribe via the World Wide Web, visit
> >             https://stat.ethz.ch/mailman/listinfo/r-sig-geo
> >     or, via email, send a message with subject or body 'help' to
> >             [hidden email]
> >     <mailto:[hidden email]>
> >
> >     You can reach the person managing the list at
> >             [hidden email]
> >     <mailto:[hidden email]>
> >
> >     When replying, please edit your Subject line so it is more specific
> >     than "Re: Contents of R-sig-Geo digest..."
> >
> >
> >     Today's Topics:
> >
> >        1. Re: Strange spatial reference system netCDF (Edzer Pebesma)
> >
> >
>  ----------------------------------------------------------------------
> >
> >     Message: 1
> >     Date: Fri, 3 May 2019 16:58:01 +0200
> >     From: Edzer Pebesma <[hidden email]
> >     <mailto:[hidden email]>>
> >     To: [hidden email] <mailto:[hidden email]>
> >     Subject: Re: [R-sig-Geo] Strange spatial reference system netCDF
> >     Message-ID: <[hidden email]
> >     <mailto:[hidden email]>>
> >     Content-Type: text/plain; charset="utf-8"
> >
> >     Dear Maurizio,
> >
> >     an attempt to do something along this line is:
> >
> >     f = "tas_rcp85_land-rcm_eur_12km_01_mon_198012-208011.nc
> >     <http://tas_rcp85_land-rcm_eur_12km_01_mon_198012-208011.nc>"
> >     library(stars)
> >     r = read_ncdf(f, ncsub = cbind(start = c(1, 1, 1, 1), count = c(418,
> >     406, 3, 1)), eps=1e-3)
> >     rx = read_stars(f, proxy = TRUE) # only for the crs!
> >     st_crs(r) = st_crs(rx)
> >     r0 = stars:::st_transform_proj.stars(r, 4326)
> >     png("x.png")
> >     plot(r0[,,,1], border = NA, axes = TRUE, reset = FALSE)
> >     library(rnaturalearth)
> >     plot(ne_coastline(returnclass = "sf"), add = TRUE, col = 'orange')
> >
> >     Note that the output object time stamps respect the 360-day calendar
> >     (using pkg PCICt).
> >
> >     You'll find some discussion, and the output image, here:
> >     https://github.com/r-spatial/stars/issues/175
> >
> >     Feel free to post follow-up questions here, or to github.
> >
> >     On 5/2/19 6:49 PM, Maurizio Marchi wrote:
> >     > Dear list,
> >     > I'm working with large netCDF files from the UK Climate
> >     Projections portal (
> >     > https://www.metoffice.gov.uk/research/collaboration/ukcp). More in
> >     detail
> >     > I'm reading with the ncdf4 library this
> >     >
> >     <
> http://data.ceda.ac.uk/badc/ukcp18/data/land-rcm/eur/12km/rcp85/01/tas/mon/latest/
> >
> >     > file. Once loaded O can easily handle it but it is a strange
> reference
> >     > system as follow:
> >     > rotated_latitude_longitude[]
> >     >             grid_mapping_name: rotated_latitude_longitude
> >     >             longitude_of_prime_meridian: 0
> >     >             earth_radius: *6371229*
> >     >             grid_north_pole_latitude: 39.25
> >     >             grid_north_pole_longitude: 198
> >     >             north_pole_grid_longitude: 0
> >     > If I load the .nc file in QGIS I see that the SR is "+proj=longlat
> >     > +a=6371229 +b=6371229 +no_defs" where the values associated to *a*
> and
> >     > *b* variables
> >     > are the earth radius. Even if QGIS can read it, the raw file isn't
> >     > projected properly (it seems tha QGIS is not able to handle such
> >     > projection).
> >     > Finally, using the ncdf4+raster libraries, I can easily generate e
> >     raster
> >     > image but then I'm not able to understand I could I re project
> >     this raster
> >     > in WGS84 reference system.
> >     > Any helps?
> >     > Cheers
> >     >
> >     > --
> >     > Maurizio Marchi,
> >     > PhD Forest Science - Ecological Mathematics
> >     > Skype ID: maurizioxyz
> >     > linux user 552742
> >     > #EUFGIS national Focal point for Italy
> >     > #Annals of Silvicultural Research Associated Editor
> >     > https://scholar.google.it/citations?hl=en&user=_2X6fu8AAAAJ
> >     > <https://scholar.google.it/citations?hl=en&user=_2X6fu8AAAAJ*>
> >     >
> >     >       [[alternative HTML version deleted]]
> >     >
> >     > _______________________________________________
> >     > R-sig-Geo mailing list
> >     > [hidden email] <mailto:[hidden email]>
> >     > https://stat.ethz.ch/mailman/listinfo/r-sig-geo
> >     >
> >
> >     --
> >     Edzer Pebesma
> >     Institute for Geoinformatics
> >     Heisenbergstrasse 2, 48151 Muenster, Germany
> >     Phone: +49 251 8333081
> >
> >     -------------- next part --------------
> >     A non-text attachment was scrubbed...
> >     Name: pEpkey.asc
> >     Type: application/pgp-keys
> >     Size: 2472 bytes
> >     Desc: not available
> >     URL:
> >     <
> https://stat.ethz.ch/pipermail/r-sig-geo/attachments/20190503/34eee3d8/attachment-0001.bin
> >
> >
> >
> >     ------------------------------
> >
> >     Subject: Digest Footer
> >
> >     _______________________________________________
> >     R-sig-Geo mailing list
> >     [hidden email] <mailto:[hidden email]>
> >     https://stat.ethz.ch/mailman/listinfo/r-sig-geo
> >
> >
> >     ------------------------------
> >
> >     End of R-sig-Geo Digest, Vol 189, Issue 3
> >     *****************************************
> >
>
> --
> Edzer Pebesma
> Institute for Geoinformatics
> Heisenbergstrasse 2, 48151 Muenster, Germany
> Phone: +49 251 8333081
> --
--
Maurizio Marchi,
PhD Forest Science - Ecological Mathematics
Skype ID: maurizioxyz
linux user 552742
#EUFGIS national Focal point for Italy
#Annals of Silvicultural Research Associated Editor
https://scholar.google.it/citations?hl=en&user=_2X6fu8AAAAJ

        [[alternative HTML version deleted]]

_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

Re: R-sig-Geo Digest, Vol 189, Issue 3

Sat, 05/04/2019 - 11:40
The problem might be in the PROJ version; PROJ has seen a lot of
development over the last few years. I'm on

Linking to GEOS 3.7.0, GDAL 2.4.0, PROJ 5.2.0

Since you are on Ubuntu, it should be pretty easy to get that too; see
the sf GitHub page for installing from ppa:ubuntugis-unstable:
https://github.com/r-spatial/sf
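
Once you have upgraded, a quick way to check from within R which external
library versions sf is actually linked against (sf_extSoftVersion() has been
in sf for a while) is:

library(sf)
# report the GEOS, GDAL and PROJ versions this sf build is linked against
sf_extSoftVersion()
# e.g. GEOS "3.7.0", GDAL "2.4.0", proj.4 "5.2.0"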

On 5/4/19 5:19 PM, Maurizio Marchi wrote:
> Dear Prof.
> Many thanks for your quick reply. However, I still get some differences
> from your output, as well as some errors in the R console. Practically
> speaking, in my .png plot (attached) the origin of the axes is still in
> Germany and the left and right borders of the image are still rectilinear.
>
> *Here is my R output using your code, with sessionInfo() at the end. I
> hope it helps.*
>
> R version 3.6.0 (2019-04-26) -- "Planting of a Tree"
> Copyright (C) 2019 The R Foundation for Statistical Computing
> Platform: x86_64-pc-linux-gnu (64-bit)
>
> R is free software and comes with ABSOLUTELY NO WARRANTY. You are welcome to
> redistribute it under certain conditions. Type 'license()' or 'licence()' for
> distribution details. Natural language support but running in an English locale.
> R is a collaborative project with many contributors. Type 'contributors()' for
> more information and 'citation()' on how to cite R or R packages in publications.
> Type 'demo()' for some demos, 'help()' for on-line help, or 'help.start()' for
> an HTML browser interface to help. Type 'q()' to quit R.
>
> > f = 'tas_rcp85_land-rcm_eur_12km_01_mon_198012-208011.nc'
> > library(stars)
> Loading required package: abind
> Loading required package: sf
> Linking to GEOS 3.6.2, GDAL 2.2.3, PROJ 4.9.3
> > r = read_ncdf(f, ncsub = cbind(start = c(1, 1, 1, 1), count = c(418, 406, 3, 1)), eps=1e-3)
> > rx = read_stars(f, proxy = TRUE) # only for the crs!
> Warning messages:
> 1: In CPL_read_gdal(as.character(x), as.character(options), as.character(driver), :
>    GDAL Message 1: The dataset has several variables that could be identified as
>    vector fields, but not all share the same primary dimension. Consequently they
>    will be ignored.
> 2: In CPL_read_gdal(as.character(x), as.character(options), as.character(driver), :
>    GDAL Message 1: dimension #1 (time) is not a Time dimension.
> 3: In CPL_read_gdal(as.character(x), as.character(options), as.character(driver), :
>    GDAL Message 1: dimension #0 (ensemble_member) is not a Time dimension.
> > st_crs(r) = st_crs(rx)
> > r0 = stars:::st_transform_proj.stars(r, 4326)
> > plot(r0[,,,1], border = NA, axes = TRUE, reset = FALSE)
> Warning messages:
> 1: In classInt::classIntervals(na.omit(values), min(nbreaks - 1, n.unq), :
>    N is large, and some styles will run very slowly; sampling imposed
> 2: In st_is_longlat(x) : bounding box has potentially an invalid value range for longlat data
> 3: In st_is_longlat(x) : bounding box has potentially an invalid value range for longlat data
> 4: In st_is_longlat(x) : bounding box has potentially an invalid value range for longlat data
> 5: In st_is_longlat(x) : bounding box has potentially an invalid value range for longlat data
> 6: In st_is_longlat(x) : bounding box has potentially an invalid value range for longlat data
> > library(rnaturalearth)
> > plot(ne_coastline(returnclass = "sf"), add = TRUE, col = 'orange')
> Warning message:
> In plot.sf(ne_coastline(returnclass = "sf"), add = TRUE, col = "orange") :
>    ignoring all but the first attribute
> > sessionInfo()
> R version 3.6.0 (2019-04-26)
> Platform: x86_64-pc-linux-gnu (64-bit)
> Running under: Ubuntu 18.04.2 LTS
>
> Matrix products: default
> BLAS:   /usr/lib/x86_64-linux-gnu/blas/libblas.so.3.7.1
> LAPACK: /usr/lib/x86_64-linux-gnu/lapack/liblapack.so.3.7.1
>
> locale:
>  [1] LC_CTYPE=en_GB.UTF-8    LC_NUMERIC=C               LC_TIME=it_IT.UTF-8
>      LC_COLLATE=en_GB.UTF-8  LC_MONETARY=it_IT.UTF-8    LC_MESSAGES=en_GB.UTF-8
>  [7] LC_PAPER=it_IT.UTF-8    LC_NAME=C                  LC_ADDRESS=C
>      LC_TELEPHONE=C          LC_MEASUREMENT=it_IT.UTF-8 LC_IDENTIFICATION=C
>
> attached base packages:
> [1] stats graphics grDevices utils datasets methods base
>
> other attached packages:
> [1] rnaturalearth_0.1.0 stars_0.3-1 sf_0.7-4 abind_1.4-5
>
> loaded via a namespace (and not attached):
>  [1] Rcpp_1.0.1 magrittr_1.5 units_0.6-3 tidyselect_0.2.5 lattice_0.20-38 R6_2.4.0 rlang_0.3.4
>  [8] rnaturalearthdata_0.2.0 dplyr_0.8.0.1 tools_3.6.0 parallel_3.6.0 grid_3.6.0 KernSmooth_2.23-15 ncmeta_0.0.4
> [15] e1071_1.7-1 DBI_1.0.0 class_7.3-15 assertthat_0.2.1 tibble_2.1.1 RNetCDF_1.9-1 crayon_1.3.4
> [22] tidyr_0.8.3 purrr_0.3.2 PCICt_0.5-4.1 glue_1.3.1 sp_1.3-1 compiler_3.6.0 pillar_1.3.1
> [29] classInt_0.3-3 pkgconfig_2.0.2
>
>
> --
> Maurizio Marchi,
> PhD Forest Science - Ecological Mathematics
> Skype ID: maurizioxyz
> linux user 552742
> #EUFGIS national Focal point for Italy
> #Annals of Silvicultural Research Associated Editor
> https://scholar.google.it/citations?hl=en&user=_2X6fu8AAAAJ
>
>
>
>
> On Sat, 4 May 2019 at 12:10, <[hidden email]> wrote:
>
>     [quoted copy of R-sig-Geo Digest, Vol 189, Issue 3 trimmed; Edzer Pebesma's message "Re: Strange spatial reference system netCDF" appears in full further down this page]
> --
Edzer Pebesma
Institute for Geoinformatics
Heisenbergstrasse 2, 48151 Muenster, Germany
Phone: +49 251 8333081

_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Re: Strange spatial reference system netCDF

Fri, 05/03/2019 - 09:58
Dear Maurizio,

an attempt to do something along these lines is:

f = "tas_rcp85_land-rcm_eur_12km_01_mon_198012-208011.nc"
library(stars)
r = read_ncdf(f, ncsub = cbind(start = c(1, 1, 1, 1), count = c(418,
406, 3, 1)), eps=1e-3)
rx = read_stars(f, proxy = TRUE) # only for the crs!
st_crs(r) = st_crs(rx)
r0 = stars:::st_transform_proj.stars(r, 4326)
png("x.png")
plot(r0[,,,1], border = NA, axes = TRUE, reset = FALSE)
library(rnaturalearth)
plot(ne_coastline(returnclass = "sf"), add = TRUE, col = 'orange')

Note that the output object time stamps respect the 360-day calendar
(using pkg PCICt).
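
If you want to inspect those time stamps, a minimal check (assuming the
temporal dimension is called "time", as in this file, and a reasonably recent
stars) is:

library(stars)
# pull out the values of the time dimension; for this file they come back as
# PCICt date-times on a 360-day calendar rather than ordinary POSIXct
tt <- st_get_dimension_values(r, "time")
class(tt)  # should include "PCICt"
head(tt)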

You'll find some discussion, and the output image, here:
https://github.com/r-spatial/stars/issues/175

Feel free to post follow-up questions here, or to github.

On 5/2/19 6:49 PM, Maurizio Marchi wrote:
> Dear list,
> I'm working with large netCDF files from the UK Climate Projections portal (
> https://www.metoffice.gov.uk/research/collaboration/ukcp). More in detail
> I'm reading with the ncdf4 library this
> <http://data.ceda.ac.uk/badc/ukcp18/data/land-rcm/eur/12km/rcp85/01/tas/mon/latest/>
> file. Once loaded I can easily handle it, but it has a strange reference
> system, as follows:
> rotated_latitude_longitude[]
>             grid_mapping_name: rotated_latitude_longitude
>             longitude_of_prime_meridian: 0
>             earth_radius: *6371229*
>             grid_north_pole_latitude: 39.25
>             grid_north_pole_longitude: 198
>             north_pole_grid_longitude: 0
> If I load the .nc file in QGIS I see that the SR is "+proj=longlat
> +a=6371229 +b=6371229 +no_defs", where the values associated with the a and
> b parameters are the earth radius. Even though QGIS can read it, the raw
> file isn't projected properly (it seems that QGIS is not able to handle
> such a projection).
> Finally, using the ncdf4 + raster libraries, I can easily generate a raster
> image, but then I can't work out how to reproject this raster to the WGS84
> reference system.
> Any help?
> Cheers
>
> --
> Maurizio Marchi,
> PhD Forest Science - Ecological Mathematics
> Skype ID: maurizioxyz
> linux user 552742
> #EUFGIS national Focal point for Italy
> #Annals of Silvicultural Research Associated Editor
> https://scholar.google.it/citations?hl=en&user=_2X6fu8AAAAJ
>
> [[alternative HTML version deleted]]
>
> _______________________________________________
> R-sig-Geo mailing list
> [hidden email]
> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
> --
Edzer Pebesma
Institute for Geoinformatics
Heisenbergstrasse 2, 48151 Muenster, Germany
Phone: +49 251 8333081

_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


Strange spatial reference system netCDF

Thu, 05/02/2019 - 11:49
Dear list,
I'm working with large netCDF files from the UK Climate Projections portal
(https://www.metoffice.gov.uk/research/collaboration/ukcp). More specifically,
I'm reading, with the ncdf4 library, this file:
http://data.ceda.ac.uk/badc/ukcp18/data/land-rcm/eur/12km/rcp85/01/tas/mon/latest/
Once loaded I can easily handle it, but it has a strange reference
system, as follows:
rotated_latitude_longitude[]
            grid_mapping_name: rotated_latitude_longitude
            longitude_of_prime_meridian: 0
            earth_radius: 6371229
            grid_north_pole_latitude: 39.25
            grid_north_pole_longitude: 198
            north_pole_grid_longitude: 0
If I load the .nc file in QGIS I see that the SR is "+proj=longlat
+a=6371229 +b=6371229 +no_defs", where the values associated with the a and
b parameters are the earth radius. Even though QGIS can read it, the raw
file isn't projected properly (it seems that QGIS is not able to handle
such a projection).
Finally, using the ncdf4 + raster libraries, I can easily generate a raster
image, but then I can't work out how to reproject this raster to the WGS84
reference system.
Any help?
Cheers

--
Maurizio Marchi,
PhD Forest Science - Ecological Mathematics
Skype ID: maurizioxyz
linux user 552742
#EUFGIS national Focal point for Italy
#Annals of Silvicultural Research Associated Editor
https://scholar.google.it/citations?hl=en&user=_2X6fu8AAAAJ

        [[alternative HTML version deleted]]

_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

Re: Spatial join/intersection in R

Wed, 05/01/2019 - 00:15
Roozbeh--
What do you want to have happen with those points? Given the units in your figure, you could unambiguously assign them to the upper-right by adding a small value (0.000001) to both the X and Y coordinates of your points.  Whether that is a sensible thing to do depends on what your data are.
How or why do the points fall exactly on your polygon boundaries?  What process puts them exactly on the polygon (grid cell) boundaries?  It appears your polygon boundaries are a grid at multiples of 0.05, although not all potential cells in a rectangle are shown.  Are there many other points not on polygon boundaries not shown in your figure, or are points only along grid lines?  If the latter, does it make sense to assign them to _polygons_ rather than line segments?
If you're doing a sample draw or measuring locations with only a few decimal places of accuracy, and generating your polygon boundaries at exact multiples of 0.05, then it might make sense to shift all of your points by 1/10th of your point-location resolution so that their coordinates never fall on polygon boundaries.  There _might_ be situations where instead of always adding (shifting up and right), you should randomly add or subtract in each direction for each point, but I'm stuck viewing this as points in sample units so I can't think of such a case.
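For what it's worth, a minimal sketch of that shift-then-join idea with sf,
assuming an sf point layer 'pts' and the grid polygons 'polys' (the offset and
the choice to shift up and to the right are illustrative only):

library(sf)
eps <- 0.000001                                            # tiny relative to the 0.05 grid spacing
pts_shift <- pts
st_geometry(pts_shift) <- st_geometry(pts) + c(eps, eps)   # nudge every point up and to the right
st_crs(pts_shift) <- st_crs(pts)                           # the affine shift drops the CRS, so restore it
joined <- st_join(pts_shift, polys)                        # default join is st_intersects; boundary ties are gone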
Tom.  
On Tue, Apr 30, 2019 at 7:18 PM Roozbeh Valavi <[hidden email]> wrote:
Dear members,

I want to do a spatial join/intersection in R. The problem is that some of my points lie exactly on the border of adjacent polygons (see the attached figure). So these points end up falling either in both polygons or in neither! Is there any way to avoid this?
Thanks in advance.
Roozbeh





--
Roozbeh Valavi
PhD Candidate
The Quantitative & Applied Ecology Group
School of BioSciences | Faculty of Science
The University of Melbourne, VIC 3010, Australia
Mobile: +61 423 283 238

_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

Spatial join/intersection in R

Tue, 04/30/2019 - 21:18
Dear members,

I want to do a spatial join/intersection in R. The problem is that some of my points lie exactly on the border of adjacent polygons (see the attached figure). So these points end up falling either in both polygons or in neither! Is there any way to avoid this?
Thanks in advance.
Roozbeh





--
Roozbeh Valavi
PhD Candidate
The Quantitative & Applied Ecology Group
School of BioSciences | Faculty of Science
The University of Melbourne, VIC 3010, Australia
Mobile: +61 423 283 238

_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo

Re: raster - units assignment error in reading NetCDF4

Tue, 04/30/2019 - 07:28
It looks like this is a curvilinear grid; you could read & plot it with

library(stars)
r = read_stars("L8_OLI_2017_03_18_00_09_19_093086_L2R.nc", curvilinear =
c("lon", "lat"))
plot(r[1], border = NA)

Printing r shows the units of each of the variables. The plotting takes
a while because the data grid is not aligned with the pixel grid of the
plotting device.
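
A small follow-up sketch, assuming the variable keeps the name it has in the
file (here 'rhos_443') and that stars carried its units over:

library(stars)
r <- read_stars("L8_OLI_2017_03_18_00_09_19_093086_L2R.nc",
                curvilinear = c("lon", "lat"))
names(r)                 # the variables (attributes) read from the file
r                        # printing lists each attribute together with its units
units(r[["rhos_443"]])   # units of one attribute, if it is stored as a 'units' array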


On 4/30/19 12:43 PM, Alexandre Castagna wrote:
> Hi group,
>
> I'm trying to read a NetCDF4 file, but I get the following error message:
>
> fl <- 'L8_OLI_2017_03_18_00_09_19_093086_L2R.nc'
> r <- raster(fl, varname = 'rhos_443')
>
> Error in (function (cl, name, valueClass)  :
>   assignment of an object of class “numeric” is not valid for @‘unit’ in an
> object of class “.SingleLayerData”; is(value, "character") is not TRUE
>
> These data are OLI Landsat 8 atmospherically 'corrected' imagery produced with
> the ACOLITE software (https://github.com/acolite/acolite).
> In a previous version of the software, exploring the file after nc_open()
> revealed that units was numeric (the 'num' qualifier was shown when printing
> the R object); since then, the developer has committed a change so that all
> units are written as characters. The unit in question is '1' (unitless, for
> bihemispherical reflectance). Exploring the file again after the update, the
> 'num' qualifier no longer appears, but the error stays the same. Maybe R is
> translating '1' to 1 automatically?
>
> I can work around this and create a raster directly from data read with the
> ncdf4 package, like:
>
> ncfl <- nc_open(fl)
> r     <- raster(t(ncvar_get(ncfl, 'rhos_443')))
>
> But for a number of reasons that is less ideal. A temporary link to
> download the example data file is: https://we.tl/t-nENAV7tmBg
>
> Kind regards,
>
> Alexandre Castagna Mourão e Lima
>
> [[alternative HTML version deleted]]
>
> _______________________________________________
> R-sig-Geo mailing list
> [hidden email]
> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
> --
Edzer Pebesma
Institute for Geoinformatics
Heisenbergstrasse 2, 48151 Muenster, Germany
Phone: +49 251 8333081

_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo


raster - units assignment error in reading NetCDF4

Tue, 04/30/2019 - 05:43
Hi group,

I'm trying to read a NetCDF4 file, but I get the following error message:

fl <- 'L8_OLI_2017_03_18_00_09_19_093086_L2R.nc'
r <- raster(fl, varname = 'rhos_443')

Error in (function (cl, name, valueClass)  :
  assignment of an object of class “numeric” is not valid for @‘unit’ in an
object of class “.SingleLayerData”; is(value, "character") is not TRUE

These data are OLI Landsat 8 atmospherically 'corrected' imagery produced with
the ACOLITE software (https://github.com/acolite/acolite).
In a previous version of the software, exploring the file after nc_open()
revealed that units was numeric (the 'num' qualifier was shown when printing
the R object); since then, the developer has committed a change so that all
units are written as characters. The unit in question is '1' (unitless, for
bihemispherical reflectance). Exploring the file again after the update, the
'num' qualifier no longer appears, but the error stays the same. Maybe R is
translating '1' to 1 automatically?

I can work around this and create a raster directly from data read with the
ncdf4 package, like:

ncfl <- nc_open(fl)
r     <- raster(t(ncvar_get(ncfl, 'rhos_443')))

But for a number of reasons that is less ideal. A temporary link to
download the example data file is: https://we.tl/t-nENAV7tmBg
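
For completeness, a slightly fuller sketch of that ncdf4-based workaround that
also carries an (approximate) extent and CRS over to the raster; the 'lon' and
'lat' coordinate variable names are assumptions, and because the grid is
curvilinear the regular extent is only a rough fit:

library(ncdf4)
library(raster)

fl   <- 'L8_OLI_2017_03_18_00_09_19_093086_L2R.nc'
ncfl <- nc_open(fl)
lon  <- ncvar_get(ncfl, 'lon')       # assumed 2-D longitude coordinate variable
lat  <- ncvar_get(ncfl, 'lat')       # assumed 2-D latitude coordinate variable
vals <- ncvar_get(ncfl, 'rhos_443')
nc_close(ncfl)

r <- raster(t(vals),
            xmn = min(lon, na.rm = TRUE), xmx = max(lon, na.rm = TRUE),
            ymn = min(lat, na.rm = TRUE), ymx = max(lat, na.rm = TRUE),
            crs = "+proj=longlat +datum=WGS84")
# depending on the row order in the file you may also need flip(r, direction = 'y')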

Kind regards,

Alexandre Castagna Mourão e Lima

        [[alternative HTML version deleted]]

_______________________________________________
R-sig-Geo mailing list
[hidden email]
https://stat.ethz.ch/mailman/listinfo/r-sig-geo
