
### Re: Question niche-based model with R

One option is Ecological Niche Factor Analysis (ENFA); implementations are available in the 'adehabitatHS' package:

https://cran.r-project.org/package=adehabitatHS

Detailed explanations are available in the vignette (section 3: Design I

studies):

https://cran.r-project.org/web/packages/adehabitatHS/vignettes/adehabitatHS.pdf
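For intuition, the marginality idea at the heart of ENFA can be sketched in base R. This is a toy illustration with simulated data, not the adehabitatHS implementation:

```r
# Toy ENFA-style marginality: how far the presence sites sit from the
# global mean of one environmental variable, in units of the global sd.
set.seed(1)
env  <- rnorm(1000, mean = 10, sd = 2)  # variable over the whole study area
pres <- env[env > 12][1:50]             # hypothetical presence subset
marginality <- (mean(pres) - mean(env)) / sd(env)
marginality  # positive: presences occupy above-average values of env
```

The real analysis works on all the environmental variables at once; the vignette's Design I section walks through a complete worked example on the package's own data.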

Hope this helps,

Mathieu.

On 04/10/2017 05:12 AM, Soufianou Abou via R-sig-Geo wrote:

> Hello dears, I work on the modeling of the ecological niche of striga. I have presence data and bioclimatic and other environmental variables. My question: how should I proceed to model the ecological niche with R? Awaiting your reply, please accept my best regards.
>
> SADDA Abou-Soufianou
> Doctorant, Université Dan Dicko Dankoulodo de Maradi, 120 avenue Maman Koraou, ADS, Maradi, BP 465 Maradi/Niger
> E-mail: [hidden email], tel. (+227) 96269987

> [[alternative HTML version deleted]]

>

> _______________________________________________

> R-sig-Geo mailing list

> [hidden email]

> https://stat.ethz.ch/mailman/listinfo/r-sig-geo

>

--

Mathieu Basille

[hidden email] | http://ase-research.org/basille

+1 954-577-6314 | University of Florida FLREC

« Le tout est de tout dire, et je manque de mots

Et je manque de temps, et je manque d'audace. »

— Paul Éluard

_______________________________________________

R-sig-Geo mailing list

[hidden email]

https://stat.ethz.ch/mailman/listinfo/r-sig-geo

### How to perform integration of .nc file in Raster R Package

Hi,

I have a .nc file with two variables, q and u, and four dimensions: lat, lon, pressure level, and time. It is a sample of specific humidity and wind data from ERA-Interim.

How can I calculate the integral of the product of q and u at each grid cell (i.e. the vapour flux) for all time steps, i.e. how do I solve the following equation in R to get the integral?

I have tried to make stacks of q and u for three levels (qs and us) using the raster package, but got lost at calculating the integral.

The sample data file **qu.nc** (529 KB) is here.

Any help writing a function to compute this integral would be greatly appreciated.

```r
library(raster)

ncfl <- "qu.nc"  # .nc data file
ncf <- stack(ncfl)
print(ncf)

n <- 3  # pressure levels 1000, 950, 900
for (i in 1:n) {
  assign(paste("u", i, sep = "."),
         brick(ncfl, varname = "u", lvar = 3, level = i))
}
for (i in 1:n) {
  assign(paste("q", i, sep = "."),
         brick(ncfl, varname = "q", lvar = 3, level = i))
}

qs <- stack(q.1, q.2, q.3)
us <- stack(u.1, u.2, u.3)
```
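The vertical integral itself is just a quadrature over the pressure levels. A minimal base-R sketch with toy numbers follows; the level values and the 1/g convention for vapour flux are assumptions on my part, since the equation did not come through in the post:

```r
# Vertically integrated vapour flux per cell: (1/g) * integral of q*u dp,
# approximated by the trapezoidal rule over the three pressure levels.
p <- c(1000, 950, 900) * 100           # pressure levels in Pa (assumed)
g <- 9.80665                           # gravitational acceleration, m/s^2
q <- matrix(0.01, nrow = 4, ncol = 3)  # toy specific humidity, 4 cells x 3 levels
u <- matrix(10,   nrow = 4, ncol = 3)  # toy zonal wind
qu <- q * u

# Trapezoidal rule: sum of interval widths times mean of endpoint values.
trap <- function(y, x) sum(diff(x) * (head(y, -1) + tail(y, -1)) / 2)
flux <- apply(qu, 1, function(y) abs(trap(y, p)) / g)
```

With raster objects the same arithmetic applies layer-wise via raster algebra, e.g. `(abs(diff(p))[1] * (qs[[1]] * us[[1]] + qs[[2]] * us[[2]]) / 2 + abs(diff(p))[2] * (qs[[2]] * us[[2]] + qs[[3]] * us[[3]]) / 2) / g`, repeated per time step.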

Thank you,

as

_______________________________________________

R-sig-Geo mailing list

[hidden email]

https://stat.ethz.ch/mailman/listinfo/r-sig-geo

### MODIS package and gdalUtils

I get the following error from running the following code using the MODIS and gdalUtils packages. Thanks.

```
names(s) <- as.Date(modis.hdf[[1]], "A%Y%j")

Error in `names<-`(`*tmp*`, value = c(NA_real_, NA_real_)) :
  incorrect number of layer names
```

Here is the whole code:

```r
library(MODIS)
library(gdalUtils)

MODISoptions(gdalPath = "C:/OSGeo4W64/bin")
#install.packages("MODIS", repos = "http://R-Forge.R-project.org")
setwd("J:/chapter 11")
getwd()
gdal_setInstallation(search_path = "C:/OSGeo4W64/bin", rescan = TRUE,
                     ignore.full_scan = TRUE, verbose = FALSE)
MODISoptions(localArcPath = getwd(), outDirPath = getwd())
MODIS:::checkTools('GDAL')
MODIS:::checkTools('MRT')

viname <- "ndvi"
product <- "MOD13Q1"
ofilename <- paste0(product, "_", viname, "_brick.grd")
ofilename
pth <- paste0(getwd(), "/raster_data/", product)
pth
fileout <- paste(pth, "/", ofilename, sep = "")
fileout

if (!file.exists(fileout)) {
  if (!file.exists(pth)) {
    print("the outfolder does not exist and will be created")
    print(pth)
    dir.create(pth, recursive = TRUE)
  }
}

tileH <- 19
tileV <- 10
begin <- "2015.01.01"
end <- "2015.01.20"
modis.hdf <- getHdf(product = product, begin = begin, end = end,
                    tileH = tileH, tileV = tileV, checkIntegrity = TRUE)
modis.hdf

for (i in 1:length(modis.hdf[[1]])) {
  ifl <- unlist(strsplit(unlist(strsplit(modis.hdf[[1]][i], "[/]"))[6], "[.]"))[1]
  print(ifl)
  fn <- paste("cvi_", ifl, ".tif", sep = "")
  print(fn)
  if (is.na(ifl) | file.exists(fn)) {
    print("file exists or is not available on the server")
  } else {
    sds <- get_subdatasets(modis.hdf[[1]][i])
  }
}

tmp <- rasterTmpFile()
extension(tmp) <- "tif"
gdal_translate(sds[1], dst_dataset = tmp)
ndvi <- raster(tmp) / 10000
writeRaster(ndvi, filename = paste0("ndvi_", ifl, ".tif"),
            dataType = "INT2U", format = "GTiff", overwrite = TRUE)

tmp2 <- rasterTmpFile()
extension(tmp2) <- "tif"
gdal_translate(sds[12], dst_dataset = tmp2)
rel <- crop(x = raster(tmp2), y = raster(tmp2),
            filename = paste("rel_", ifl, ".tif", sep = ""),
            dataType = "INT2U", format = "GTiff")

f_cleanvi <- function(vi, rel) {
  i <- (rel <= 1)
  res <- matrix(NA, length(i), 1)
  if (sum(i, na.rm = TRUE) > 0) {
    i <- which(i)
    res[i] <- vi[i]
  }
  res
}

clvi <- overlay(ndvi, rel, fun = f_cleanvi, filename = fn,
                dataType = "INT2U", overwrite = TRUE)
clvi
rm(ndvi, rel, clvi, tmp, tmp2)

ofl <- list.files(pattern = glob2rx("rel*.tif"))
class(ofl)
ofl
modis.hdf[[1]]
s <- do.call("brick", lapply(ofl, raster))
names(s) <- as.Date(modis.hdf[[1]], "A%Y%j")
```
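Not the full answer, but the error means the value assigned to `names(s)` does not have one element per layer of `s`: `as.Date()` with format `"A%Y%j"` returns `NA` on the full HDF paths, and `ofl` (the rel*.tif files) need not have the same length as `modis.hdf[[1]]`. A hedged sketch of first extracting the acquisition-date token from the file names (the file names below are hypothetical):

```r
# Pull the "AYYYYJJJ" token out of each file name, then parse it as a
# year + day-of-year date; the resulting vector must match nlayers(s).
fn <- c("MOD13Q1.A2015001.h19v10.006.hdf",
        "MOD13Q1.A2015017.h19v10.006.hdf")      # hypothetical names
tok   <- regmatches(fn, regexpr("A[0-9]{7}", fn))
dates <- as.Date(tok, format = "A%Y%j")          # 2015-01-01, 2015-01-17
```

Before assigning, it is worth checking `length(dates) == nlayers(s)`, and building both `s` and the dates from the same file list.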

[[alternative HTML version deleted]]

_______________________________________________

R-sig-Geo mailing list

[hidden email]

https://stat.ethz.ch/mailman/listinfo/r-sig-geo

### Re: raster and oceancolor L2 netcdf data

Just saw this thread and wanted to share some code I've got, in case it's useful, with the disclaimer that I haven't looked carefully at the focal dataset here, though it sounds somewhat similar.

Basically, I'm working with modeled data for NY Harbor, across multiple time points, with multiple depth horizons. It's also not a regular grid, so I need to look at the data as points or polygons (it looks like a spiderweb across the area).

I've got code for pulling the data from a remote NetCDF source, assigning values to x/y points, and plotting it with base R here:

https://github.com/mltConsEcol/R_NYHOPS_NetCDF_Access/blob/master/R/Working_RCode_MLT.R

You can disregard line 91 onward for now, but the rest might be of some use. You can assign the data, with the x/y coords, to a spatial (or simple features) object; I just haven't included that code yet (sorry, I'm in the midst of travel, so I'm appending this right now).

The next step will involve pulling x and y coords for the polygons associated with the points, which are represented in arrays (see lines ~66-71 of the code to get a sense of the structure), and formatting those so they can be converted to spatial polygons/simple features objects. In the meantime, I've just used Voronoi polygons to roughly represent the areas associated with the points for visualization (not included in the code; done quickly in QGIS).

Again, hope this is of some use and not just noise. I'm all ears for better ways to deal with the data too.

Best,

Mike

On Wed, Apr 12, 2017 at 6:02 PM, Warner, David <[hidden email]> wrote:

> Thanks Loïc! I will read this. I am not completely wedded to doing what I

> want in R either. Python would be ok but I have to learn more about

> programming with Python.

>

> Cheers,

> Dave

>

>

>

> On Wed, Apr 12, 2017 at 4:12 PM, Loïc Dutrieux <

> [hidden email]

> > wrote:

>

> > I've done some work recently on ocean color L2 binning/mapping, see the

> > discussion on the ocean color forum https://oceancolor.gsfc.nasa.g

> > ov/forum/oceancolor/topic_show.pl?tid=6550 and the code in the gist

> > (which I'll update). It's not an R solution, but could be useful still.

> >

> > Cheers,

> > Loïc

> >

> > On 12/04/17 12:05, Warner, David wrote:

> >

> >> Mike

> >> I had not really thought about order of operations to be honest. I just

> >> noticed early on when I was attempting to use raster approach that the

> >> data

> >> were not mapped as hoped or orthorectified. I certainly don't need to

> >> remap before calculating chlor-a on a daily basis as all the bands I

> need

> >> for a single day are aligned (if not mapped the way I wish). In the

> end I

> >> do need the data correctly mapped as I want to do matchups with data

> >> collected with an LRAUV.

> >>

> >> I am planning on using locally calibrated coefficients. I will check

> out

> >> your package! I initially wanted to use L3 data but I and a colleague

> >> determined that there was for some reason poor agreement between the L3

> >> data and our in situ matchup data even though at L2 there is good

> >> agreement. This colleague has typically done the heavy lifting using

> >> ENVI,

> >> which I don't have and would rather not learn if what I want to do can

> be

> >> done in R.

> >>

> >> It looks like I can create a raster with vect2rast.SpatialPoints() from

> >> the

> >> plotKML package quite easily but the default settings for cell.size lead

> >> to

> >> loss of data (I think). You can set a cell.size but I am not sure if it

> >> works correctly without having multiple values per cell or not. Or what

> >> it

> >> does if you have multiple values per cell. There is some functionality

> >> that allows you to pick the first, last, the min, the max, or the mean

> if

> >> there are multiple values for the same grid cell but I can't get that

> to

> >> work without Saga GIS.

> >>

> >> Cheers and thanks,

> >> Dave

> >>

> >> On Wed, Apr 12, 2017 at 8:57 AM, Michael Sumner <[hidden email]>

> >> wrote:

> >>

> >> Glad it's some help, but it sounds like you intend to calculate after

> >>> mapping (?) which is definitely not the right way to go. Calculate

> >>> chlorophyll and then map, that's how Seadas does it, even though the

> >>> remapping is the hard part. And apologies if I misread, just checking.

> >>>

> >>> I have two algorithms in my roc package on GitHub in case they help

> >>> understanding how the calcs get done. Presumably you'll have locally

> >>> calibrated parameters for a local algo?

> >>>

> >>> If you want to aggregate into a local map I think it's fair to group-by

> >>> directly on L2 pixels coords and then sum into a geographic map,

> without

> >>> worrying about swath-as-image at all. We've road tested doing this but

> >>> want

> >>> the entire southern ocean eventually so it needs a bit of unrelated

> >>> preparation for the raw files.

> >>>

> >>> I'd be happy to explore an R solution off list if you're interested. L2

> >>> is

> >>> surprisingly easy and efficient in R via GDAL.

> >>>

> >>> (This is also a good example for future workflows for the planned stars

> >>> package imo.)

> >>>

> >>> Cheers, Mike

> >>>

> >>> On Wed, Apr 12, 2017, 22:35 Warner, David <[hidden email]> wrote:

> >>>

> >>> Thanks Mike!

> >>>>

> >>>> The goal is to estimate daily chlorophyll via band ratio polynomial

> >>>> equation for hundreds of days of data (hundreds of data files).

> Sounds

> >>>> like rather than finding a way to orthorectify in R I should learn to

> >>>> batch

> >>>> reproject using SeaDAS, which does produce a product that is in

> geotiff

> >>>> format, is orthorectified, and is readily mappable. I was trying to

> >>>> avoid

> >>>> that as the help and documentation available for doing that seems much

> >>>> less

> >>>> abundant. One file at a time is easy using the SeaDAS gui.

> >>>>

> >>>> Thanks very, very much for the other tricks. Not surprisingly,

> ggplot2

> >>>> comes through again with plots that look right!

> >>>> Cheers,

> >>>> Dave

> >>>>

> >>>>

> >>>>

> >>>> On Wed, Apr 12, 2017 at 7:01 AM, Michael Sumner <[hidden email]>

> >>>> wrote:

> >>>>

> >>>> You can't georeference these data without remapping the data,

> >>>> essentially

> >>>> treating the pixels as points. They have no natural regular grid form,

> >>>> except possibly a unique satellite-perspective one. The data are in 2D

> >>>> array form, but they have explicit "geolocation arrays", i.e. a

> >>>> longitude

> >>>> and latitude for every cell and not based on a regular mapping.

> >>>>

> >>>> R does not have tools for this directly from these data, though it can

> >>>> be

> >>>> treated as a resampling or modelling problem.

> >>>> You can use raster to get at the values of the locations easily

> enough,

> >>>> here's a couple of plotting options in case it's useful:

> >>>>

> >>>> u <- "https://github.com/dmwarn/Tethys/blob/master/

> >>>> A2016244185500.L2_LAC_OC.x.nc?raw=true"

> >>>> f <- basename(u)

> >>>> download.file(u, f, mode = "wb")

> >>>>

> >>>> library(raster)

> >>>> ## use raster to treat as raw point data, on long-lat locations

> >>>> rrs <- raster(f, varname = "geophysical_data/Rrs_412")

> >>>> longitude <- raster(f, varname = "navigation_data/longitude")

> >>>> latitude <- raster(f, varname = "navigation_data/latitude")

> >>>>

> >>>> ## plot in grid space, and add the geolocation space as a graticule

> >>>> plot(rrs)

> >>>> contour(longitude, add = TRUE)

> >>>> contour(latitude, add = TRUE)

> >>>>

> >>>> ## raw scaling against rrs values

> >>>> scl <- function(x) (x - min(x, na.rm = TRUE))/diff(range(x, na.rm =

> >>>> TRUE))

> >>>> plot(values(longitude), values(latitude), pch = ".", col =

> >>>> topo.colors(56)[scl(values(rrs)) * 55 + 1])

> >>>>

> >>>> ## ggplot

> >>>> library(ggplot2)

> >>>> d <- data.frame(x = values(longitude), y = values(latitude), rrs =

> >>>> values(rrs))

> >>>> ggplot(d, aes(x = x, y = y, colour = rrs)) + geom_point(pch = ".")

> >>>>

> >>>> ## might as well discard the missing values (depends on the other vars

> >>>> in

> >>>> the file)

> >>>> d <- d[!is.na(d$rrs), ]

> >>>> ggplot(d, aes(x = x, y = y, colour = rrs)) + geom_point(pch = 19, cex

> =

> >>>> 0.1)

> >>>>

> >>>> There are some MODIS and GDAL based packages that might be of use,

> but I

> >>>> haven't yet seen any R tool that does this remapping task at scale. (I

> >>>> believe the MODIS tools and the best warping tools in GDAL use

> >>>> thin-plate

> >>>> spline techniques).

> >>>>

> >>>> Some applications would use the observations as points (i.e. the

> ocean

> >>>> colour L3 bins as a daily aggregate from L2) and others use a direct

> >>>> remapping of the data as an array, for say high-resolution sea ice

> >>>> imagery.

> >>>>

> >>>> You might not need to do anything particularly complicated though,

> >>>> what's

> >>>> the goal?

> >>>>

> >>>> Cheers, Mike.

> >>>>

> >>>> On Wed, Apr 12, 2017, 20:06 Warner, David <[hidden email]> wrote:

> >>>>

> >>>> Greetings all

> >>>>

> >>>> I am trying to develop R code for processing L2 data (netcdf v4 files)

> >>>> from

> >>>> the Ocean Biology Processing Group.

> >>>>

> >>>> The data file I am working with to develop the code is at

> >>>> https://github.com/dmwarn/Tethys/blob/master/

> >>>> A2016244185500.L2_LAC_OC.x.nc

> >>>> and represents primarily Lake Michigan in the United States. The data

> >>>> were

> >>>> extracted from a global dataset by the oceancolor L1 and L2 data

> server,

> >>>> not by me.

> >>>>

> >>>> I have been using the code below to try to get the data into R and

> ready

> >>>> for processing but am having problems with dimensions and/or

> >>>> orthorectification. The

> >>>>

> >>>> #extent of the scene obtained using nc_open and ncvar_get

> >>>> nc <- nc_open('A2016214184500.L2_LAC_OC.x.nc')

> >>>> lon <- ncvar_get(nc, "navigation_data/longitude")

> >>>> lat <- ncvar_get(nc, "navigation_data/latitude")

> >>>> minx <- min(lon)

> >>>> maxx <- max(lon)

> >>>> miny <- min(lat)

> >>>> maxy <- max(lat)

> >>>>

> >>>> #create extent object

> >>>> myext <- extent(-90.817, -81.92438, 40.46493, 47.14244)

> >>>>

> >>>> #create raster

> >>>> rrs.412 <- raster('A2016214184500.L2_LAC_OC.x.nc', var

> >>>> ="geophysical_data/Rrs_412" ,

> >>>> ext=myext)

> >>>> rrs.412

> >>>>

> >>>>> rrs.412

> >>>>>

> >>>> class : RasterLayer

> >>>> dimensions : 644, 528, 340032 (nrow, ncol, ncell)

> >>>> resolution : 1, 1 (x, y)

> >>>> extent : 0.5, 528.5, 0.5, 644.5 (xmin, xmax, ymin, ymax)

> >>>> coord. ref. : NA

> >>>> data source : /Users/dmwarner/Documents/MODIS/OC/

> >>>> A2016214184500.L2_LAC_OC.x.nc

> >>>> names : Remote.sensing.reflectance.at.412.nm

> >>>> zvar : geophysical_data/Rrs_412

> >>>>

> >>>> In spite of having tried to assign an extent, the raster extent is in

> >>>> rows

> >>>> and columns.

> >>>>

> >>>> Further, plotting the raster reveals that it is flipped on x axis and

> >>>> somewhat rotated relative to what it should look like. Even when

> >>>> flipped,

> >>>> it is still not orthorectified.

> >>>>

> >>>> How do I get the raster to have the correct extent and also get it

> >>>> orthorectified?

> >>>> Thanks,

> >>>> Dave Warner

> >>>>

> >>>> --

> >>>> David Warner

> >>>> Research Fisheries Biologist

> >>>> U.S.G.S. Great Lakes Science Center

> >>>> 1451 Green Road

> >>>> Ann Arbor, MI 48105

> >>>> 734-214-9392 <(734)%20214-9392>

> >>>>

> >>>> [[alternative HTML version deleted]]

> >>>>

> >>>> _______________________________________________

> >>>> R-sig-Geo mailing list

> >>>> [hidden email]

> >>>> https://stat.ethz.ch/mailman/listinfo/r-sig-geo

> >>>>

> >>>> --

> >>>> Dr. Michael Sumner

> >>>> Software and Database Engineer

> >>>> Australian Antarctic Division

> >>>> 203 Channel Highway

> >>>> Kingston Tasmania 7050 Australia

> >>>>

> >>>>

> >>>>

> >>>>

> >>>> --

> >>>> David Warner

> >>>> Research Fisheries Biologist

> >>>> U.S.G.S. Great Lakes Science Center

> >>>> 1451 Green Road

> >>>> Ann Arbor, MI 48105

> >>>> 734-214-9392

> >>>>

> >>>> --

> >>> Dr. Michael Sumner

> >>> Software and Database Engineer

> >>> Australian Antarctic Division

> >>> 203 Channel Highway

> >>> Kingston Tasmania 7050 Australia

> >>>

> >>>

> >>>

> >>

> >>

> > _______________________________________________

> > R-sig-Geo mailing list

> > [hidden email]

> > https://stat.ethz.ch/mailman/listinfo/r-sig-geo

> >

> >

>

>

> --

> David Warner

> Research Fisheries Biologist

> U.S.G.S. Great Lakes Science Center

> 1451 Green Road

> Ann Arbor, MI 48105

> 734-214-9392

>

> [[alternative HTML version deleted]]

>

> _______________________________________________

> R-sig-Geo mailing list

> [hidden email]

> https://stat.ethz.ch/mailman/listinfo/r-sig-geo

>

[[alternative HTML version deleted]]

_______________________________________________

R-sig-Geo mailing list

[hidden email]

https://stat.ethz.ch/mailman/listinfo/r-sig-geo
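The "group by L2 pixel coordinates, then aggregate into a geographic grid" idea Mike describes above can be sketched in base R with toy points; a real workflow would take lon/lat/values from the NetCDF geolocation arrays as in the quoted code:

```r
# Bin irregularly located lon/lat points onto a regular 0.5-degree grid,
# averaging every value that falls in each cell (a crude L2 -> map step).
set.seed(42)
lon <- runif(1000, -88, -85)   # toy swath longitudes
lat <- runif(1000,  42,  45)   # toy swath latitudes
val <- sin(lon) + cos(lat)     # toy geophysical values

res  <- 0.5                              # grid resolution in degrees
cell <- paste(floor(lon / res), floor(lat / res))  # cell id per point
binned <- tapply(val, cell, mean)        # one averaged value per cell
```

The cell ids can then be turned back into cell-centre coordinates for a regular raster; `mean` could be swapped for `sum`, `min`, or `max` depending on the aggregation wanted.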


### Re: raster and oceancolor L2 netcdf data

discussion on the ocean color forum

https://oceancolor.gsfc.nasa.gov/forum/oceancolor/topic_show.pl?tid=6550

and the code in the gist (which I'll update). It's not a R solution, but

could be useful still.

Cheers,

Loïc

On 12/04/17 12:05, Warner, David wrote:

> Mike

> I had not really thought about order of operations to be honest. I just

> noticed early on when I was attempting to use raster approach that the data

> were not mapped as hoped or orthorectified. I certainly don't need to

> remap before calculating chlor-a on a daily basis as all the bands I need

> for a single day are aligned (if not mapped the way I wish). In the end I

> do need the data correctly mapped as I want to do matchups with data

> collected with an LRAUV.

>

> I am planning on using locally calibrated coefficients. I will check out

> your package! I initially wanted to use L3 data but I and a colleague

> determined that there was for some reason poor agreement between the L3

> data and our in situ matchup data even though at L2 there is good

> agreement. This colleague has typically done the heavy lifting using ENVI,

> which I don't have and would rather not learn if what I want to do can be

> done in R.

>

> It looks like I can create a raster with vect2rast.SpatialPoints() from the

> plotKML package quite easily but the default settings for cell.size lead to

> loss of data (I think). You can set a cell.size but I am not sure if it

> works correctly without having multiple values per cell or not. Or what it

> does if you have multiple values per cell. There is some functionality

> that allows you to pick the first, last, the min, the max, or the mean if

> there are multiple values for the same grid cell but I can't get that to

> work without Saga GIS.

>

> Cheers and thanks,

> Dave

>

> On Wed, Apr 12, 2017 at 8:57 AM, Michael Sumner <[hidden email]> wrote:

>

>> Glad it's some help, but it sounds like you intend to calculate after

>> mapping (?) which is definitely not the right way to go. Calculate

>> chlorophyll and then map, that's how Seadas does it, even though the

>> remapping is the hard part. And apologies if I misread, just checking.

>>

>> I have two algorithms in my roc package on GitHub in case they help

>> understanding how the calcs get done. Presumably you'll have locally

>> calibrated parameters for a local algo?

>>

>> If you want to aggregate into a local map I think it's fair to group-by

>> directly on L2 pixel coords and then sum into a geographic map, without

>> worrying about swath-as-image at all. We've road tested doing this but want

>> the entire southern ocean eventually so it needs a bit of unrelated

>> preparation for the raw files.

>>

>> I'd be happy to explore an R solution off list if you're interested. L2 is

>> surprisingly easy and efficient in R via GDAL.

>>

>> (This is also a good example for future workflows for the planned stars

>> package imo.)

>>

>> Cheers, Mike

>>

>> On Wed, Apr 12, 2017, 22:35 Warner, David <[hidden email]> wrote:

>>

>>> Thanks Mike!

>>>

>>> The goal is to estimate daily chlorophyll via band ratio polynomial

>>> equation for hundreds of days of data (hundreds of data files). Sounds

>>> like rather than finding a way to orthorectify in R I should learn to batch

>>> reproject using SeaDAS, which does produce a product that is in GeoTIFF

>>> format, is orthorectified, and is readily mappable. I was trying to avoid

>>> that because the help and documentation for batch processing seem much less

>>> abundant. One file at a time is easy using the SeaDAS GUI.

>>>

>>> Thanks very, very much for the other tricks. Not surprisingly, ggplot2

>>> comes through again with plots that look right!

>>> Cheers,

>>> Dave

>>>

>>>

>>>

>>> On Wed, Apr 12, 2017 at 7:01 AM, Michael Sumner <[hidden email]>

>>> wrote:

>>>

>>> You can't georeference these data without remapping the data, essentially

>>> treating the pixels as points. They have no natural regular grid form,

>>> except possibly a unique satellite-perspective one. The data are in 2D

>>> array form, but they have explicit "geolocation arrays", i.e. a longitude

>>> and latitude for every cell and not based on a regular mapping.

>>>

>>> R does not have tools for this directly from these data, though it can be

>>> treated as a resampling or modelling problem.

>>> You can use raster to get at the values of the locations easily enough,

>>> here are a couple of plotting options in case they're useful:

>>>

>>> u <- "https://github.com/dmwarn/Tethys/blob/master/A2016244185500.L2_LAC_OC.x.nc?raw=true"

>>> f <- basename(u)

>>> download.file(u, f, mode = "wb")

>>>

>>> library(raster)

>>> ## use raster to treat as raw point data, on long-lat locations

>>> rrs <- raster(f, varname = "geophysical_data/Rrs_412")

>>> longitude <- raster(f, varname = "navigation_data/longitude")

>>> latitude <- raster(f, varname = "navigation_data/latitude")

>>>

>>> ## plot in grid space, and add the geolocation space as a graticule

>>> plot(rrs)

>>> contour(longitude, add = TRUE)

>>> contour(latitude, add = TRUE)

>>>

>>> ## raw scaling against rrs values

>>> scl <- function(x) (x - min(x, na.rm = TRUE))/diff(range(x, na.rm = TRUE))

>>> plot(values(longitude), values(latitude), pch = ".", col =

>>> topo.colors(56)[scl(values(rrs)) * 55 + 1])

>>>

>>> ## ggplot

>>> library(ggplot2)

>>> d <- data.frame(x = values(longitude), y = values(latitude), rrs =

>>> values(rrs))

>>> ggplot(d, aes(x = x, y = y, colour = rrs)) + geom_point(pch = ".")

>>>

>>> ## might as well discard the missing values (depends on the other vars in the file)

>>> d <- d[!is.na(d$rrs), ]

>>> ggplot(d, aes(x = x, y = y, colour = rrs)) + geom_point(pch = 19, cex =

>>> 0.1)

>>>

>>> There are some MODIS and GDAL based packages that might be of use, but I

>>> haven't yet seen any R tool that does this remapping task at scale. (I

>>> believe the MODIS tools and the best warping tools in GDAL use thin-plate

>>> spline techniques).

>>>

>>> Some applications would use the observations as points (i.e. the ocean

>>> colour L3 bins as a daily aggregate from L2) and others use a direct

>>> remapping of the data as an array, for say high-resolution sea ice imagery.

>>>

>>> You might not need to do anything particularly complicated though, what's

>>> the goal?

>>>

>>> Cheers, Mike.

>>>

>>> On Wed, Apr 12, 2017, 20:06 Warner, David <[hidden email]> wrote:

>>>

>>> Greetings all

>>>

>>> I am trying to develop R code for processing L2 data (netcdf v4 files)

>>> from

>>> the Ocean Biology Processing Group.

>>>

>>> The data file I am working with to develop the code is at

>>> https://github.com/dmwarn/Tethys/blob/master/A2016244185500.L2_LAC_OC.x.nc

>>> and represents primarily Lake Michigan in the United States. The data

>>> were

>>> extracted from a global dataset by the oceancolor L1 and L2 data server,

>>> not by me.

>>>

>>> I have been using the code below to try to get the data into R and ready

>>> for processing but am having problems with dimensions and/or

>>> orthorectification. The

>>>

>>> #extent of the scene obtained using nc_open and ncvar_get

>>> nc <- nc_open('A2016214184500.L2_LAC_OC.x.nc')

>>> lon <- ncvar_get(nc, "navigation_data/longitude")

>>> lat <- ncvar_get(nc, "navigation_data/latitude")

>>> minx <- min(lon)

>>> maxx <- max(lon)

>>> miny <- min(lat)

>>> maxy <- max(lat)

>>>

>>> #create extent object

>>> myext <- extent(-90.817, -81.92438, 40.46493, 47.14244)

>>>

>>> #create raster

>>> rrs.412 <- raster('A2016214184500.L2_LAC_OC.x.nc', var

>>> ="geophysical_data/Rrs_412" ,

>>> ext=myext)

>>> rrs.412

>>>> rrs.412

>>> class : RasterLayer

>>> dimensions : 644, 528, 340032 (nrow, ncol, ncell)

>>> resolution : 1, 1 (x, y)

>>> extent : 0.5, 528.5, 0.5, 644.5 (xmin, xmax, ymin, ymax)

>>> coord. ref. : NA

>>> data source : /Users/dmwarner/Documents/MODIS/OC/

>>> A2016214184500.L2_LAC_OC.x.nc

>>> names : Remote.sensing.reflectance.at.412.nm

>>> zvar : geophysical_data/Rrs_412

>>>

>>> In spite of having tried to assign an extent, the raster extent is in rows

>>> and columns.

>>>

>>> Further, plotting the raster reveals that it is flipped on x axis and

>>> somewhat rotated relative to what it should look like. Even when flipped,

>>> it is still not orthorectified.

>>>

>>> How do I get the raster to have the correct extent and also get it

>>> orthorectified?

>>> Thanks,

>>> Dave Warner

>>>

>>> --

>>> David Warner

>>> Research Fisheries Biologist

>>> U.S.G.S. Great Lakes Science Center

>>> 1451 Green Road

>>> Ann Arbor, MI 48105

>>> 734-214-9392 <(734)%20214-9392>

>>>

>>> [[alternative HTML version deleted]]

>>>

>>> _______________________________________________

>>> R-sig-Geo mailing list

>>> [hidden email]

>>> https://stat.ethz.ch/mailman/listinfo/r-sig-geo

>>>

>>> --

>>> Dr. Michael Sumner

>>> Software and Database Engineer

>>> Australian Antarctic Division

>>> 203 Channel Highway

>>> Kingston Tasmania 7050 Australia

>>>

>>>

>>>

>>>

>>> --

>>> David Warner

>>> Research Fisheries Biologist

>>> U.S.G.S. Great Lakes Science Center

>>> 1451 Green Road

>>> Ann Arbor, MI 48105

>>> 734-214-9392

>>>

>> --

>> Dr. Michael Sumner

>> Software and Database Engineer

>> Australian Antarctic Division

>> 203 Channel Highway

>> Kingston Tasmania 7050 Australia

>>

>>

>

>

_______________________________________________

R-sig-Geo mailing list

[hidden email]

https://stat.ethz.ch/mailman/listinfo/r-sig-geo

### Re: dnearneigh() from spdep: Points with the exact same location are not considered neighbours.

Your patched version now works as expected!

And thank you for your suggestions, I am currently looking into all of that.

Best

Maël

On 12/04/2017 09:00, Roger Bivand wrote:

> On Wed, 12 Apr 2017, Maël Le Noc wrote:

>

>> Dear Roger,

>> Thank you for your answer, (And sorry for the HTML posting).

>>

>> The issue persists if I specify "GE" for the lower bound, but only when

>> the parameter longlat is set to TRUE (see example below).

>

> Thanks, very useful. The Great Circle distance measure returned

> NotANumber for zero distance, because of an unprotected division by

> zero. I've committed a patched source version to R-Forge. Look on

>

> https://r-forge.r-project.org/R/?group_id=182

>

> later today for a version with today's date and Rev: 693 - should show

> up mid to late evening CEST. Please say whether this performs as

> expected.

>

>>

>>

>> Regarding the nature of my data, it is a series of records of Jews

>> arrested during the Holocaust in Italy. Those are point data, and some

>> people have been arrested at the same place and at the same time (hence

>> my problem). I am trying to assess spatial autocorrelation for a binary

>> attribute (whether they survived the Holocaust or not), and I plan to

>> use a Join-count method, for which I need a spatial weight matrix. Is

>> using Join-count on such a dataset wrong?

>

> Join-count should be OK, but if you have covariates you could try to

> remove the mean model first and only then see whether there is a

> spatially structured random effect, for example with hglm, R2BayesX,

> INLA, or similar. For hglm see for example:

>

> https://journal.r-project.org/archive/2015/RJ-2015-017/index.html

>

> The data you most likely do not have (addresses with residents at risk

> of arrest but not arrested) would also help, giving you a risk of

> arrest measure by address. There is also a spatial probit literature

> that might be relevant; if you have timestamps, you will likely find

> that operational factors play in, with arrests in a small area at the

> same time.

>

> Hope this helps,

>

> Roger

>

>>

>> Best

>>

>>

>>

>> Code:

>>

>> library(data.table)

>> library(spdep)

>> pointstable <- data.table(XCoord=c(13.667029,13.667029,13.667028),

>> YCoord=c(42.772396,42.772396,42.772396))

>> print(pointstable)

>> coords <-cbind(pointstable$XCoord, pointstable$YCoord)

>> nbLocal<- dnearneigh(coords, d1=0, d2=25, longlat = TRUE, bound =

>> c("GE", "LE"))

>> summary(nbLocal)

>> nbLocal<- dnearneigh(coords, d1=0, d2=25, longlat = FALSE, bound =

>> c("GE", "LE"))

>> summary(nbLocal)

>>

>>

>> Output:

>>> print(pointstable)

>> XCoord YCoord

>> 1: 13.66703 42.7724

>> 2: 13.66703 42.7724

>> 3: 13.66703 42.7724

>>

>>> nbLocal<- dnearneigh(coords, d1=0, d2=25, longlat = TRUE, bound =

>> c("GE", "LE"))

>>> summary(nbLocal)

>> Neighbour list object:

>> Number of regions: 3

>> Number of nonzero links: 4

>> Percentage nonzero weights: 44.44444

>> Average number of links: 1.333333

>> Link number distribution:

>>

>> 1 2

>> 2 1

>> 2 least connected regions:

>> 1 2 with 1 link

>> 1 most connected region:

>> 3 with 2 links

>>

>>> nbLocal<- dnearneigh(coords, d1=0, d2=25, longlat = FALSE, bound =

>> c("GE", "LE"))

>>> summary(nbLocal)

>> Neighbour list object:

>> Number of regions: 3

>> Number of nonzero links: 6

>> Percentage nonzero weights: 66.66667

>> Average number of links: 2

>> Link number distribution:

>>

>> 2

>> 3

>> 3 least connected regions:

>> 1 2 3 with 2 links

>> 3 most connected regions:

>> 1 2 3 with 2 links

>>

>>

>>

>> On 12/04/2017 02:27, Roger Bivand wrote:

>>> Do not post HTML-mail, only plain text. Your example is not

>>> reproducible

>>> because you used HTML-mail.

>>>

>>> Please read the help file, the bounds are described as being between

>>> lower (greater than) and upper (less than or equal to) bounds. Since

>>> the

>>> distance between identical points is strictly zero, they are not

>>> neighbours because the distance must be > d1 and <= d2. If d1 is <

>>> 0, it

>>> is reset to 0, as it is assumed that a negative lower bound is a user

>>> error (and it would break the underlying compiled code).

>>>

>>> In any case, no reasonable cross-sectional spatial process has

>>> duplicated point (nugget) observations in situations in which spatial

>>> weights would be used (spatio-temporal panels will have, but then time

>>> differs).

>>>

>>> Hope this clarifies,

>>>

>>> Roger

>>>

>>> On Wed, 12 Apr 2017, Maël Le Noc via R-sig-Geo wrote:

>>>

>>>> Dear List

>>>>

>>>> As I was working on a project, I realized that when I use dnearneigh

>>>> from spdep, two (or more) points that have the exact same coordinates

>>>> are not considered neighbours and thus are not linked (even when the

>>>> lower bound is put to 0 or even to -1). See below for an example.

>>>> (However this does not happen if the parameter longlat is set to

>>>> false)

>>>>

>>>> Does the function behave the same way for you? Am I missing something?

>>>> Is this an expected behavior? And if so, if there a way to change

>>>> that ?

>>>>

>>>> In the example below, points 1 and 2 are not connected to each

>>>> other/are

>>>> not neighbours (as you can see since the both have only one link,

>>>> to 3),

>>>> even though they have the exact same coordinates (and are thus less

>>>> than

>>>> 25km apart), while point 3 is connected to both point 1 and 2.

>>>> If I want to assess autocorrelation using, for instance

>>>> joincount.test,

>>>> this is then an issue...

>>>>

>>>>> library(data.table)

>>>>> library(spdep)

>>>>> pointstable <- data.table(XCoord=c(13.667029,13.667029,13.667028),

>>>>> YCoord=c(42.772396,42.772396,42.772396))

>>>>> print(pointstable)

>>>> XCoord YCoord

>>>> 1: 13.667029 42.772396

>>>> 2: 13.667029 42.772396

>>>> 3: 13.667028 42.772396

>>>>> coords <- cbind(pointstable$XCoord, pointstable$YCoord)

>>>>> nbLocal <- dnearneigh(coords, d1=0, d2=25, longlat = TRUE)

>>>>> nbLocal <- dnearneigh(coords, d1=-1, d2=25, longlat = TRUE) # both lines produce the same output

>>>>> summary(nbLocal)

>>>> Neighbour list object:

>>>> Number of regions: 3

>>>> Number of nonzero links: 4

>>>> Percentage nonzero weights: 44.44444

>>>> Average number of links: 1.333333

>>>> Link number distribution:

>>>>

>>>> 1 2

>>>> 2 1

>>>> 2 least connected regions:

>>>> 1 2 with 1 link

>>>> 1 most connected region:

>>>> 3 with 2 links


>>>> Thanks

>>>> Maël

>>>>

>>>>

>>>> [[alternative HTML version deleted]]

>>>>

>>>> _______________________________________________

>>>> R-sig-Geo mailing list

>>>> [hidden email]

>>>> https://stat.ethz.ch/mailman/listinfo/r-sig-geo

>>>

>>

>>

>


### Re: raster and oceancolor L2 netcdf data

I had not really thought about order of operations to be honest. I just

noticed early on when I was attempting to use raster approach that the data

were not mapped as hoped or orthorectified. I certainly don't need to

remap before calculating chlor-a on a daily basis as all the bands I need

for a single day are aligned (if not mapped the way I wish). In the end I

do need the data correctly mapped as I want to do matchups with data

collected with an LRAUV.

I am planning on using locally calibrated coefficients. I will check out

your package! I initially wanted to use L3 data but I and a colleague

determined that there was for some reason poor agreement between the L3

data and our in situ matchup data even though at L2 there is good

agreement. This colleague has typically done the heavy lifting using ENVI,

which I don't have and would rather not learn if what I want to do can be

done in R.
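For readers wondering what the band-ratio calculation involves: the OCx family of algorithms evaluates a polynomial in the log10 of a blue-to-green reflectance ratio. A minimal base-R sketch follows; the coefficient vector here is illustrative only and would be replaced by the locally calibrated values discussed above.

```r
# OCx-style band-ratio chlorophyll-a. The default coefficients are
# PLACEHOLDERS for illustration; substitute locally calibrated values.
oc_chl <- function(rrs_blue, rrs_green,
                   a = c(0.2424, -2.7423, 1.8017, 0.0015, -1.2280)) {
  x <- log10(rrs_blue / rrs_green)                   # log band ratio
  10^(a[1] + a[2] * x + a[3] * x^2 + a[4] * x^3 + a[5] * x^4)
}

# Equal blue and green reflectances give x = 0, so chl = 10^a[1]
oc_chl(0.005, 0.005)
```

This is applied per pixel on the unmapped L2 arrays, which is why the calculate-then-remap order matters: the bands for one day are already aligned with each other.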

It looks like I can create a raster with vect2rast.SpatialPoints() from the

plotKML package quite easily but the default settings for cell.size lead to

loss of data (I think). You can set a cell.size but I am not sure if it

works correctly without having multiple values per cell or not. Or what it

does if you have multiple values per cell. There is some functionality

that allows you to pick the first, last, the min, the max, or the mean if

there are multiple values for the same grid cell but I can't get that to

work without Saga GIS.
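The pick-first/min/max/mean behaviour can also be approximated in base R without SAGA, by assigning each point a grid-cell key and aggregating with tapply. This is a rough sketch under invented data and an invented cell size, not a replacement for vect2rast.SpatialPoints():

```r
# Bin scattered points onto a regular grid, taking the mean of all
# values falling in the same cell (swap mean for min, max, etc.).
bin_points <- function(x, y, z, cell = 0.1) {
  ix <- floor(x / cell)                # cell index in x
  iy <- floor(y / cell)                # cell index in y
  key <- paste(ix, iy, sep = ":")      # one id per grid cell
  tapply(z, key, mean, na.rm = TRUE)   # aggregate per occupied cell
}

pts <- data.frame(x = c(0.01, 0.02, 0.25), y = c(0.01, 0.03, 0.31),
                  z = c(1, 3, 5))
bin_points(pts$x, pts$y, pts$z)        # points 1 and 2 share cell "0:0"
```

The per-cell values can then be written into a raster whose extent and resolution match the chosen cell size.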

Cheers and thanks,

Dave

On Wed, Apr 12, 2017 at 8:57 AM, Michael Sumner <[hidden email]> wrote:

> Glad it's some help, but it sounds like you intend to calculate after

> mapping (?) which is definitely not the right way to go. Calculate

> chlorophyll and then map, that's how Seadas does it, even though the

> remapping is the hard part. And apologies if I misread, just checking.

>

> I have two algorithms in my roc package on GitHub in case they help

> understanding how the calcs get done. Presumably you'll have locally

> calibrated parameters for a local algo?

>

> If you want to aggregate into a local map I think it's fair to group-by

> directly on L2 pixel coords and then sum into a geographic map, without

> worrying about swath-as-image at all. We've road tested doing this but want

> the entire southern ocean eventually so it needs a bit of unrelated

> preparation for the raw files.

>

> I'd be happy to explore an R solution off list if you're interested. L2 is

> surprisingly easy and efficient in R via GDAL.

>

> (This is also a good example for future workflows for the planned stars

> package imo.)

>

> Cheers, Mike

>

> On Wed, Apr 12, 2017, 22:35 Warner, David <[hidden email]> wrote:

>

>> Thanks Mike!

>>

>> The goal is to estimate daily chlorophyll via band ratio polynomial

>> equation for hundreds of days of data (hundreds of data files). Sounds

>> like rather than finding a way to orthorectify in R I should learn to batch

>> reproject using SeaDAS, which does produce a product that is in GeoTIFF

>> format, is orthorectified, and is readily mappable. I was trying to avoid

>> that because the help and documentation for batch processing seem much less

>> abundant. One file at a time is easy using the SeaDAS GUI.

>>

>> Thanks very, very much for the other tricks. Not surprisingly, ggplot2

>> comes through again with plots that look right!

>> Cheers,

>> Dave

>>

>>

>>

>> On Wed, Apr 12, 2017 at 7:01 AM, Michael Sumner <[hidden email]>

>> wrote:

>>

>> You can't georeference these data without remapping the data, essentially

>> treating the pixels as points. They have no natural regular grid form,

>> except possibly a unique satellite-perspective one. The data are in 2D

>> array form, but they have explicit "geolocation arrays", i.e. a longitude

>> and latitude for every cell and not based on a regular mapping.

>>

>> R does not have tools for this directly from these data, though it can be

>> treated as a resampling or modelling problem.

>> You can use raster to get at the values of the locations easily enough,

>> here are a couple of plotting options in case they're useful:

>>

>> u <- "https://github.com/dmwarn/Tethys/blob/master/A2016244185500.L2_LAC_OC.x.nc?raw=true"

>> f <- basename(u)

>> download.file(u, f, mode = "wb")

>>

>> library(raster)

>> ## use raster to treat as raw point data, on long-lat locations

>> rrs <- raster(f, varname = "geophysical_data/Rrs_412")

>> longitude <- raster(f, varname = "navigation_data/longitude")

>> latitude <- raster(f, varname = "navigation_data/latitude")

>>

>> ## plot in grid space, and add the geolocation space as a graticule

>> plot(rrs)

>> contour(longitude, add = TRUE)

>> contour(latitude, add = TRUE)

>>

>> ## raw scaling against rrs values

>> scl <- function(x) (x - min(x, na.rm = TRUE))/diff(range(x, na.rm = TRUE))

>> plot(values(longitude), values(latitude), pch = ".", col =

>> topo.colors(56)[scl(values(rrs)) * 55 + 1])

>>

>> ## ggplot

>> library(ggplot2)

>> d <- data.frame(x = values(longitude), y = values(latitude), rrs =

>> values(rrs))

>> ggplot(d, aes(x = x, y = y, colour = rrs)) + geom_point(pch = ".")

>>

>> ## might as well discard the missing values (depends on the other vars in the file)

>> d <- d[!is.na(d$rrs), ]

>> ggplot(d, aes(x = x, y = y, colour = rrs)) + geom_point(pch = 19, cex =

>> 0.1)

>>

>> There are some MODIS and GDAL based packages that might be of use, but I

>> haven't yet seen any R tool that does this remapping task at scale. (I

>> believe the MODIS tools and the best warping tools in GDAL use thin-plate

>> spline techniques).

>>

>> Some applications would use the observations as points (i.e. the ocean

>> colour L3 bins as a daily aggregate from L2) and others use a direct

>> remapping of the data as an array, for say high-resolution sea ice imagery.

>>

>> You might not need to do anything particularly complicated though, what's

>> the goal?

>>

>> Cheers, Mike.

>>

>> On Wed, Apr 12, 2017, 20:06 Warner, David <[hidden email]> wrote:

>>

>> Greetings all

>>

>> I am trying to develop R code for processing L2 data (netcdf v4 files)

>> from

>> the Ocean Biology Processing Group.

>>

>> The data file I am working with to develop the code is at

>> https://github.com/dmwarn/Tethys/blob/master/A2016244185500.L2_LAC_OC.x.nc

>> and represents primarily Lake Michigan in the United States. The data

>> were

>> extracted from a global dataset by the oceancolor L1 and L2 data server,

>> not by me.

>>

>> I have been using the code below to try to get the data into R and ready

>> for processing but am having problems with dimensions and/or

>> orthorectification. The

>>

>> #extent of the scene obtained using nc_open and ncvar_get

>> nc <- nc_open('A2016214184500.L2_LAC_OC.x.nc')

>> lon <- ncvar_get(nc, "navigation_data/longitude")

>> lat <- ncvar_get(nc, "navigation_data/latitude")

>> minx <- min(lon)

>> maxx <- max(lon)

>> miny <- min(lat)

>> maxy <- max(lat)

>>

>> #create extent object

>> myext <- extent(-90.817, -81.92438, 40.46493, 47.14244)

>>

>> #create raster

>> rrs.412 <- raster('A2016214184500.L2_LAC_OC.x.nc', var

>> ="geophysical_data/Rrs_412" ,

>> ext=myext)

>> rrs.412

>> > rrs.412

>> class : RasterLayer

>> dimensions : 644, 528, 340032 (nrow, ncol, ncell)

>> resolution : 1, 1 (x, y)

>> extent : 0.5, 528.5, 0.5, 644.5 (xmin, xmax, ymin, ymax)

>> coord. ref. : NA

>> data source : /Users/dmwarner/Documents/MODIS/OC/

>> A2016214184500.L2_LAC_OC.x.nc

>> names : Remote.sensing.reflectance.at.412.nm

>> zvar : geophysical_data/Rrs_412

>>

>> In spite of having tried to assign an extent, the raster extent is in rows

>> and columns.

>>

>> Further, plotting the raster reveals that it is flipped on x axis and

>> somewhat rotated relative to what it should look like. Even when flipped,

>> it is still not orthorectified.

>>

>> How do I get the raster to have the correct extent and also get it

>> orthorectified?

>> Thanks,

>> Dave Warner

>>

>> --

>> David Warner

>> Research Fisheries Biologist

>> U.S.G.S. Great Lakes Science Center

>> 1451 Green Road

>> Ann Arbor, MI 48105

>> 734-214-9392 <(734)%20214-9392>

>>

>> [[alternative HTML version deleted]]

>>

>> _______________________________________________

>> R-sig-Geo mailing list

>> [hidden email]

>> https://stat.ethz.ch/mailman/listinfo/r-sig-geo

>>

>> --

>> Dr. Michael Sumner

>> Software and Database Engineer

>> Australian Antarctic Division

>> 203 Channel Highway

>> Kingston Tasmania 7050 Australia

>>

>>

>>

>>

>> --

>> David Warner

>> Research Fisheries Biologist

>> U.S.G.S. Great Lakes Science Center

>> 1451 Green Road

>> Ann Arbor, MI 48105

>> 734-214-9392

>>

> --

> Dr. Michael Sumner

> Software and Database Engineer

> Australian Antarctic Division

> 203 Channel Highway

> Kingston Tasmania 7050 Australia

>

>

--

David Warner

Research Fisheries Biologist

U.S.G.S. Great Lakes Science Center

1451 Green Road

Ann Arbor, MI 48105

734-214-9392

[[alternative HTML version deleted]]


### Re: dnearneigh() from spdep: Points with the exact same location are not considered neighbours.

> Dear Roger,

> Thank you for your answer, (And sorry for the HTML posting).

>

> The issue persists if I specify "GE" for the lower bound, but only when

> the parameter longlat is set to TRUE (see example below).

Thanks, very useful. The Great Circle distance measure returned NotANumber

for zero distance, because of an unprotected division by zero. I've

committed a patched source version to R-Forge. Look on

https://r-forge.r-project.org/R/?group_id=182

later today for a version with today's date and Rev: 693 - should show up

mid to late evening CEST. Please say whether this performs as expected.
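To see why zero distance can misbehave, here is a base-R illustration (not spdep's actual C code): the spherical law of cosines hands acos() a value that floating-point rounding can push just above 1, and an unguarded implementation then returns NaN, while clamping the argument avoids it.

```r
# Spherical law of cosines, with the acos() argument clamped into
# [-1, 1]; without the clamp, rounding can hand acos() a value > 1,
# which is NaN (e.g. acos(1 + 1e-10) is NaN in R).
gc_dist <- function(lon1, lat1, lon2, lat2, R = 6378.137) {
  p <- pi / 180                                  # degrees -> radians
  cosd <- sin(lat1 * p) * sin(lat2 * p) +
          cos(lat1 * p) * cos(lat2 * p) * cos((lon2 - lon1) * p)
  R * acos(pmin(pmax(cosd, -1), 1))              # distance in km
}

# Coincident points from the example: distance is (near) zero, not NaN
gc_dist(13.667029, 42.772396, 13.667029, 42.772396)
```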

>

>

> Regarding the nature of my data, it is a series of records of Jews

> arrested during the Holocaust in Italy. Those are point data, and some

> people have been arrested at the same place and at the same time (hence

> my problem). I am trying to assess spatial autocorrelation for a binary

> attribute (whether they survived the Holocaust or not), and I plan to

> use a Join-count method, for which I need a spatial weight matrix. Is

> using Join-count on such a dataset wrong?

Join-count should be OK, but if you have covariates you could try to

remove the mean model first and only then see whether there is a spatially

structured random effect, for example with hglm, R2BayesX, INLA, or

similar. For hglm see for example:

https://journal.r-project.org/archive/2015/RJ-2015-017/index.html
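For intuition about what the join-count approach measures: with a binary outcome x and a symmetric binary weights matrix W, the like-like (1-1) join count is sum_ij w_ij * x_i * x_j / 2. A toy base-R sketch with invented data (spdep's joincount.test then compares such counts against their null distribution):

```r
# Toy binary outcome on 5 points with a simple chain neighbour structure
x <- c(1, 1, 0, 1, 0)
W <- matrix(0, 5, 5)
W[cbind(1:4, 2:5)] <- 1          # point i neighbours point i + 1
W <- W + t(W)                    # make the weights symmetric

BB <- sum(W * outer(x, x)) / 2   # 1-1 join count: only the (1,2) pair
BB
```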

The data you most likely do not have (addresses with residents at

risk of arrest but not arrested) would also help, giving you a risk of

arrest measure by address. There is also a spatial probit literature that

might be relevant; if you have timestamps, you will likely find that

operational factors play in, with arrests in a small area at the same

time.

Hope this helps,

Roger

>

> Best

>

>

>

> Code:

>

> library(data.table)

> library(spdep)

> pointstable <- data.table(XCoord=c(13.667029,13.667029,13.667028),

> YCoord=c(42.772396,42.772396,42.772396))

> print(pointstable)

> coords <-cbind(pointstable$XCoord, pointstable$YCoord)

> nbLocal<- dnearneigh(coords, d1=0, d2=25, longlat = TRUE, bound =

> c("GE", "LE"))

> summary(nbLocal)

> nbLocal<- dnearneigh(coords, d1=0, d2=25, longlat = FALSE, bound =

> c("GE", "LE"))

> summary(nbLocal)

>

>

> Output:

>> print(pointstable)

> XCoord YCoord

> 1: 13.66703 42.7724

> 2: 13.66703 42.7724

> 3: 13.66703 42.7724

>

>> nbLocal<- dnearneigh(coords, d1=0, d2=25, longlat = TRUE, bound =

> c("GE", "LE"))

>> summary(nbLocal)

> Neighbour list object:

> Number of regions: 3

> Number of nonzero links: 4

> Percentage nonzero weights: 44.44444

> Average number of links: 1.333333

> Link number distribution:

>

> 1 2

> 2 1

> 2 least connected regions:

> 1 2 with 1 link

> 1 most connected region:

> 3 with 2 links

>

>> nbLocal<- dnearneigh(coords, d1=0, d2=25, longlat = FALSE, bound =

> c("GE", "LE"))

>> summary(nbLocal)

> Neighbour list object:

> Number of regions: 3

> Number of nonzero links: 6

> Percentage nonzero weights: 66.66667

> Average number of links: 2

> Link number distribution:

>

> 2

> 3

> 3 least connected regions:

> 1 2 3 with 2 links

> 3 most connected regions:

> 1 2 3 with 2 links

>

>

>

> On 12/04/2017 02:27, Roger Bivand wrote:

>> Do not post HTML-mail, only plain text. Your example is not reproducible

>> because you used HTML-mail.

>>

>> Please read the help file, the bounds are described as being between

>> lower (greater than) and upper (less than or equal to) bounds. Since the

>> distance between identical points is strictly zero, they are not

>> neighbours because the distance must be > d1 and <= d2. If d1 is < 0, it

>> is reset to 0, as it is assumed that a negative lower bound is a user

>> error (and it would break the underlying compiled code).

>>

>> In any case, no reasonable cross-sectional spatial process has

>> duplicated point (nugget) observations in situations in which spatial

>> weights would be used (spatio-temporal panels will have, but then time

>> differs).

>>

>> Hope this clarifies,

>>

>> Roger

>>

>> On Wed, 12 Apr 2017, Maël Le Noc via R-sig-Geo wrote:

>>

>>> Dear List

>>>

>>> As I was working on a project, I realized that when I use dnearneigh

>>> from spdep, two (or more) points that have the exact same coordinates

>>> are not considered neighbours and thus are not linked (even when the

>>> lower bound is put to 0 or even to -1). See below for an example.

>>> (However this does not happen if the parameter longlat is set to false)

>>>

>>> Does the function behave the same way for you? Am I missing something?

>>> Is this an expected behavior? And if so, if there a way to change that ?

>>>

>>> In the example below, points 1 and 2 are not connected to each other/are

>>> not neighbours (as you can see since the both have only one link, to 3),

>>> even though they have the exact same coordinates (and are thus less than

>>> 25km apart), while point 3 is connected to both point 1 and 2.

>>> If I want to assess autocorrelation using, for instance joincount.test,

>>> this is then an issue...

>>>

>>>> library(data.table)

>>>> library(spdep)

>>>> pointstable <- data.table(XCoord=c(13.667029,13.667029,13.667028),

>>>> YCoord=c(42.772396,42.772396,42.772396))

>>>> print(pointstable)

>>> XCoord YCoord

>>> 1: 13.667029 42.772396

>>> 2: 13.667029 42.772396

>>> 3: 13.667028 42.772396

>>>> coords <- cbind(pointstable$XCoord, pointstable$YCoord)

>>>> nbLocal <- dnearneigh(coords, d1=0, d2=25, longlat = TRUE)

>>>> nbLocal <- dnearneigh(coords, d1=-1, d2=25, longlat = TRUE) # both lines produce the same output

>>>> summary(nbLocal)

>>> Neighbour list object:

>>> Number of regions: 3

>>> Number of nonzero links: 4

>>> Percentage nonzero weights: 44.44444

>>> Average number of links: 1.333333

>>> Link number distribution:

>>>

>>> 1 2

>>> 2 1

>>> 2 least connected regions:

>>> 1 2 with 1 link

>>> 1 most connected region:

>>> 3 with 2 links


>>> Thanks

>>> Maël

>>>

>>>

>>> [[alternative HTML version deleted]]

>>>

>>> _______________________________________________

>>> R-sig-Geo mailing list

>>> [hidden email]

>>> https://stat.ethz.ch/mailman/listinfo/r-sig-geo

>>

>

> --

Roger Bivand

Department of Economics, Norwegian School of Economics,

Helleveien 30, N-5045 Bergen, Norway.

voice: +47 55 95 93 55; e-mail: [hidden email]

Editor-in-Chief of The R Journal, https://journal.r-project.org/index.html

http://orcid.org/0000-0003-2392-6140

https://scholar.google.no/citations?user=AWeghB0AAAAJ&hl=en


### Re: dnearneigh() from spdep: Points with the exact same location are not considered neighbours.

Thank you for your answer, (And sorry for the HTML posting).

The issue persists if I specify "GE" for the lower bound, but only when

the parameter longlat is set to TRUE (see example below).

Regarding the nature of my data, it is a series of records of Jews

arrested during the Holocaust in Italy. Those are point data, and some

people have been arrested at the same place and at the same time (hence

my problem). I am trying to assess spatial autocorrelation for a binary

attribute (whether they survived the Holocaust or not), and I plan to

use a Join-count method, for which I need a spatial weight matrix. Is

using Join-count on such a dataset wrong?

Best

Code:

library(data.table)

library(spdep)

pointstable <- data.table(XCoord=c(13.667029,13.667029,13.667028),

YCoord=c(42.772396,42.772396,42.772396))

print(pointstable)

coords <-cbind(pointstable$XCoord, pointstable$YCoord)

nbLocal<- dnearneigh(coords, d1=0, d2=25, longlat = TRUE, bound =

c("GE", "LE"))

summary(nbLocal)

nbLocal<- dnearneigh(coords, d1=0, d2=25, longlat = FALSE, bound =

c("GE", "LE"))

summary(nbLocal)

Output:

> print(pointstable)

XCoord YCoord

1: 13.66703 42.7724

2: 13.66703 42.7724

3: 13.66703 42.7724

> nbLocal<- dnearneigh(coords, d1=0, d2=25, longlat = TRUE, bound =

c("GE", "LE"))

> summary(nbLocal)

Neighbour list object:

Number of regions: 3

Number of nonzero links: 4

Percentage nonzero weights: 44.44444

Average number of links: 1.333333

Link number distribution:

1 2

2 1

2 least connected regions:

1 2 with 1 link

1 most connected region:

3 with 2 links

> nbLocal<- dnearneigh(coords, d1=0, d2=25, longlat = FALSE, bound =

c("GE", "LE"))

> summary(nbLocal)

Neighbour list object:

Number of regions: 3

Number of nonzero links: 6

Percentage nonzero weights: 66.66667

Average number of links: 2

Link number distribution:

2

3

3 least connected regions:

1 2 3 with 2 links

3 most connected regions:

1 2 3 with 2 links
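The asymmetry can be checked against first principles: with great-circle distance the duplicated pair is exactly 0 km apart, so a strict lower bound d > 0 excludes it. A base-R haversine sketch (illustrative only, not spdep's compiled code; `haversine` is a hypothetical helper):

```r
# Great-circle distance in km via the haversine formula (sketch).
haversine <- function(lon1, lat1, lon2, lat2, R = 6371) {
  to_rad <- pi / 180
  dlat <- (lat2 - lat1) * to_rad
  dlon <- (lon2 - lon1) * to_rad
  a <- sin(dlat / 2)^2 + cos(lat1 * to_rad) * cos(lat2 * to_rad) * sin(dlon / 2)^2
  2 * R * asin(sqrt(pmin(1, a)))
}

xc <- c(13.667029, 13.667029, 13.667028)
yc <- c(42.772396, 42.772396, 42.772396)

# Points 1 and 2 are exactly 0 km apart, so a strict bound d > d1 = 0 drops the pair.
haversine(xc[1], yc[1], xc[2], yc[2])  # 0
# Points 1 and 3 are a tiny but nonzero distance apart, well under 25 km.
haversine(xc[1], yc[1], xc[3], yc[3])
```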

On 12/04/2017 02:27, Roger Bivand wrote:

> Do not post HTML-mail, only plain text. Your example is not reproducible

> because you used HTML-mail.

>

> Please read the help file, the bounds are described as being between

> lower (greater than) and upper (less than or equal to) bounds. Since the

> distance between identical points is strictly zero, they are not

> neighbours because the distance must be > d1 and <= d2. If d1 is < 0, it

> is reset to 0, as it is assumed that a negative lower bound is a user

> error (and it would break the underlying compiled code).

>

> In any case, no reasonable cross-sectional spatial process has

> duplicated point (nugget) observations in situations in which spatial

> weights would be used (spatio-temporal panels will have, but then time

> differs).

>

> Hope this clarifies,

>

> Roger

>

> On Wed, 12 Apr 2017, Maël Le Noc via R-sig-Geo wrote:

>

>> Dear List

>>

>> As I was working on a project, I realized that when I use dnearneigh

>> from spdep, two (or more) points that have the exact same coordinates

>> are not considered neighbours and thus are not linked (even when the

>> lower bound is put to 0 or even to -1). See below for an example.

>> (However this does not happen if the parameter longlat is set to false)

>>

>> Does the function behave the same way for you? Am I missing something?

>> Is this an expected behavior? And if so, is there a way to change that?

>>

>> In the example below, points 1 and 2 are not connected to each other/are

>> not neighbours (as you can see since the both have only one link, to 3),

>> even though they have the exact same coordinates (and are thus less than

>> 25km apart), while point 3 is connected to both point 1 and 2.

>> If I want to assess autocorrelation using, for instance joincount.test,

>> this is then an issue...

>>

>>> library(data.table)

>>> library(spdep)

>>> pointstable <- data.table(XCoord=c(13.667029,13.667029,13.667028),

>>>                           YCoord=c(42.772396,42.772396,42.772396))

>>> print(pointstable)

>>       XCoord    YCoord

>> 1: 13.667029 42.772396

>> 2: 13.667029 42.772396

>> 3: 13.667028 42.772396

>>> coords <- cbind(pointstable$XCoord, pointstable$YCoord)

>>> nbLocal <- dnearneigh(coords, d1=0, d2=25, longlat = TRUE)

>>> nbLocal <- dnearneigh(coords, d1=-1, d2=25, longlat = TRUE) # both lines produce the same output

>>> summary(nbLocal)

>> Neighbour list object:

>> Number of regions: 3

>> Number of nonzero links: 4

>> Percentage nonzero weights: 44.44444

>> Average number of links: 1.333333

>> Link number distribution:

>>

>> 1 2

>> 2 1

>> 2 least connected regions:

>> 1 2 with 1 link

>> 1 most connected region:

>> 3 with 2 links


>> Thanks

>> Maël

>>

>>


>


### Re: raster and oceancolor L2 netcdf data

mapping (?) which is definitely not the right way to go. Calculate

chlorophyll and then map, that's how SeaDAS does it, even though the

remapping is the hard part. And apologies if I misread, just checking.

I have two algorithms in my roc package on GitHub in case they help

understanding how the calcs get done. Presumably you'll have locally

calibrated parameters for a local algo?

If you want to aggregate into a local map I think it's fair to group-by

directly on L2 pixel coords and then sum into a geographic map, without

worrying about swath-as-image at all. We've road tested doing this but want

the entire southern ocean eventually so it needs a bit of unrelated

preparation for the raw files.

I'd be happy to explore an R solution off list if you're interested. L2 is

surprisingly easy and efficient in R via GDAL.

(This is also a good example for future workflows for the planned stars

package imo.)

Cheers, Mike
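The band-ratio polynomial mentioned above has the generic OCx shape: chlorophyll is ten to a fourth-order polynomial in the log10 of a blue/green reflectance ratio. A hedged base-R sketch — the coefficients below are placeholders, not NASA's operational OC3/OC4 values, and `chl_ocx` is an illustrative name:

```r
# Generic OCx band-ratio chlorophyll sketch.
# a[1]..a[5] are placeholder coefficients -- substitute locally calibrated
# values or the official NASA sets before any real use.
chl_ocx <- function(rrs_blue, rrs_green, a = c(0.3, -2.8, 1.4, 0.5, -1.3)) {
  r <- log10(pmax(rrs_blue, 1e-6) / pmax(rrs_green, 1e-6))  # guard against zeros
  10^(a[1] + a[2] * r + a[3] * r^2 + a[4] * r^3 + a[5] * r^4)
}

# Applied cell-wise to two Rrs layers read as in the code quoted below, e.g.:
# chl <- overlay(rrs443, rrs547, fun = chl_ocx)
```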

On Wed, Apr 12, 2017, 22:35 Warner, David <[hidden email]> wrote:

> Thanks Mike!

>

> The goal is to estimate daily chlorophyll via band ratio polynomial

> equation for hundreds of days of data (hundreds of data files). Sounds

> like rather than finding a way to orthorectify in R I should learn to batch

> reproject using SeaDAS, which does produce a product that is in geotiff

> format, is orthorectified, and is readily mappable. I was trying to avoid

> that as the help and documentation available for doing that seems much less

> abundant. One file at a time is easy using the SeaDAS gui.

>

> Thanks very, very much for the other tricks. Not surprisingly, ggplot2

> comes through again with plots that look right!

> Cheers,

> Dave

>

>

>

> On Wed, Apr 12, 2017 at 7:01 AM, Michael Sumner <[hidden email]>

> wrote:

>

> You can't georeference these data without remapping the data, essentially

> treating the pixels as points. They have no natural regular grid form,

> except possibly a unique satellite-perspective one. The data are in 2D

> array form, but they have explicit "geolocation arrays", i.e. a longitude

> and latitude for every cell and not based on a regular mapping.

>

> R does not have tools for this directly from these data, though it can be

> treated as a resampling or modelling problem.

> You can use raster to get at the values of the locations easily enough,

> here's a couple of plotting options in case it's useful:

>

> u <- "https://github.com/dmwarn/Tethys/blob/master/A2016244185500.L2_LAC_OC.x.nc?raw=true"

> f <- basename(u)

> download.file(u, f, mode = "wb")

>

> library(raster)

> ## use raster to treat as raw point data, on long-lat locations

> rrs <- raster(f, varname = "geophysical_data/Rrs_412")

> longitude <- raster(f, varname = "navigation_data/longitude")

> latitude <- raster(f, varname = "navigation_data/latitude")

>

> ## plot in grid space, and add the geolocation space as a graticule

> plot(rrs)

> contour(longitude, add = TRUE)

> contour(latitude, add = TRUE)

>

> ## raw scaling against rrs values

> scl <- function(x) (x - min(x, na.rm = TRUE))/diff(range(x, na.rm = TRUE))

> plot(values(longitude), values(latitude), pch = ".", col =

> topo.colors(56)[scl(values(rrs)) * 55 + 1])

>

> ## ggplot

> library(ggplot2)

> d <- data.frame(x = values(longitude), y = values(latitude), rrs =

> values(rrs))

> ggplot(d, aes(x = x, y = y, colour = rrs)) + geom_point(pch = ".")

>

> ## might as well discard the missing values (depends on the other vars in

> the file)

> d <- d[!is.na(d$rrs), ]

> ggplot(d, aes(x = x, y = y, colour = rrs)) + geom_point(pch = 19, cex =

> 0.1)

>

> There are some MODIS and GDAL based packages that might be of use, but I

> haven't yet seen any R tool that does this remapping task at scale. (I

> believe the MODIS tools and the best warping tools in GDAL use thin-plate

> spline techniques).

>

> Some applications would use the observations as points (i.e. the ocean

> colour L3 bins as a daily aggregate from L2) and others use a direct

> remapping of the data as an array, for say high-resolution sea ice imagery.

>

> You might not need to do anything particularly complicated though, what's

> the goal?

>

> Cheers, Mike.

>

> On Wed, Apr 12, 2017, 20:06 Warner, David <[hidden email]> wrote:

>

> Greetings all

>

> I am trying to develop R code for processing L2 data (netcdf v4 files) from

> the Ocean Biology Processing Group.

>

> The data file I am working with to develop the code is at

> https://github.com/dmwarn/Tethys/blob/master/A2016244185500.L2_LAC_OC.x.nc

> and represents primarily Lake Michigan in the United States. The data were

> extracted from a global dataset by the oceancolor L1 and L2 data server,

> not by me.

>

> I have been using the code below to try to get the data into R and ready

> for processing but am having problems with dimensions and/or

> orthorectification.

>

> #extent of the scene obtained using nc_open and ncvar_get

> nc <- nc_open('A2016214184500.L2_LAC_OC.x.nc')

> lon <- ncvar_get(nc, "navigation_data/longitude")

> lat <- ncvar_get(nc, "navigation_data/latitude")

> minx <- min(lon)

> maxx <- max(lon)

> miny <- min(lat)

> maxy <- max(lat)

>

> #create extent object

> myext <- extent(-90.817, -81.92438, 40.46493, 47.14244)

>

> #create raster

> rrs.412 <- raster('A2016214184500.L2_LAC_OC.x.nc', var="geophysical_data/Rrs_412", ext=myext)

> rrs.412

> > rrs.412

> class : RasterLayer

> dimensions : 644, 528, 340032 (nrow, ncol, ncell)

> resolution : 1, 1 (x, y)

> extent : 0.5, 528.5, 0.5, 644.5 (xmin, xmax, ymin, ymax)

> coord. ref. : NA

> data source : /Users/dmwarner/Documents/MODIS/OC/

> A2016214184500.L2_LAC_OC.x.nc

> names : Remote.sensing.reflectance.at.412.nm

> zvar : geophysical_data/Rrs_412

>

> In spite of having tried to assign an extent, the raster extent is in rows

> and columns.

>

> Further, plotting the raster reveals that it is flipped on x axis and

> somewhat rotated relative to what it should look like. Even when flipped,

> it is still not orthorectified.

>

> How do I get the raster to have the correct extent and also get it

> orthorectified?

> Thanks,

> Dave Warner

>

> --

> David Warner

> Research Fisheries Biologist

> U.S.G.S. Great Lakes Science Center

> 1451 Green Road

> Ann Arbor, MI 48105

> 734-214-9392

>


>

> --

> Dr. Michael Sumner

> Software and Database Engineer

> Australian Antarctic Division

> 203 Channel Highway

> Kingston Tasmania 7050 Australia

>

>

>

>

> --

> David Warner

> Research Fisheries Biologist

> U.S.G.S. Great Lakes Science Center

> 1451 Green Road

> Ann Arbor, MI 48105

> 734-214-9392

> --

Dr. Michael Sumner

Software and Database Engineer

Australian Antarctic Division

203 Channel Highway

Kingston Tasmania 7050 Australia


### Re: raster and oceancolor L2 netcdf data

The goal is to estimate daily chlorophyll via band ratio polynomial

equation for hundreds of days of data (hundreds of data files). Sounds

like rather than finding a way to orthorectify in R I should learn to batch

reproject using SeaDAS, which does produce a product that is in geotiff

format, is orthorectified, and is readily mappable. I was trying to avoid

that as the help and documentation available for doing that seems much less

abundant. One file at a time is easy using the SeaDAS gui.

Thanks very, very much for the other tricks. Not surprisingly, ggplot2

comes through again with plots that look right!

Cheers,

Dave

On Wed, Apr 12, 2017 at 7:01 AM, Michael Sumner <[hidden email]> wrote:

> You can't georeference these data without remapping the data, essentially

> treating the pixels as points. They have no natural regular grid form,

> except possibly a unique satellite-perspective one. The data are in 2D

> array form, but they have explicit "geolocation arrays", i.e. a longitude

> and latitude for every cell and not based on a regular mapping.

>

> R does not have tools for this directly from these data, though it can be

> treated as a resampling or modelling problem.

> You can use raster to get at the values of the locations easily enough,

> here's a couple of plotting options in case it's useful:

>

> u <- "https://github.com/dmwarn/Tethys/blob/master/A2016244185500.L2_LAC_OC.x.nc?raw=true"

> f <- basename(u)

> download.file(u, f, mode = "wb")

>

> library(raster)

> ## use raster to treat as raw point data, on long-lat locations

> rrs <- raster(f, varname = "geophysical_data/Rrs_412")

> longitude <- raster(f, varname = "navigation_data/longitude")

> latitude <- raster(f, varname = "navigation_data/latitude")

>

> ## plot in grid space, and add the geolocation space as a graticule

> plot(rrs)

> contour(longitude, add = TRUE)

> contour(latitude, add = TRUE)

>

> ## raw scaling against rrs values

> scl <- function(x) (x - min(x, na.rm = TRUE))/diff(range(x, na.rm = TRUE))

> plot(values(longitude), values(latitude), pch = ".", col =

> topo.colors(56)[scl(values(rrs)) * 55 + 1])

>

> ## ggplot

> library(ggplot2)

> d <- data.frame(x = values(longitude), y = values(latitude), rrs =

> values(rrs))

> ggplot(d, aes(x = x, y = y, colour = rrs)) + geom_point(pch = ".")

>

> ## might as well discard the missing values (depends on the other vars in

> the file)

> d <- d[!is.na(d$rrs), ]

> ggplot(d, aes(x = x, y = y, colour = rrs)) + geom_point(pch = 19, cex =

> 0.1)

>

> There are some MODIS and GDAL based packages that might be of use, but I

> haven't yet seen any R tool that does this remapping task at scale. (I

> believe the MODIS tools and the best warping tools in GDAL use thin-plate

> spline techniques).

>

> Some applications would use the observations as points (i.e. the ocean

> colour L3 bins as a daily aggregate from L2) and others use a direct

> remapping of the data as an array, for say high-resolution sea ice imagery.

>

> You might not need to do anything particularly complicated though, what's

> the goal?

>

> Cheers, Mike.

>

> On Wed, Apr 12, 2017, 20:06 Warner, David <[hidden email]> wrote:

>

> Greetings all

>

> I am trying to develop R code for processing L2 data (netcdf v4 files) from

> the Ocean Biology Processing Group.

>

> The data file I am working with to develop the code is at

> https://github.com/dmwarn/Tethys/blob/master/A2016244185500.L2_LAC_OC.x.nc

> and represents primarily Lake Michigan in the United States. The data were

> extracted from a global dataset by the oceancolor L1 and L2 data server,

> not by me.

>

> I have been using the code below to try to get the data into R and ready

> for processing but am having problems with dimensions and/or

> orthorectification.

>

> #extent of the scene obtained using nc_open and ncvar_get

> nc <- nc_open('A2016214184500.L2_LAC_OC.x.nc')

> lon <- ncvar_get(nc, "navigation_data/longitude")

> lat <- ncvar_get(nc, "navigation_data/latitude")

> minx <- min(lon)

> maxx <- max(lon)

> miny <- min(lat)

> maxy <- max(lat)

>

> #create extent object

> myext <- extent(-90.817, -81.92438, 40.46493, 47.14244)

>

> #create raster

> rrs.412 <- raster('A2016214184500.L2_LAC_OC.x.nc', var="geophysical_data/Rrs_412", ext=myext)

> rrs.412

> > rrs.412

> class : RasterLayer

> dimensions : 644, 528, 340032 (nrow, ncol, ncell)

> resolution : 1, 1 (x, y)

> extent : 0.5, 528.5, 0.5, 644.5 (xmin, xmax, ymin, ymax)

> coord. ref. : NA

> data source : /Users/dmwarner/Documents/MODIS/OC/

> A2016214184500.L2_LAC_OC.x.nc

> names : Remote.sensing.reflectance.at.412.nm

> zvar : geophysical_data/Rrs_412

>

> In spite of having tried to assign an extent, the raster extent is in rows

> and columns.

>

> Further, plotting the raster reveals that it is flipped on x axis and

> somewhat rotated relative to what it should look like. Even when flipped,

> it is still not orthorectified.

>

> How do I get the raster to have the correct extent and also get it

> orthorectified?

> Thanks,

> Dave Warner

>

> --

> David Warner

> Research Fisheries Biologist

> U.S.G.S. Great Lakes Science Center

> 1451 Green Road

> Ann Arbor, MI 48105

> 734-214-9392

>


>

> --

> Dr. Michael Sumner

> Software and Database Engineer

> Australian Antarctic Division

> 203 Channel Highway

> Kingston Tasmania 7050 Australia

>

>

--

David Warner

Research Fisheries Biologist

U.S.G.S. Great Lakes Science Center

1451 Green Road

Ann Arbor, MI 48105

734-214-9392


### Re: raster and oceancolor L2 netcdf data

You can't georeference these data without remapping the data, essentially

treating the pixels as points. They have no natural regular grid form,

except possibly a unique satellite-perspective one. The data are in 2D

array form, but they have explicit "geolocation arrays", i.e. a longitude

and latitude for every cell and not based on a regular mapping.

R does not have tools for this directly from these data, though it can be

treated as a resampling or modelling problem.

You can use raster to get at the values of the locations easily enough,

here's a couple of plotting options in case it's useful:

u <- "https://github.com/dmwarn/Tethys/blob/master/A2016244185500.L2_LAC_OC.x.nc?raw=true"

f <- basename(u)

download.file(u, f, mode = "wb")

library(raster)

## use raster to treat as raw point data, on long-lat locations

rrs <- raster(f, varname = "geophysical_data/Rrs_412")

longitude <- raster(f, varname = "navigation_data/longitude")

latitude <- raster(f, varname = "navigation_data/latitude")

## plot in grid space, and add the geolocation space as a graticule

plot(rrs)

contour(longitude, add = TRUE)

contour(latitude, add = TRUE)

## raw scaling against rrs values

scl <- function(x) (x - min(x, na.rm = TRUE))/diff(range(x, na.rm = TRUE))

plot(values(longitude), values(latitude), pch = ".", col =

topo.colors(56)[scl(values(rrs)) * 55 + 1])

## ggplot

library(ggplot2)

d <- data.frame(x = values(longitude), y = values(latitude), rrs =

values(rrs))

ggplot(d, aes(x = x, y = y, colour = rrs)) + geom_point(pch = ".")

## might as well discard the missing values (depends on the other vars in

the file)

d <- d[!is.na(d$rrs), ]

ggplot(d, aes(x = x, y = y, colour = rrs)) + geom_point(pch = 19, cex = 0.1)

There are some MODIS and GDAL based packages that might be of use, but I

haven't yet seen any R tool that does this remapping task at scale. (I

believe the MODIS tools and the best warping tools in GDAL use thin-plate

spline techniques).

Some applications would use the observations as points (i.e. the ocean

colour L3 bins as a daily aggregate from L2) and others use a direct

remapping of the data as an array, for say high-resolution sea ice imagery.

You might not need to do anything particularly complicated though, what's

the goal?

Cheers, Mike.
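The points-based aggregation mentioned above (binning L2 pixels into a regular geographic grid) can be sketched in base R. `bin_to_grid`, the cell size, and the cell-mean choice are illustrative assumptions, not the official L3 binning scheme:

```r
# Bin swath pixels (lon, lat, value triples) onto a regular grid by cell mean.
# res is the cell size in degrees -- purely illustrative here.
bin_to_grid <- function(lon, lat, value, res = 0.05) {
  keep <- !is.na(value)                      # drop masked/missing pixels
  cx <- floor(lon[keep] / res)               # column index of each pixel
  cy <- floor(lat[keep] / res)               # row index of each pixel
  tapply(value[keep], list(cx, cy), mean)    # matrix of per-cell means
  # cell centres are (as.numeric(dimnames(...)) + 0.5) * res
}
```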

On Wed, Apr 12, 2017, 20:06 Warner, David <[hidden email]> wrote:

Greetings all

I am trying to develop R code for processing L2 data (netcdf v4 files) from

the Ocean Biology Processing Group.

The data file I am working with to develop the code is at

https://github.com/dmwarn/Tethys/blob/master/A2016244185500.L2_LAC_OC.x.nc

and represents primarily Lake Michigan in the United States. The data were

extracted from a global dataset by the oceancolor L1 and L2 data server,

not by me.

I have been using the code below to try to get the data into R and ready

for processing but am having problems with dimensions and/or

orthorectification.

#extent of the scene obtained using nc_open and ncvar_get

nc <- nc_open('A2016214184500.L2_LAC_OC.x.nc')

lon <- ncvar_get(nc, "navigation_data/longitude")

lat <- ncvar_get(nc, "navigation_data/latitude")

minx <- min(lon)

maxx <- max(lon)

miny <- min(lat)

maxy <- max(lat)

#create extent object

myext <- extent(-90.817, -81.92438, 40.46493, 47.14244)

#create raster

rrs.412 <- raster('A2016214184500.L2_LAC_OC.x.nc', var="geophysical_data/Rrs_412", ext=myext)

rrs.412

> rrs.412

class : RasterLayer

dimensions : 644, 528, 340032 (nrow, ncol, ncell)

resolution : 1, 1 (x, y)

extent : 0.5, 528.5, 0.5, 644.5 (xmin, xmax, ymin, ymax)

coord. ref. : NA

data source : /Users/dmwarner/Documents/MODIS/OC/

A2016214184500.L2_LAC_OC.x.nc

names : Remote.sensing.reflectance.at.412.nm

zvar : geophysical_data/Rrs_412

In spite of having tried to assign an extent, the raster extent is in rows

and columns.

Further, plotting the raster reveals that it is flipped on x axis and

somewhat rotated relative to what it should look like. Even when flipped,

it is still not orthorectified.

How do I get the raster to have the correct extent and also get it

orthorectified?

Thanks,

Dave Warner

--

David Warner

Research Fisheries Biologist

U.S.G.S. Great Lakes Science Center

1451 Green Road

Ann Arbor, MI 48105

734-214-9392

--

Dr. Michael Sumner

Software and Database Engineer

Australian Antarctic Division

203 Channel Highway

Kingston Tasmania 7050 Australia


### raster and oceancolor L2 netcdf data

I am trying to develop R code for processing L2 data (netcdf v4 files) from

the Ocean Biology Processing Group.

The data file I am working with to develop the code is at

https://github.com/dmwarn/Tethys/blob/master/A2016244185500.L2_LAC_OC.x.nc

and represents primarily Lake Michigan in the United States. The data were

extracted from a global dataset by the oceancolor L1 and L2 data server,

not by me.

I have been using the code below to try to get the data into R and ready

for processing but am having problems with dimensions and/or

orthorectification.

#extent of the scene obtained using nc_open and ncvar_get

nc <- nc_open('A2016214184500.L2_LAC_OC.x.nc')

lon <- ncvar_get(nc, "navigation_data/longitude")

lat <- ncvar_get(nc, "navigation_data/latitude")

minx <- min(lon)

maxx <- max(lon)

miny <- min(lat)

maxy <- max(lat)

#create extent object

myext <- extent(-90.817, -81.92438, 40.46493, 47.14244)

#create raster

rrs.412 <- raster('A2016214184500.L2_LAC_OC.x.nc', var="geophysical_data/Rrs_412", ext=myext)

rrs.412

> rrs.412

class : RasterLayer

dimensions : 644, 528, 340032 (nrow, ncol, ncell)

resolution : 1, 1 (x, y)

extent : 0.5, 528.5, 0.5, 644.5 (xmin, xmax, ymin, ymax)

coord. ref. : NA

data source : /Users/dmwarner/Documents/MODIS/OC/

A2016214184500.L2_LAC_OC.x.nc

names : Remote.sensing.reflectance.at.412.nm

zvar : geophysical_data/Rrs_412

In spite of having tried to assign an extent, the raster extent is in rows

and columns.

Further, plotting the raster reveals that it is flipped on x axis and

somewhat rotated relative to what it should look like. Even when flipped,

it is still not orthorectified.

How do I get the raster to have the correct extent and also get it

orthorectified?

Thanks,

Dave Warner

--

David Warner

Research Fisheries Biologist

U.S.G.S. Great Lakes Science Center

1451 Green Road

Ann Arbor, MI 48105

734-214-9392


### Re: variance-covariance matrix for GMerrorsar

> Is this the link to the package?

>

> https://r-forge.r-project.org/R/?group_id=182

Yes, and:

install.packages("spdep", repos="http://R-Forge.R-project.org")

should work. But do try sphet::spreg() too, it is more flexible than the

older spdep GM functions.

Roger
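Once a covariance matrix is available (the vcov component mentioned above, or the var component from sphet::spreg), the rmultnorm() draw itself needs nothing beyond base R. `rmvn` below is an illustrative stand-in, and the commented usage assumes the newly added vcov component exists:

```r
# Draw n samples from N(mu, vmat) using a Cholesky factor (base R only).
rmvn <- function(n, mu, vmat) {
  L <- chol(vmat)                        # upper-triangular, vmat = t(L) %*% L
  z <- matrix(rnorm(n * length(mu)), n)  # n x k standard normals
  sweep(z %*% L, 2, mu, "+")             # correlate, then shift by the mean
}

# e.g., assuming the vcov component added on R-Forge:
# sims <- rmvn(10000, coef(model2), model2$vcov)
# mwtp <- sims * Pbar
# apply(mwtp, 2, quantile, c(0.025, 0.975))
```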

>

> Chelsea

>

> On Tue, Apr 11, 2017 at 10:59 PM, Qiuhua Ma <[hidden email]> wrote:

>

>> Thanks for your quick reply. You are right. Marginal wtp should take into

>> account rho for spatial lag model.

>>

>> I still would like to use GMerrorsar. Can you please send me the source

>> package?

>>

>> Best,

>>

>> Chelsea

>>

>> On Tue, Apr 11, 2017 at 7:54 AM, Roger Bivand <[hidden email]> wrote:

>>

>>> On Tue, 11 Apr 2017, Qiuhua Ma wrote:

>>>

>>> Dear list,

>>>>

>>>> I want to use bootstrapping to derive confidence intervals for marginal

>>>> wtp after GMerrorsar command.

>>>>

>>>> It works for stsls since covariance matrix is directly available.

>>>> However,

>>>> I cannot find covariance matrix for GMerrorsar.

>>>>

>>>> For example, the following code works for stsls:

>>>>

>>>> model1.beta <- coef(model1)

>>>>

>>>> model1.vcov <- summary(model1)$var

>>>>

>>>> model1.sim <- rmultnorm(10000, mu = model1.beta, vmat = model1.vcov)

>>>>

>>>> model1.mwtp <- model1.sim * Pbar

>>>>

>>>> model1.ci <- apply(model1.mwtp, 2, quantile, c(0.025, 0.975))

>>>>

>>>

>>> The DGP for this model is (I - \rho W)^{-1} (X \beta + e), so I'm in great

>>> doubt about whether your proposal is correct (model1.vcov has one more

>>> row and column than X has columns, so including \rho); the first element of

>>> model1.beta is \rho.

>>>

>>>

>>>>

>>>> when I apply the same code for GMerrorsar:

>>>>

>>>>

>>>> model2.beta <- coef(model2)

>>>>

>>>> model2.vcov <- summary(model2)$var

>>>>

>>>>

>>>> model2.vcov

>>>>>

>>>>

>>>> NULL

>>>>

>>>>

>>>> How can I obtain covariance matrix for GMerrorsar?

>>>>

>>>> Reading the code, you'll see where the matrices occur. Running under

>>> debug, you can assign them outside the environment of the function if you

>>> like (use <<- ). I've added a vcov component in the returned object (source

>>> on R-Forge, I can send a source package or a Windows binary package).

>>>

>>> You should also look at sphet::spreg, which does return a var component.

>>> Please note that you should think of the DGP first and foremost, the coef

>>> and var may return the values for what you are treating as nuisance parts

>>> of the model. Getting the distribution of the willingness to pay also

>>> probably involves them and their variability.

>>>

>>> Have you considered getting the WTP marginal from a Bayesian approach?

>>>

>>> Hope this helps,

>>>

>>> Roger

>>>

>>>

>>>

>>>> Chelsea

>>>>


>>>>

>>>>

>>> --

>>> Roger Bivand

>>> Department of Economics, Norwegian School of Economics,

>>> Helleveien 30, N-5045 Bergen, Norway.

>>> voice: +47 55 95 93 55; e-mail: [hidden email]

>>> Editor-in-Chief of The R Journal, https://journal.r-project.org/index.html

>>> http://orcid.org/0000-0003-2392-6140

>>> https://scholar.google.no/citations?user=AWeghB0AAAAJ&hl=en

>>>

>>

>>

>

--

Roger Bivand

Department of Economics, Norwegian School of Economics,

Helleveien 30, N-5045 Bergen, Norway.

voice: +47 55 95 93 55; e-mail: [hidden email]

Editor-in-Chief of The R Journal, https://journal.r-project.org/index.html

http://orcid.org/0000-0003-2392-6140

https://scholar.google.no/citations?user=AWeghB0AAAAJ&hl=en


### Re: dnearneigh() from spdep: Points with the exact same location are not considered neighbours.

Do not post HTML-mail, only plain text. Your example is not reproducible

because you used HTML-mail.

Please read the help file, the bounds are described as being between lower

(greater than) and upper (less than or equal to) bounds. Since the

distance between identical points is strictly zero, they are not

neighbours because the distance must be > d1 and <= d2. If d1 is < 0, it

is reset to 0, as it is assumed that a negative lower bound is a user

error (and it would break the underlying compiled code).

In any case, no reasonable cross-sectional spatial process has duplicated

point (nugget) observations in situations in which spatial weights would

be used (spatio-temporal panels will have, but then time differs).

Hope this clarifies,

Roger

On Wed, 12 Apr 2017, Maël Le Noc via R-sig-Geo wrote:

> Dear List

>

> As I was working on a project, I realized that when I use dnearneigh

> from spdep, two (or more) points that have the exact same coordinates

> are not considered neighbours and thus are not linked (even when the

> lower bound is put to 0 or even to -1). See below for an example.

> (However this does not happen if the parameter longlat is set to false)

>

> Does the function behave the same way for you? Am I missing something?

> Is this an expected behavior? And if so, is there a way to change that?

>

> In the example below, points 1 and 2 are not connected to each other/are

> not neighbours (as you can see since the both have only one link, to 3),

> even though they have the exact same coordinates (and are thus less than

> 25km apart), while point 3 is connected to both point 1 and 2.

> If I want to assess autocorrelation using, for instance joincount.test,

> this is then an issue...

>

>> library(data.table)

>> library(spdep)

>> pointstable <- data.table(XCoord=c(13.667029,13.667029,13.667028),

>>                           YCoord=c(42.772396,42.772396,42.772396))

>> print(pointstable)

>       XCoord    YCoord

> 1: 13.667029 42.772396

> 2: 13.667029 42.772396

> 3: 13.667028 42.772396

>> coords <- cbind(pointstable$XCoord, pointstable$YCoord)

>> nbLocal <- dnearneigh(coords, d1=0, d2=25, longlat = TRUE)

>> nbLocal <- dnearneigh(coords, d1=-1, d2=25, longlat = TRUE) # both lines produce the same output

>> summary(nbLocal)

> Neighbour list object:

> Number of regions: 3

> Number of nonzero links: 4

> Percentage nonzero weights: 44.44444

> Average number of links: 1.333333

> Link number distribution:

>

> 1 2

> 2 1

> 2 least connected regions:

> 1 2 with 1 link

> 1 most connected region:

> 3 with 2 links


> Thanks

> Maël

>

>


Roger Bivand

Department of Economics, Norwegian School of Economics,

Helleveien 30, N-5045 Bergen, Norway.

voice: +47 55 95 93 55; e-mail: [hidden email]

Editor-in-Chief of The R Journal, https://journal.r-project.org/index.html

http://orcid.org/0000-0003-2392-6140

https://scholar.google.no/citations?user=AWeghB0AAAAJ&hl=en

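For readers who nevertheless need coincident points linked (e.g. before joincount.test), one hedged workaround is to build the neighbour index list directly with an inclusive lower bound. `incl_neigh` is an illustrative helper, not part of spdep, and uses planar distances; long-lat data would need a great-circle distance matrix instead:

```r
# Inclusive-bound neighbour list from planar coordinates (sketch).
# Coincident points (distance exactly 0) are kept as neighbours of each other.
incl_neigh <- function(coords, d2) {
  dmat <- as.matrix(dist(coords))      # pairwise Euclidean distances
  lapply(seq_len(nrow(dmat)), function(i) {
    nb <- which(dmat[i, ] <= d2)       # inclusive: 0 <= d <= d2
    nb[nb != i]                        # drop the self-neighbour
  })
}
```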

### Re: variance-covariance matrix for GMerrorsar

Is this the link to the package?

https://r-forge.r-project.org/R/?group_id=182

Chelsea

On Tue, Apr 11, 2017 at 10:59 PM, Qiuhua Ma <[hidden email]> wrote:

> Thanks for your quick reply. You are right. Marginal wtp should take into

> account rho for spatial lag model.

>

> I still would like to use GMerrorsar. Can you please send me the source

> package?

>

> Best,

>

> Chelsea

>

> On Tue, Apr 11, 2017 at 7:54 AM, Roger Bivand <[hidden email]> wrote:

>

>> On Tue, 11 Apr 2017, Qiuhua Ma wrote:

>>

>> Dear list,

>>>

>>> I want to use bootstrapping to derive confidence intervals for marginal

>>> wtp after GMerrorsar command.

>>>

>>> It works for stsls since covariance matrix is directly available.

>>> However,

>>> I cannot find covariance matrix for GMerrorsar.

>>>

>>> For example, the following code works for stsls:

>>>

>>> model1.beta <- coef(model1)

>>>

>>> model1.vcov <- summary(model1)$var

>>>

>>> model1.sim <- rmultnorm(10000, mu = model1.beta, vmat = model1.vcov)

>>>

>>> model1.mwtp <- model1.sim * Pbar

>>>

>>> model1.ci <- apply(model1.mwtp, 2, quantile, c(0.025, 0.975))

>>>

>>

>> The DGP for this model is (I - \rho W)^{-1} (X \beta + e), so I'm in great

>> doubt about whether your proposal is correct (model1.vcov has one more

>> row and column than X has columns, so including \rho); the first element of

>> model1.beta is \rho.

>>

>>

>>>

>>> when I apply the same code for GMerrorsar:

>>>

>>>

>>> model2.beta <- coef(model2)

>>>

>>> model2.vcov <- summary(model2)$var

>>>

>>>

>>> model2.vcov

>>>>

>>>

>>> NULL

>>>

>>>

>>> How can I obtain covariance matrix for GMerrorsar?

>>>

>> Reading the code, you'll see where the matrices occur. Running under

>> debug, you can assign them outside the environment of the function if you

>> like (use <<-). I've added a vcov component in the returned object (source

>> on R-Forge, I can send a source package or a Windows binary package).

>>

>> You should also look at sphet::spreg, which does return a var component.

>> Please note that you should think of the DGP first and foremost, the coef

>> and var may return the values for what you are treating as nuisance parts

>> of the model. Getting the distribution of the willingness to pay also

>> probably involves them and their variability.

>>

>> Have you considered getting the WTP marginal from a Bayesian approach?

>>

>> Hope this helps,

>>

>> Roger

>>

>>

>>

>>> Chelsea

>>>

>>> [[alternative HTML version deleted]]

>>>

>>> _______________________________________________

>>> R-sig-Geo mailing list

>>> [hidden email]

>>> https://stat.ethz.ch/mailman/listinfo/r-sig-geo

>>>

>>>

>> --

>> Roger Bivand

>> Department of Economics, Norwegian School of Economics,

>> Helleveien 30, N-5045 Bergen, Norway.

>> voice: +47 55 95 93 55; e-mail: [hidden email]

>> Editor-in-Chief of The R Journal, https://journal.r-project.org/index.html

>> http://orcid.org/0000-0003-2392-6140

>> https://scholar.google.no/citations?user=AWeghB0AAAAJ&hl=en

>>

>

>

[[alternative HTML version deleted]]

_______________________________________________

R-sig-Geo mailing list

[hidden email]

https://stat.ethz.ch/mailman/listinfo/r-sig-geo
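As an aside for readers of the archive: the parametric-bootstrap step in the quoted code can be reproduced with sphet::spreg, which, as Roger notes, does return a var component. The formula, the objects dat and lw, and the mean price Pbar below are illustrative placeholders, not objects from the thread, and the position of the price coefficient depends on the model actually fitted.

```r
# Sketch under assumptions: 'dat' has columns y, price, x1; 'lw' is a listw
# object; 'Pbar' is the sample mean price. None of these come from the thread.
library(sphet)
library(MASS)

model2 <- spreg(y ~ price + x1, data = dat, listw = lw, model = "error")

b <- drop(model2$coefficients)   # point estimates
V <- model2$var                  # variance-covariance matrix returned by spreg

sims <- mvrnorm(10000, mu = b, Sigma = V)  # draw coefficient vectors
mwtp <- sims[, 2] * Pbar                   # price coefficient: adjust the index
quantile(mwtp, c(0.025, 0.975))            # 95% percentile interval
```

This mirrors the rmultnorm() call in the quoted code; MASS::mvrnorm is used here only because it is more widely available.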


### dnearneigh() from spdep: Points with the exact same location are not considered neighbours.

As I was working on a project, I realized that when I use dnearneigh

from spdep, two (or more) points that have the exact same coordinates

are not considered neighbours and thus are not linked (even when the

lower bound is put to 0 or even to -1). See below for an example.

(However, this does not happen if the parameter longlat is set to FALSE.)

Does the function behave the same way for you? Am I missing something?

Is this expected behavior? And if so, is there a way to change it?

In the example below, points 1 and 2 are not connected to each other/are

not neighbours (as you can see, since they both have only one link, to 3),

even though they have the exact same coordinates (and are thus less than

25km apart), while point 3 is connected to both point 1 and 2.

If I want to assess autocorrelation using, for instance joincount.test,

this is then an issue...

> library(data.table)
> library(spdep)
> pointstable <- data.table(XCoord=c(13.667029,13.667029,13.667028),
+                           YCoord=c(42.772396,42.772396,42.772396))

> print(pointstable)
      XCoord    YCoord

1: 13.667029 42.772396

2: 13.667029 42.772396

3: 13.667028 42.772396

> coords <- cbind(pointstable$XCoord, pointstable$YCoord)
> nbLocal <- dnearneigh(coords, d1=0, d2=25, longlat = TRUE)
> nbLocal <- dnearneigh(coords, d1=-1, d2=25, longlat = TRUE) # both lines produce the same output

> summary(nbLocal)
Neighbour list object:

Number of regions: 3

Number of nonzero links: 4

Percentage nonzero weights: 44.44444

Average number of links: 1.333333

Link number distribution:

1 2

2 1

2 least connected regions:

1 2 with 1 link

1 most connected region:

3 with 2 links


Thanks

Maël

[[alternative HTML version deleted]]

_______________________________________________

R-sig-Geo mailing list

[hidden email]

https://stat.ethz.ch/mailman/listinfo/r-sig-geo
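A possible workaround, offered here as a sketch rather than anything confirmed in the thread: build the neighbour list directly from a great-circle distance matrix, so that pairs at zero distance are kept explicitly instead of being dropped by the d1 bound.

```r
library(sp)     # for spDists()
library(spdep)  # for the nb class and card()

coords <- cbind(c(13.667029, 13.667029, 13.667028),
                c(42.772396, 42.772396, 42.772396))

d <- spDists(coords, longlat = TRUE)   # pairwise great-circle distances, in km

# Neighbours = all other points within 25 km, including those at distance 0
nb <- lapply(seq_len(nrow(d)), function(i) {
  js <- setdiff(which(d[i, ] <= 25), i)
  if (length(js) == 0L) 0L else as.integer(js)   # 0L marks "no neighbours"
})
class(nb) <- "nb"
attr(nb, "region.id") <- as.character(seq_along(nb))

card(nb)   # with these coordinates, each point gets the other two as neighbours
```

The resulting nb object can then be passed to nb2listw() and joincount.test() as usual.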

### bug in plotting rasterLayer?

I found that when I try to plot 3 rasters on top of each other, the third

raster plot causes the plot device coordinates to shift to the right.

I can reproduce this error easily:

library(raster)

f <- system.file("external/test.grd", package="raster")

r <- raster(f)

plot(r)

plot(r, add=TRUE)

plot(r, add=TRUE)

plot(r, add=TRUE)

For me, the 3rd plot with add=TRUE causes a coordinate system shift.

I am running R v3.3.3 with raster package v2.5-8.

Are others finding the same thing?

--

Pascal Title

PhD candidate | Rabosky Lab <http://www.raboskylab.org/>

Dept of Ecology and Evolutionary Biology

University of Michigan

[hidden email]

pascaltitle.weebly.com

[[alternative HTML version deleted]]

_______________________________________________

R-sig-Geo mailing list

[hidden email]

https://stat.ethz.ch/mailman/listinfo/r-sig-geo
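One thing worth trying, offered as an assumption rather than a confirmed diagnosis: each plot() call on a RasterLayer also draws a legend, which re-lays-out the plot region, so suppressing the legend on the overlay calls may keep the coordinates aligned.

```r
library(raster)

# Same reproducible example as above
f <- system.file("external/test.grd", package = "raster")
r <- raster(f)

plot(r)                              # first call: axes and legend as usual
plot(r, add = TRUE, legend = FALSE)  # overlays: no legend, so the plot
plot(r, add = TRUE, legend = FALSE)  # region is not resized on each call
plot(r, add = TRUE, legend = FALSE)
```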