Chapter 7 Spatial Analysis

Spatial analysis provides an expansive analytical landscape, with theoretical underpinnings dating to at least the 1950s (Berry and Marble 1968); it has contributed to the development of geographic information systems as well as spatial tools in more statistically oriented programming environments like R. We won’t attempt thorough coverage of these methods; for more focused treatment of R-based methods, we refer the reader to sources such as Lovelace, Nowosad, and Muenchow (2019) and Hijmans (n.d.).

In this chapter, we’ll continue working with spatial data and explore spatial analysis methods. We’ll look at a selection of useful geospatial abstraction and analysis methods, also called geoprocessing tools in a GIS. The R spatial world has grown in recent years to include an increasingly good array of tools. This chapter will focus on vector GIS methods; by “vector” here we don’t mean the vector data objects of R nomenclature (Python’s pandas package, which uses a similar data structure, calls these “series” instead), but rather feature geometries and their spatial relationships, based on the coordinate reference system.

7.1 Data Frame Operations

But before we get into those specifically spatial operations, it’s important to remember that feature data are also data frames, so we can use the methods we already know on them. For instance, we can look at properties of variables and then filter for features that meet a criterion, like all climate station points above 2,000 m elevation, or all above 38°N latitude. To be able to work with latitude as a variable, we’ll need to use remove=FALSE when we call st_as_sf (the default removes the coordinate variables) to retain LONGITUDE and LATITUDE as attributes.
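A minimal sketch of that idea, using the sierraFeb data from igisci (the object names here are just for illustration):

library(sf); library(tidyverse); library(igisci)
# keep LONGITUDE and LATITUDE as attribute columns while building geometry
sierraFebSf <- st_as_sf(sierraFeb, coords = c("LONGITUDE", "LATITUDE"),
                        crs = 4326, remove = FALSE)
sierraNorth <- sierraFebSf %>% filter(LATITUDE >= 38) # latitude still usable as a variable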

Adding a basemap with maptiles: We’d like to have a basemap, so we’ll create one with the maptiles package (install it if you don’t have it). Warning: the get_tiles function goes online to get the basemap data, so if you don’t have a good internet connection or the site goes down, this may fail. We can then display the basemap with tm_rgb.

For temperature, we want a color scheme running from blue (cold) to red (warm), which is the reverse of RColorBrewer’s RdBu palette; we can get that by reversing RdBu with rev (which took me a long time to figure out; color schemes are much more challenging than you might think, because there are many highly varied uses of color). In the map code below, tmap’s built-in "hcl.blue_red" continuous scale gives a similar blue-to-red scheme.
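For instance, a one-line sketch of that reversal:

coldhot <- rev(brewer.pal(11, "RdBu")) # blue (cold) at the low end, red (warm) at the high end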

library(tmap); library(RColorBrewer); library(sf); library(tidyverse); 
library(maptiles); library(igisci)
tmap_mode("plot")
newname <- unique(str_sub(sierraFeb$STATION_NAME, 1, 
                          str_locate(sierraFeb$STATION_NAME, ",")-1))
sierraFeb2000 <- st_as_sf(bind_cols(sierraFeb,STATION=newname) %>% 
                          dplyr::select(-STATION_NAME) %>% 
                          filter(ELEVATION >= 2000, !is.na(TEMPERATURE)), 
                 coords=c("LONGITUDE","LATITUDE"), crs=4326)
sierraBase <- get_tiles(sierraFeb2000)
tm_shape(sierraBase) + 
  tm_rgb(tm_mv("red", "green", "blue"), col.scale = tm_scale_rgb(maxValue = 255)) +
  tm_shape(sierraFeb2000) + 
  tm_dots(fill =  "TEMPERATURE", #midpoint=NA,
          fill.scale=tm_scale_continuous(values="hcl.blue_red"),
          size=1) +
  tm_text(text = "STATION", size=0.5, auto.placement=T, xmod=0.5, ymod=0.5) +
  tm_graticules(lines=F)

FIGURE 7.1: Plotting filtered data: stations above 2,000 m elevation, with a basemap

Now let’s include LATITUDE. Let’s see what we get by filtering for both ELEVATION >= 2000 and LATITUDE >= 38:

sierraFeb %>%
  filter(ELEVATION >= 2000 & LATITUDE >= 38)
## # A tibble: 1 × 7
##   STATION_NAME     COUNTY ELEVATION LATITUDE LONGITUDE PRECIPITATION TEMPERATURE
##   <chr>            <fct>      <dbl>    <dbl>     <dbl>         <dbl>       <dbl>
## 1 BODIE CALIFORNI… Mono       2551.     38.2     -119.          39.6        -4.4

The only one left is Bodie (Figure 7.2). Maybe that’s why Bodie, a ghost town now, has such a reputation for being so friggin’ cold (at least for California). I’ve been snowed on there in summer.


FIGURE 7.2: A Bodie scene, from Bodie State Historic Park (https://www.parks.ca.gov/)

7.1.1 Using grouped summaries, and filtering by a selection

We’ve been using February Sierra climate data to demonstrate various things, but this wasn’t the original form of the downloaded data, so let’s look at the original data and use some dplyr methods to restructure it, in this case deriving annual summaries from the monthly normals.

We looked at the very useful group_by ... summarize() group-summary process earlier, in section 3.4.5. We’ll use this along with a selection process to create an annual Sierra climate data frame we can map and analyze. The California monthly normals were downloaded from NOAA’s National Centers for Environmental Information, where you can select the desired parameters, monthly normals as the frequency, and limit the data to one state.

First we’ll have a quick look at the data to see how it’s structured. These were downloaded as monthly normals for the State of California, as of 2010, so each station has 12 months coded 201001:201012, with the same ELEVATION, LATITUDE, and LONGITUDE repeated, but varying monthly values for climate variables like MLY-PRCP-NORMAL.

head(read_csv(ex("sierra/908277.csv")),n=15)
## # A tibble: 15 × 11
##    STATION    STATION_NAME ELEVATION LATITUDE LONGITUDE   DATE `MLY-PRCP-NORMAL`
##    <chr>      <chr>        <chr>     <chr>    <chr>      <dbl>             <dbl>
##  1 GHCND:USC… TWENTYNINE … 602       34.12806 -116.036… 201001             13.2 
##  2 GHCND:USC… TWENTYNINE … 602       34.12806 -116.036… 201002             15.0 
##  3 GHCND:USC… TWENTYNINE … 602       34.12806 -116.036… 201003             11.4 
##  4 GHCND:USC… TWENTYNINE … 602       34.12806 -116.036… 201004              3.3 
##  5 GHCND:USC… TWENTYNINE … 602       34.12806 -116.036… 201005              2.29
##  6 GHCND:USC… TWENTYNINE … 602       34.12806 -116.036… 201006              0.25
##  7 GHCND:USC… TWENTYNINE … 602       34.12806 -116.036… 201007             12.2 
##  8 GHCND:USC… TWENTYNINE … 602       34.12806 -116.036… 201008             20.6 
##  9 GHCND:USC… TWENTYNINE … 602       34.12806 -116.036… 201009             10.2 
## 10 GHCND:USC… TWENTYNINE … 602       34.12806 -116.036… 201010              5.08
## 11 GHCND:USC… TWENTYNINE … 602       34.12806 -116.036… 201011              5.08
## 12 GHCND:USC… TWENTYNINE … 602       34.12806 -116.036… 201012             14.7 
## 13 GHCND:USC… BUTTONWILLO… 82        35.4047  -119.4731 201001             32.5 
## 14 GHCND:USC… BUTTONWILLO… 82        35.4047  -119.4731 201002             30.0 
## 15 GHCND:USC… BUTTONWILLO… 82        35.4047  -119.4731 201003             30.7 
## # ℹ 4 more variables: `MLY-SNOW-NORMAL` <dbl>, `MLY-TAVG-NORMAL` <dbl>,
## #   `MLY-TMAX-NORMAL` <dbl>, `MLY-TMIN-NORMAL` <dbl>

To get just the Sierra data, it seemed easiest to provide a list of relevant county names to filter the counties, do a bit of field renaming, then read in a previously created selection of Sierra weather/climate stations.

sierraCounties <- st_make_valid(CA_counties) %>%
  filter(NAME %in% c("Alpine","Amador","Butte","Calaveras","El Dorado",
                     "Fresno","Inyo","Kern","Lassen","Madera","Mariposa",
                     "Mono","Nevada","Placer","Plumas","Sacramento","Shasta",
                     "Sierra","Tehama","Tulare","Tuolumne","Yuba"))
normals <- read_csv(ex("sierra/908277.csv")) %>%
  mutate(STATION = str_sub(STATION,7,str_length(STATION)))
sierraStations <- read_csv(ex("sierra/sierraStations.csv"))

To get annual values, we’ll use the stations as groups in a group_by %>% summarize process. For values that stay the same for a station (LONGITUDE, LATITUDE, ELEVATION), we’ll use first() to pick just one of the 12 identical monthly values. For values that vary monthly, we’ll sum() the monthly precipitations and take the mean() of monthly temperatures to get appropriate annual values. We’ll also use a right_join to keep only the stations that are in the Sierra. Have a look at this script to make sure you understand what it’s doing; it combines several elements you’ve been introduced to before. At the end, we’ll use st_as_sf to make sf data out of the data frame, retaining the LONGITUDE and LATITUDE variables in case we want to use them as separate variables (Figure 7.3).

sierraAnnual <- right_join(sierraStations,normals,by="STATION") %>%
  filter(!is.na(STATION_NA)) %>% 
    dplyr::select(-STATION_NA) %>%
    group_by(STATION_NAME) %>% summarize(LONGITUDE = first(LONGITUDE),
                                         LATITUDE = first(LATITUDE),
                                         ELEVATION = first(ELEVATION),
                                         PRECIPITATION = sum(`MLY-PRCP-NORMAL`),
                                         TEMPERATURE = mean(`MLY-TAVG-NORMAL`)) %>%
    mutate(STATION_NAME = str_sub(STATION_NAME,1,str_length(STATION_NAME)-6)) %>%
  filter(PRECIPITATION > 0) %>% filter(TEMPERATURE > -100) %>%
  st_as_sf(coords = c("LONGITUDE", "LATITUDE"), crs=4326, remove=F)
ggplot() + 
  geom_sf(data=CA_counties, aes(), fill="gray") +
  geom_sf(data=sierraCounties, aes(), fill="white") + 
  geom_sf(data=sierraAnnual, aes(col=PRECIPITATION)) +
  scale_color_gradient(low="orange", high="blue") 

FIGURE 7.3: Sierra data

7.2 Spatial Analysis Operations

Again, there is a lot more to spatial analysis than we have time to cover. But we’ll explore some particularly useful spatial analysis operations, especially those that contribute to statistical analysis methods we’ll be looking at soon. We’ll start by continuing to look at subsetting or filtering methods, but ones that use spatial relationships to identify what we want to retain or remove from consideration.

7.2.1 Using topology to subset

Using spatial relationships can be useful in filtering our data, and there are quite a few topological relationships that can be explored. See Lovelace, Nowosad, and Muenchow (2019) for a lot more about topological operations. We’ll look at a relatively simple one that identifies whether a feature is within another one, and apply this method to filter for features within a selection of five counties, which we’ll start by identifying by name.

nSierraCo <- CA_counties %>%
  filter(NAME %in% c("Plumas","Butte","Sierra","Nevada","Yuba"))

Then we’ll use those counties to select towns (places) that occur within them …

CA_places <- st_read(ex("sierra/CA_places.shp"))
nCAsel <- lengths(st_within(CA_places, nSierraCo)) > 0 # to get TRUE & FALSE
nSierraPlaces <- CA_places[nCAsel,]
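By the way, sf’s st_filter wraps this predicate-and-subset pattern; a one-line sketch of the same selection:

nSierraPlaces <- st_filter(CA_places, nSierraCo, .predicate = st_within)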

… and do the same for the sierraFeb weather stations (Figure 7.4).

sierra <- st_as_sf(read_csv(ex("sierra/sierraFeb.csv")), 
                   coords=c("LONGITUDE","LATITUDE"), crs=4326)
nCAselSta <- lengths(st_within(sierra, nSierraCo)) > 0 # to get TRUE & FALSE
nSierraStations <- sierra[nCAselSta,]
library(tmap)
tmap_mode("plot")

tm_basemap() + 
  tm_shape(nSierraCo) + tm_borders() + tm_text("NAME", size=0.7) +
#  tm_shape(nSierraStations) + tm_symbols(fill="blue") +
  tm_shape(nSierraPlaces) + tm_symbols(alpha=0.99) + 
  tm_text("AREANAME", size=0.5)

FIGURE 7.4: Northern Sierra stations and places

So far, the above is just subsetting for a map, which may be all we want to do, but in the next section we’ll apply this selection to a distance function to explore a method using a reduced data set.

7.2.2 Centroid

Related to the topological concept of features inside other features is creating a new feature in the middle of an existing one, specifically a point placed in the middle of a polygon: a centroid. Centroids are useful when you need point data for some kind of analysis, but that point needs to represent the polygon it’s contained within. The methods we’ll see in the code below include:

  • st_centroid() : to derive a single point for each tract, at its geometric center. Note that a centroid is not guaranteed to be topologically contained within its polygon: for an annulus (“doughnut”) shape, for example, it falls in the hole, and st_point_on_surface() is the function to use when the point must lie on the feature (see the sketch after this list).
  • st_make_valid() : to make an invalid geometry valid (or just check for validity). This function has become essential now that sf supports spherical as well as planar geometry: polygons originally built from a planar projection can contain “slivers” where boundaries slightly under- or overlap, and st_make_valid() makes minor (and typically invisible) corrections to them.
  • st_bbox() : reads the bounding box, or spatial extent, of any dataset, which we can then use to set the map scale to focus on that area. In this case, we’ll focus on the census centroids rather than, for instance, the statewide CAfreeways.
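Here’s the annulus sketch promised above, with a hand-built “doughnut” polygon (hypothetical geometry, just for illustration):

library(sf)
donut <- st_sfc(st_polygon(list(
  rbind(c(0,0), c(10,0), c(10,10), c(0,10), c(0,0)),  # outer ring
  rbind(c(3,3), c(3,7), c(7,7), c(7,3), c(3,3)))))    # hole in the middle
st_contains(donut, st_centroid(donut), sparse=FALSE)         # FALSE: the centroid falls in the hole
st_contains(donut, st_point_on_surface(donut), sparse=FALSE) # TRUE: the point is always on the polygon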

Let’s see the effect of the centroids and bbox. We’ll start with all county centroids (Figure 7.5).

library(tidyverse); library(sf); library(igisci)
CA_counties <- st_make_valid(CA_counties) # needed fix for data
ggplot() +
  geom_sf(data=CA_counties) +
  geom_sf(data=st_centroid(CA_counties))

FIGURE 7.5: California county centroids

Here’s an example that also applies the bounding box to establish a mapping scale that covers the Bay Area, while some of the data (TRI locations and CAfreeways) are statewide (Figure 7.6).

library(tidyverse)
library(sf)
library(igisci)
BayAreaTracts <- st_make_valid(BayAreaTracts)
censusCentroids <- st_centroid(BayAreaTracts)
TRI_sp <- st_as_sf(read_csv(ex("TRI/TRI_2017_CA.csv")), 
                   coords = c("LONGITUDE", "LATITUDE"), 
                   crs=4326) # simple way to specify coordinate reference
bnd <- st_bbox(censusCentroids)
ggplot() +
 geom_sf(data = BayAreaCounties, aes(fill = NAME)) +
 geom_sf(data = censusCentroids) +
 geom_sf(data = CAfreeways, color = "grey", alpha=0.5) +
 geom_sf(data = TRI_sp, color = "yellow") +
 coord_sf(xlim = c(bnd[1], bnd[3]), ylim = c(bnd[2], bnd[4])) +
   labs(title="Bay Area Counties, Freeways and Census Tract Centroids")

FIGURE 7.6: Map scaled to cover Bay Area tracts using a bbox

7.2.3 Distance

Distance is a fundamental spatial measure, used not only to create spatial data (distance between points in surveying, or distance to GNSS satellites) but also to analyze it. Note that we can be referring either to planar distances (from projected coordinate systems) or to spherical, great-circle distances (from latitude and longitude). Two common spatial operations involving distance in vector spatial analysis are (a) deriving distances among features, and (b) creating buffer polygons at a specific distance from features. In raster spatial analysis, we’ll look at deriving distances from target features to each raster cell.
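As a quick illustration of the two kinds of distance (a sketch assuming the sierraFeb2000 points created earlier; EPSG 26910 is NAD83 / UTM zone 10N):

st_distance(sierraFeb2000[1:2,])                          # GCS input: spherical great-circle distances (m)
st_distance(st_transform(sierraFeb2000[1:2,], crs=26910)) # projected input: planar distances (m)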

But first, let’s take a brief excursion up the Nile River, using great circle distances to derive a longitudinal profile and channel slope…

7.2.3.1 Great circle distances

Earlier we created a simple river profile using Cartesian coordinates in metres for x, y, and z (elevation), but what if our xy locations are in geographic coordinates of latitude and longitude? Using the haversine method, we can derive great-circle (“as the crow flies”) distances between latitude/longitude pairs. The following function uses a haversine algorithm described at https://www.movable-type.co.uk/scripts/latlong.html (“Calculate Distance, Bearing and More Between Latitude/Longitude Points” (n.d.)) to derive these distances in metres, given lat/long pairs in degrees as haversineD(lat1,lon1,lat2,lon2):

haversineD <- function(lat1deg,lon1deg,lat2deg,lon2deg){
  lat1 <- lat1deg/180*pi; lat2 <- lat2deg/180*pi # convert to radians
  lon1 <- lon1deg/180*pi; lon2 <- lon2deg/180*pi
  a <- sin((lat2-lat1)/2)^2 + cos(lat1)*cos(lat2)*sin((lon2-lon1)/2)^2
  c <- 2*atan2(sqrt(a),sqrt(1-a))
  c * 6.371e6 # mean earth radius is ~ 6.371 million metres
}

We’ll use this to help us derive channel slope, where we need the longitudinal distance in metres along the Nile River but have locations in geographic coordinates (crs 4326). But first, here are a few quick checks on the function, since I remember that 1 minute (1/60 degree) of latitude is equal to 1 nautical mile (NM), or ~1.15 statute miles, and the same applies to longitude at the equator; a minute of longitude shrinks to half that at 60 degrees north or south.

paste("1'lat=",haversineD(30,120,30+1/60,120)/0.3048/5280,"miles at 30°N")
paste("1'lat=",haversineD(30,120,30+1/60,120)/0.3048/6076,"NM at 30°N")
paste("1'lon=",haversineD(0,0,0,1/60)/0.3048/6076,"NM at the equator")
paste("1'lon=",haversineD(60,0,60,1/60)/0.3048/6076,"NM at 60°N")
## [1] "1'lat= 1.15155540233128 miles at 30°N"
## [1] "1'lat= 1.000693305515 NM at 30°N"
## [1] "1'lon= 1.00069330551494 NM at the equator"
## [1] "1'lon= 0.500346651434429 NM at 60°N"

Thanks to Stephanie Kate in my Fall 2020 Environmental Data Science course at SFSU, we have geographic coordinates and elevations in metres for a series of points along the Nile River, including various tributaries. We’ll focus on the main stem and the Blue Nile, and generate a map (Figure 7.7) and longitudinal profile (Figure 7.8) using great-circle distances along the channel.

library(igisci); library(tidyverse)
bNile <- read_csv(ex("Nile/NilePoints.csv")) %>% 
  filter(nile %in% c("main","blue")) %>%
  arrange(elev_meter) %>%
  mutate(d=0, longd=0, s=1e-6) # initiate distance, longitudinal distance & slope
tmap_mode("plot")
for(i in 2:length(bNile$x)){
  bNile$d[i] <- haversineD(bNile$y[i],bNile$x[i],bNile$y[i-1],bNile$x[i-1])
  bNile$longd[i] <- bNile$longd[i-1] + bNile$d[i]
  bNile$s[i-1] <- (bNile$elev_meter[i]-bNile$elev_meter[i-1])/bNile$d[i]
}
library(sf); library(tmap); library(maptiles); library(tmaptools)
Nile_sf <- st_as_sf(bNile,coords=c("x","y"),crs=4326) |>
  mutate(s_pct = s*100)
tm_basemap(server ="Esri.NatGeoWorldMap") +
  tm_graticules(alpha=0.5) + 
  tm_shape(Nile_sf) +
    tm_dots(size=0.3, fill="elev_meter",
            fill.scale=tm_scale_continuous(values="brewer.yl_or_rd"),
            fill.legend=tm_legend(reverse=T))

FIGURE 7.7: Nile River points, colored by elevation

ggplot(bNile, aes(longd,elev_meter)) + geom_line(aes(col=s), linewidth=1.5) +
  scale_color_gradient(low="green", high="red") +
  ggtitle("Nile River to Blue Nile longitudinal profile")

FIGURE 7.8: Nile River channel slope as range of colors from green to red, with great circle channel distances derived using the haversine method

7.2.3.2 Distances among features

The sf st_distance and terra distance functions derive distances between features, either among features of one data set object or between all features in one and all features in another data set.

To see how this works, we’ll look at a purposefully small dataset: a selection of Marble Mountains soil CO2 sampling sites and in-cave water quality sampling sites. We’ll start by reading in the data, and filtering to just get cave water samples and a small set of nearby soil CO2 sampling sites:

library(igisci); library(sf); library(tidyverse)
soilCO2all <- st_read(ex("marbles/co2july95.shp"))
cave_H2Osamples <- st_read(ex("marbles/samples.shp")) %>%
  filter((SAMPLES_ID >= 50) & (SAMPLES_ID < 60)) # these are the cave samples
soilCO2 <- soilCO2all %>% filter(LOC > 20) # soil CO2 samples in the area

Figure 7.9 shows the six selected soil CO2 samples.

library(tmap); library(maptiles)
tmap_mode("plot")
bbox <- tmaptools::bb(soilCO2, ext=1.1) # extends the map area a bit from the points
tm_basemap(server="OpenTopoMap") + tm_graticules(alpha=0.5) + 
  tm_shape(soilCO2, bbox=bbox) + 
    tm_symbols(fill="CO2_", fill.scale=tm_scale_continuous(values="parks.denali")) +
  tm_shape(soilCO2) + tm_text("LOC", size=0.5, col="orange")

FIGURE 7.9: Selection of soil CO2 sampling sites, July 1995

If you just provide the six soil CO2 sample points as a single feature data set input to the st_distance function, it returns a matrix of distances between each pair of points, with a diagonal of zeros where the distance would be from a point to itself:

st_distance(soilCO2)
## Units: [m]
##           1         2          3          4        5         6
## 1    0.0000  370.5894 1155.64981 1207.62172 3333.529 1439.2366
## 2  370.5894    0.0000  973.73559 1039.45041 3067.106 1248.9347
## 3 1155.6498  973.7356    0.00000   75.14093 2207.019  283.6722
## 4 1207.6217 1039.4504   75.14093    0.00000 2174.493  237.7024
## 5 3333.5290 3067.1061 2207.01903 2174.49276    0.000 1938.2985
## 6 1439.2366 1248.9347  283.67215  237.70236 1938.299    0.0000

Then we’ll look at distances between this same set of soil CO2 samples with water samples collected in caves, where the effect of elevated soil CO2 values might influence solution processes reflected in cave waters (Figure 7.10).

library(tmap)
tmap_mode("plot")
bbox <- tmaptools::bb(st_union(cave_H2Osamples, soilCO2),ext=1.1)
tm_basemap("OpenTopoMap") + #tm_graticules() +
  tm_shape(cave_H2Osamples, bbox=bbox) + 
    tm_symbols(fill="CATOT", fill.scale=tm_scale_continuous(values="brewer.blues"))+ 
  tm_shape(soilCO2) +
    tm_symbols(fill="CO2_",  fill.scale=tm_scale_continuous(values="brewer.reds"))

FIGURE 7.10: Selection of soil CO2 and in-cave water samples

soilwater <- st_distance(soilCO2, cave_H2Osamples)
soilwater
## Units: [m]
##          [,1]     [,2]     [,3]      [,4]      [,5]      [,6]      [,7]
## [1,] 2384.210 2271.521 2169.301 1985.6093 1980.0314 2004.1697 1905.9543
## [2,] 2029.748 1920.783 1826.579 1815.2653 1820.0496 1835.9465 1798.1216
## [3,] 1611.381 1480.688 1334.287  842.8228  846.3210  863.1026  849.7313
## [4,] 1637.998 1506.938 1357.582  781.9741  782.4215  801.6262  776.0732
## [5,] 1590.469 1578.730 1532.465 1526.3315 1567.5677 1520.1399 1820.3651
## [6,] 1506.358 1375.757 1220.000  567.8210  578.1919  589.1443  638.8615

In this case, the six soil CO2 samples are the rows, and the seven cave water sample locations are the columns. We aren’t really relating the values, just looking at distances. An analysis of this data might not be very informative, because the caves aren’t very near the soil samples and conduit cave hydrology doesn’t lend itself to Euclidean distance, but the purpose of this example is just to comprehend the results of sf::st_distance or, similarly, the terra::distance function.

7.2.3.3 Distance to the nearest feature, abstracted from distance matrix

However, let’s process the matrix a bit to find the distance from each soil CO2 sample to the closest cave water sample:

soilCO2d <- soilCO2 %>% mutate(dist2cave = min(soilwater[1,])) # initialize the new column
for (i in 1:length(soilwater[,1])) soilCO2d[i,"dist2cave"] <- min(soilwater[i,]) # row minimum = nearest cave sample
soilCO2d %>% dplyr::select(DESCRIPTIO, dist2cave) %>% st_set_geometry(NULL)
##                                              DESCRIPTIO     dist2cave
## 1                Super Sink moraine, low point W of 241 1905.9543 [m]
## 2      Small sink, first in string on hike to SuperSink 1798.1216 [m]
## 3        meadow on bench above Mbl Valley, 25' E of PCT  842.8228 [m]
## 4        red fir forest S of upper meadow, 30' W of PCT  776.0732 [m]
## 5            saddle betw Lower Skyhigh (E) & Frying Pan 1520.1399 [m]
## 6 E end of wide soil-filled 10m fissure N of lower camp  567.8210 [m]

… or, since we can also look at distances to lines or polygons, we can find the distance from CO2 sampling locations to the closest stream (Figure 7.11):

library(igisci); library(sf); library(tidyverse)
soilCO2all <- st_read(ex("marbles/co2july95.shp"))
streams <- st_read(ex("marbles/streams.shp"))
strCO2 <- st_distance(soilCO2all, streams)
strCO2d <- soilCO2all %>% mutate(dist2stream = min(strCO2[1,]))
for (i in 1:length(strCO2[,1])) strCO2d[i,"dist2stream"] <- min(strCO2[i,])
tm_basemap("OpenTopoMap") + tm_graticules() + 
  tm_shape(streams) + tm_lines(col="blue") +
  tm_shape(strCO2d) + tm_symbols(fill="dist2stream")

FIGURE 7.11: Distance from CO2 samples to closest streams (not including lakes)
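Incidentally, the initialize-then-loop pattern used in the last two code chunks just extracts row minima from the distance matrix, so it can be written more compactly with apply; a sketch (drop_units avoids carrying the units class through apply):

soilCO2d <- soilCO2 %>%
  mutate(dist2cave = apply(units::drop_units(soilwater), 1, min))   # nearest cave sample
strCO2d <- soilCO2all %>%
  mutate(dist2stream = apply(units::drop_units(strCO2), 1, min))    # nearest stream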

7.2.3.4 Nearest feature detection

As we just saw, the matrix derived from st_distance applied to feature data sets can be used for further analyses, such as the distance to the nearest feature we just derived. But there’s a more direct approach using a function that identifies the nearest feature: st_nearest_feature. We’ll look at this with the previously filtered northern Sierra places and weather stations, starting by using st_nearest_feature to create an index vector of the nearest place to each station, then grabbing its location geometry:

nearest_Place <- st_nearest_feature(nSierraStations, nSierraPlaces)
near_Place_loc <- nSierraPlaces$geometry[nearest_Place]

Then we’ll add the distance to that nearest place to our nSierraStations sf. Note that we’re using st_distance with two vector inputs as before, but ending up with just one distance per feature (instead of a matrix) by setting by_element=TRUE.

nSierraStations <- nSierraStations %>%
  mutate(d2Place = st_distance(nSierraStations, near_Place_loc, by_element=TRUE),
         d2Place = units::drop_units(d2Place))

… and of course map the results (Figure 7.12):

tm_basemap("OpenStreetMap") + tm_graticules(alpha=0.5) + 
  tm_shape(nSierraCo) + tm_borders() + tm_text("NAME") +
  tm_shape(nSierraStations) + tm_symbols(fill="d2Place") +
  tm_shape(nSierraPlaces) + tm_symbols(fill="red", fill_alpha=0.5, size=0.5, shape=24)

FIGURE 7.12: Distance to towns (places) from weather stations

That’s probably enough on distance to the nearest feature to see how it works; for an example with a larger data set, Appendix A7.2 looks at the proximity of TRI sites to health data at census tracts.

7.2.4 Buffers

Creating buffers, or polygons defining the area within some distance of a feature, is commonly used in GIS. Since you need to specify that distance (or read it from a variable for each feature), you need to know what the horizontal distance units of your data are. If in a geographic coordinate system (GCS), these will be decimal degrees, and 1 degree is a long way: about 111 km (or 69 miles) of latitude, while 1 degree of longitude is 111 km * cos(latitude). If in UTM, the horizontal distance will be in metres, but in the US, state plane coordinates are typically in feet.
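A quick check of those degree figures, reusing the haversineD function from section 7.2.3.1:

haversineD(0, 0, 1, 0)/1000    # 1 degree of latitude: ~111 km
haversineD(38, 0, 38, 1)/1000  # 1 degree of longitude at 38°N: ~111*cos(38°), about 88 km

So let’s read the trails shapefile from the Marble Mountains and look for its units: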

library(igisci)
library(sf); library(tidyverse)
trails <- st_read(ex("marbles/trails.shp"))

Having confirmed that we’re in metres, we’ll create a 100 m buffer this way (Figure 7.13).

trail_buff0 <- st_buffer(trails,100)
ggplot(trail_buff0) + geom_sf()

FIGURE 7.13: 100 m trail buffer, Marble Mountains

7.2.4.1 The units package

Since the spatial data are in UTM, we can just specify the distance in metres. If the data were instead in decimal degrees of latitude and longitude (GCS), we would need to tell the function that we’re providing metres, using units::set_units(100, "m") instead of just 100 in the 100 m buffer example above.
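For example, a sketch of that usage (reprojecting the trails data to EPSG:4326 purely for illustration):

trailsGCS <- st_transform(trails, crs=4326) # now in decimal degrees
trailBuffGCS <- st_buffer(trailsGCS, dist=units::set_units(100, "m")) # still a 100 m buffer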

7.2.5 Spatial overlay: union and intersection

Overlay operations are described in the sf cheat sheet under “Geometry operations”. These are useful to explore, and the two we’ll look at here are union and intersection.

Normally these methods take multiple inputs, but we’ll start with one that can also be used to dissolve boundaries: st_union, which performs a dissolve if only one input is provided (Figure 7.14):

trail_buff <- st_union(trail_buff0)
ggplot(trail_buff) + geom_sf()

FIGURE 7.14: Unioned trail buffer, dissolving boundaries

For a clearer example with the normal multiple input, we’ll intersect a 100 m buffer around streams and a 100 m buffer around trails …

streams <- st_read(ex("marbles/streams.shp"))
trail_buff <- st_buffer(trails, 100)
str_buff <- st_buffer(streams,100)
strtrInt <- st_intersection(trail_buff,str_buff)

…to show areas that are close to streams and trails (Figure 7.15):

ggplot(strtrInt) + geom_sf(fill="green") + 
  geom_sf(data=trails, col="red") +
  geom_sf(data=streams, col="blue")

FIGURE 7.15: Intersection of trail and stream buffers

Or how about a union of these two buffers? We’ll also dissolve the internal boundaries left by the two-input union by wrapping it in a second, single-input st_union (Figure 7.16):

strtrUnion <- st_union(st_union(trail_buff,str_buff))
ggplot(strtrUnion) + geom_sf(fill="green") + 
  geom_sf(data=trails, col="red") +
  geom_sf(data=streams, col="blue")

FIGURE 7.16: Union of two sets of buffer polygons

7.2.6 Clip with st_crop

Clipping a GIS layer to a rectangle (or to a polygon) is often useful. We’ll clip to a rectangle based on latitude and longitude limits we specify; however, since our data are in UTM, we’ll need st_transform to convert the clip polygon to the same coordinate system (Figure 7.17).

xmin=-123.21; xmax=-123.18; ymin=41.55; ymax=41.57
clipMatrix <- rbind(c(xmin,ymin),c(xmin,ymax),c(xmax,ymax),
                    c(xmax,ymin),c(xmin,ymin))
clipGCS <- st_sfc(st_polygon(list(clipMatrix)),crs=4326)
bufferCrop <- st_crop(strtrUnion,st_transform(clipGCS,crs=st_crs(strtrUnion)))
bnd <- st_bbox(bufferCrop)
ggplot(bufferCrop) + geom_sf(fill="green") + 
  geom_sf(data=trails, col="red") +
  geom_sf(data=streams, col="blue") +
  coord_sf(xlim = c(bnd[1], bnd[3]), ylim = c(bnd[2], bnd[4]))

FIGURE 7.17: Cropping with specified x and y limits

7.2.6.1 sf or terra for vector spatial operations?

Note: there are other geometric operations in sf beyond what we’ve looked at. See the cheat sheet and other resources at https://r-spatial.github.io/sf .

The terra system also has many vector tools (such as terra::union and terra::buffer) that work with and create SpatVector spatial data objects. As noted earlier, these S4 spatial data objects can be created from (S3) sf data with terra::vect, and converted back with sf::st_as_sf. We’ll mostly focus on sf for vector and terra for raster work, but to learn more about SpatVector data and operations in terra, see https://rspatial.org.

For instance, terra has an operation similar to one of the things we just did, but instead of using a union to dissolve the internal boundaries as st_union did, it uses aggregate to remove boundaries between polygons (all of them here, or just those sharing the same codes if a field is specified):

library(terra) # for vect, buffer, and aggregate
trails <- vect(ex("marbles/trails.shp"))
trailbuff <- aggregate(buffer(trails,100)) # buffer, then dissolve internal boundaries
plot(trailbuff)

7.2.7 Spatial join with st_join

A spatial join can do many things, but we’ll just look at its simplest application: connecting a point with the polygon it’s in. In this case, we want to join the attribute data in BayAreaTracts to EPA Toxic Release Inventory (TRI) point data (from factories and other point sources), and then display a few ethnicity variables from the census, now associated spatially with the TRI point locations. We’ll be making better maps later; this is a quick plot() display to show that the spatial join worked (Figure 7.18).

library(igisci); library(tidyverse); library(sf)
TRI_sp <- st_as_sf(read_csv(ex("TRI/TRI_2017_CA.csv")), 
                   coords = c("LONGITUDE", "LATITUDE"), crs=4326) %>%
  st_join(st_make_valid(BayAreaTracts)) %>%
  filter(!is.na(TRACT)) %>%  # removes points not in BayAreaTracts
  dplyr::select(POP2012:HISPANIC)
plot(TRI_sp)

FIGURE 7.18: TRI points with census variables added via a spatial join

7.2.8 Further exploration of spatial analysis

We’ve looked at a small selection of spatial analysis functions; there’s a lot more we can do, and the R spatial world is rapidly growing. There are many other methods discussed in the “Spatial data operations” and “Geometry operations” chapters of Geocomputation (https://geocompr.robinlovelace.net/, Lovelace, Nowosad, and Muenchow (2019)) that are worth exploring, and still others in the terra package described at https://rspatial.org, Hijmans (n.d.).

For example, getting a good handle on coordinate systems is certainly an important area to learn more about; you’ll find coverage of transformation and reprojection methods in the “Reprojecting geographic data” chapter and data access methods in the “Geographic data I/O” chapter of Geocomputation, and in the “Coordinate Reference Systems” section of https://rspatial.org. The Simple Features for R reference site (Simple Features for R (n.d.)) includes a complete reference for all of its functions, as does https://rspatial.org for terra.

And then there’s analysis of raster data, but we’ll leave that for the next chapter.

7.3 Exercises: Spatial Analysis

Exercise 7.1 Maximum Elevation. Assuming you wrote them in the last chapter, read in your western states “data/W_States.shp” and peaks “data/peaks.shp” data, then use a spatial join to add the peak points to the states, providing a new maximum-elevation attribute, and display that using geom_sf_text() with the state polygons. Note that joining from the states to the peaks works because there’s a one-to-one correspondence between states and maximum peaks. (If you didn’t write them, you’ll need to repeat the code that built them in the previous chapter.)

Exercise 7.2 Using two shape files of your own choice, not from igisci, read them in as sf data, and perform a spatial join to join the polygons to the point data. Then display the point feature data frame, and create a ggplot scatter plot or boxplot graph where the two variables are related, preferably with a separate categorical variable to provide color.

Exercise 7.3 From the above spatially joined data create a map where the points are colored by data brought in from the spatial join; can be either a categorical or continuous variable.

Exercise 7.4 Transect Buffers: Using data from the SFmarine folder in igisci extdata, transects.shp (ex("SFmarine/transects.shp")) and mainland.shp (to make a nicer map showing land for reference), use st_buffer to create 1000 m buffers around the transect points, merged to remove boundaries (Figure 7.19).


FIGURE 7.19: Transect Buffers (goal)

Exercise 7.5 Create an sf that represents all areas within 50 km of a TRI facility in California that has >10,000 pounds of total stack air release for all gases, clipped (intersected) with the state of California as provided in the CA counties data set. Then use this to create a map of California showing the clipped buffered areas in red, with California counties and those selected large-release TRI facilities shown as black dots. To create a variable representing 50 km to provide to the dist parameter in st_buffer (needed since the spatial data are in GCS), use the units::set_units function, something like this: D50km <- units::set_units(50, "km"). See the first figure in this chapter for your goal.

References

Berry, Brian, and Duane Marble. 1968. Spatial Analysis: A Reader in Statistical Geography. Prentice-Hall.
“Calculate Distance, Bearing and More Between Latitude/Longitude Points.” n.d. Movable Type Ltd. https://www.movable-type.co.uk/scripts/latlong.html.
Hijmans, Robert J. n.d. Spatial Data Science. https://rspatial.org.
Lovelace, Robin, Jakub Nowosad, and Jannes Muenchow. 2019. Geocomputation with R. CRC Press. https://geocompr.robinlovelace.net/.
Simple Features for R. n.d. https://r-spatial.github.io/sf/.