---
engine: julia
---

# Agricultural fields

Satellite images are widely used for land cover classification and
various related measurements within agricultural fields, especially
when the size of these fields is large or when physical access is
limited. In this chapter, we will illustrate the use of spectral
indices derived from a [Sentinel-2](https://en.wikipedia.org/wiki/Sentinel-2)
image to streamline the measurements of area and perimeter in a
large agricultural field in Brazil.

**TOOLS COVERED:** `Proj`, `Map`, `SpectralIndex`, `ModeFilter`, `Potrace`,
`describe`, `boundingbox`, `crs`, `utmsouth`, `spectralindices`, `spectralbands`,
`area`, `perimeter`, `viewer`

**MODULES:**

```{julia}
# framework
using GeoStats

# IO modules
using GeoIO

# viz modules
import CairoMakie as Mke
```

```{julia}
#| echo: false
#| output: false
Mke.activate!(type = "png")
```

## Data

We will use an image from the
[Harmonized Sentinel-2 MSI](https://developers.google.com/earth-engine/datasets/catalog/COPERNICUS_S2_SR_HARMONIZED)
dataset. The image is the result of cloud masking and averaging over the
period from 2025-07-01 to 2025-07-30, followed by a download within a
specified bounding box of latitude and longitude values:

```{julia}
image = GeoIO.load("data/sentinel.tif")

describe(image)
```

```{julia}
boundingbox(image.geometry)
```

We convert the `crs` of the image from

```{julia}
crs(image)
```

to a UTM coordinate reference system with the `Proj` transform, so that
we can perform measurements in length units (e.g., `m`):

```{julia}
img = image |> Proj(utmsouth(22))

img |> viewer
```
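
We can confirm that the conversion took effect by querying the new `crs`
of the projected geotable:

```{julia}
crs(img)
```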

::: {.callout-note}

The UTM zone `22` was obtained from the longitude coordinate of the
centroid of the domain via the formula

```{julia}
lon = coords(centroid(image.geometry)).lon

frac = ustrip((lon + 183u"°") / 6)

zone = round(Int, frac)
```

The corresponding latitude coordinate is negative, meaning the domain
is in the `:south` hemisphere. Hence, `utmsouth(22)` was selected as
the target `CRS`.
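
We can double-check the hemisphere by querying the latitude coordinate of the
centroid (a minimal sketch; it reuses `coords` and `centroid` from the formula
above and assumes the latitude field is named `lat`, analogous to `lon`):

```{julia}
# sketch: a negative latitude confirms the southern hemisphere
lat = coords(centroid(image.geometry)).lat

lat < 0u"°"
```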

:::

## Objectives

Our objective in this application is two-fold. First, we would like to assign
a categorical value to each "pixel" of the image as a label for [geostatistical
learning](https://www.frontiersin.org/articles/10.3389/fams.2021.689393/full)
tasks. Second, we would like to convert the region of the image that lies inside
the agricultural field into a polygonal area for measurements of area and
perimeter.

The ability to convert geospatial data between these two representations (i.e.,
"raster" vs. "vector") based on a categorical variable is key for advanced
geospatial data science workflows.

## Methodology

In order to assign meaning to the pixels of the image, we first need to visualize
the land as if we were looking at it with our naked eyes. Then, we can highlight
some regions of the image based on selected formulas, known as
[spectral indices](https://awesome-ee-spectral-indices.readthedocs.io/en/latest),
which are computed with the spectral bands measured by the Sentinel-2 satellite.
Finally, we can introduce a threshold for the values of the indices to obtain a
categorical variable and extract polygonal areas of interest.

The proposed methodology has the following steps:

1. Visualization of land in familiar color space
2. Identification of useful spectral indices
3. Segmentation of image based on threshold
4. Measurement of area and perimeter of field

### Visualization of land

The Sentinel-2 dataset stores the red (`R`), green (`G`) and blue (`B`) bands in the
4th, 3rd and 2nd channels of the image:

```{julia}
rgb = img |> Select(4 => "R", 3 => "G", 2 => "B")
```

We create an auxiliary function that takes the individual channels as inputs and produces
a color object from [Colors.jl](https://github.com/JuliaGraphics/Colors.jl) as the output.
We use an intensity parameter $\lambda$ to scale the channels and brighten the image:

```{julia}
λ = 5 # intensity parameter

ascolor(r, g, b) = RGB(λ * r, λ * g, λ * b)
```
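
For bright pixels, the scaled channels may exceed one. If that ever produces
unexpected colors, a clamped variant of the function (a small sketch with a
hypothetical name, not required for this dataset) keeps the values within the
valid range of color components:

```{julia}
# sketch: same scaling as `ascolor`, but clamped to the [0, 1] range
ascolorclamped(r, g, b) = RGB(clamp(λ * r, 0, 1), clamp(λ * g, 0, 1), clamp(λ * b, 0, 1))
```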

We can map the `ascolor` function to all the pixels of the image using the `Map` transform:

```{julia}
color = rgb |> Map(["R", "G", "B"] => ascolor => "RGB")
```

The `viewer` displays the colors that are familiar to us:

```{julia}
color |> viewer
```

The agricultural field of interest is displayed in orange, surrounded by green vegetation.

### Spectral indices

Given that the image is made of two groups of vegetation (green vs. orange), we look up all the
`spectralindices` that are adequate for the application. We print the first 20 vegetation indices
along with their short and long names:

```{julia}
isvegetation(ind) = ind.application_domain == "vegetation"

inds = filter(isvegetation, spectralindices())

for ind in first(inds, 20)
  println(ind.short_name, ": ", ind.long_name)
end
```

Each of these indices is computed with specific `spectralbands`. Consider the `"NDVI"` vegetation
index as an example. It is computed in terms of the red (`R`) and near-infrared (`N`) bands:

```{julia}
for band in spectralbands("NDVI")
  println(band.short_name, ": ", band.long_name)
end
```
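
For reference, this index is the normalized difference of the two bands:

$$
\text{NDVI} = \frac{N - R}{N + R}
$$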

The `R` band was already selected above for the visualization of the land. The `N` band is
stored in the 8th channel of the image:

```{julia}
n = img |> Select(8 => "N")
```

We can compute the `"NDVI"` spectral index with the `SpectralIndex` transform by providing a
geotable with all the necessary bands:

```{julia}
ndvi = [rgb n] |> SpectralIndex("NDVI")

ndvi |> viewer
```

We see that the index assigns values close to one to pixels with green vegetation and values
close to zero inside the agricultural field. Let's compare this result with other spectral
indices. The `"MGRVI"` and `"SI"` indices are also computed in terms of the `R`, `G` and `B`
bands:

```{julia}
mgrvi = [rgb n] |> SpectralIndex("MGRVI")
si = [rgb n] |> SpectralIndex("SI")

spec = [ndvi mgrvi si]
```

```{julia}
fig = Mke.Figure()
ax1 = Mke.Axis(fig[1, 1], title = "RGB")
ax2 = Mke.Axis(fig[1, 2], title = "NDVI")
ax3 = Mke.Axis(fig[2, 1], title = "MGRVI")
ax4 = Mke.Axis(fig[2, 2], title = "SI")
viz!(ax1, color.geometry, color = color.RGB)
viz!(ax2, spec.geometry, color = spec.NDVI)
viz!(ax3, spec.geometry, color = spec.MGRVI)
viz!(ax4, spec.geometry, color = spec.SI)
fig
```

### Image segmentation

From visual inspection, we select the `"MGRVI"` index for image segmentation.
It creates a strong contrast between pixels that are inside and outside the
agricultural field. We use the `Map` transform again to create a binary
variable in terms of the 30th percentile of the spectral index:

```{julia}
q30 = quantile(mgrvi.MGRVI, 0.3)

isinside(x) = x < q30

binary = mgrvi |> Map("MGRVI" => isinside => "label")

binary |> viewer
```
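
Because the threshold is the 30th percentile, roughly 30% of the pixels should
receive the label `true`. A quick sanity check (a small sketch; the `label`
column is accessed as an ordinary vector of Booleans):

```{julia}
# sketch: fraction of pixels labeled as inside the field (expected ≈ 0.3)
count(binary.label) / length(binary.label)
```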

Additionally, we use the `ModeFilter` transform to eliminate small artifacts
that are not relevant for the stated objectives:

```{julia}
mask = binary |> ModeFilter()

mask |> viewer
```

The first objective has been achieved. The resulting labels can be used in
supervised learning tasks with the `Learn` transform and any subset of spectral
bands.
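
For instance, a supervised workflow could look like the following sketch. The
`Learn(train, model, features => target)` signature and the
`DecisionTreeClassifier` model are assumptions to be checked against the
framework documentation, and we simply reuse the labeled image as the training
geotable for illustration:

```{julia}
#| eval: false
# hypothetical sketch: fit a classifier on the labeled pixels and predict
# the "label" column from the spectral bands; the `Learn` signature and the
# `DecisionTreeClassifier` model are assumptions, not part of this chapter
gtb = [rgb n mask]   # bands "R", "G", "B", "N" plus the "label" column

model = DecisionTreeClassifier()

pred = gtb |> Learn(gtb, model, ["R", "G", "B", "N"] => "label")
```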

### Geometric measurements

For the second objective, we use the `Potrace` transform to extract polygonal areas
from the image based on the labels of the previous section:

```{julia}
potrace = mask |> Potrace("label")
```

We can visualize the region where the label is `true`:

```{julia}
region = potrace.geometry[findfirst(potrace.label)]

region |> viz
```

It is made of various polygons of different sizes. Our agricultural field is the
polygon with the largest area:

```{julia}
polys = parent(region)

field = argmax(area, polys)

field |> viz
```

It has an area of approximately

```{julia}
area(field) |> u"ha"
```

and a perimeter of approximately

```{julia}
perimeter(field) |> u"km"
```


## Summary

In this chapter, we illustrated an application of the framework in the agriculture
industry. Among other things, we learned how to

- Compute spectral indices to highlight geospatial objects of interest in satellite images.
- Extract objects from categorical labels to perform measurements of area and perimeter.

The tools presented here are useful in various other geostatistical applications involving
remote sensing data. As an advanced geospatial data scientist, you will need to integrate
data from different satellite missions to improve your understanding of the problem and to
propose innovative solutions that cannot be derived from a single data source.