Assignments Fall 2022

This assignment is compulsory and must be submitted by 17 November 2022, 12pm Nigeria time.

Please use this Jupyter notebook, adding code cells and text cells with your solutions. Save the file as assignment_fall2022_name_surname.ipynb and send it as an e-mail attachment to g.amatulli@spatial-ecology.net and Erin.Stearns@gatesfoundation.org.

The exercises are based on the lectures and materials covered during the course.

Enter your /media/sf_LVM_shared/my_SE_data/exercise/ directory, save your assignment_fall2022_name_surname.ipynb there, and solve the exercises by adding new code and text cells.

1st exercise

20% grading value

Using the ocean value of geodata/dem/SA_elevation_mn_GMTED2010_mn_msk.tif as a mask, mask out geodata/vegetation/SA_tree_mn_percentage_GFC2013.tif so that the output geodata/vegetation/SA_tree_mn_percentage_GFC2013_msk.tif has -1 as its ocean value.

[2]:
# check the value range of the mask image
! gdalinfo -mm geodata/dem/SA_elevation_mn_GMTED2010_mn_msk.tif
Driver: GTiff/GeoTIFF
Files: geodata/dem/SA_elevation_mn_GMTED2010_mn_msk.tif
Size is 5880, 8400
Coordinate System is:
GEOGCRS["WGS 84",
    DATUM["World Geodetic System 1984",
        ELLIPSOID["WGS 84",6378137,298.257223563,
            LENGTHUNIT["metre",1]]],
    PRIMEM["Greenwich",0,
        ANGLEUNIT["degree",0.0174532925199433]],
    CS[ellipsoidal,2],
        AXIS["geodetic latitude (Lat)",north,
            ORDER[1],
            ANGLEUNIT["degree",0.0174532925199433]],
        AXIS["geodetic longitude (Lon)",east,
            ORDER[2],
            ANGLEUNIT["degree",0.0174532925199433]],
    ID["EPSG",4326]]
Data axis to CRS axis mapping: 2,1
Origin = (-83.000000000000000,14.000000000000000)
Pixel Size = (0.008333333333333,-0.008333333333333)
Metadata:
  AREA_OR_POINT=Area
  TIFFTAG_DATETIME=2021:05:12 09:34:03
  TIFFTAG_DOCUMENTNAME=geodata/dem/SA_elevation_mn_GMTED2010_mn_msk.tif
  TIFFTAG_SOFTWARE=pktools 2.6.7.6 by Pieter Kempeneers
Image Structure Metadata:
  COMPRESSION=DEFLATE
  INTERLEAVE=BAND
Corner Coordinates:
Upper Left  ( -83.0000000,  14.0000000) ( 83d 0' 0.00"W, 14d 0' 0.00"N)
Lower Left  ( -83.0000000, -56.0000000) ( 83d 0' 0.00"W, 56d 0' 0.00"S)
Upper Right ( -34.0000000,  14.0000000) ( 34d 0' 0.00"W, 14d 0' 0.00"N)
Lower Right ( -34.0000000, -56.0000000) ( 34d 0' 0.00"W, 56d 0' 0.00"S)
Center      ( -58.5000000, -21.0000000) ( 58d30' 0.00"W, 21d 0' 0.00"S)
Band 1 Block=5880x1 Type=Int16, ColorInterp=Gray
    Computed Min/Max=-400.000,6736.000
  NoData Value=-9999
[1]:
# open the file to be sure that the ocean is labeled as -9999 no data
! /usr/bin/openev/bin/openev geodata/dem/SA_elevation_mn_GMTED2010_mn_msk.tif
Default software rendering mode (use -h if accelerated video card installed).
Loading tools from /usr/bin/openev/tools/Tool_Export.py
Loading tools from /usr/bin/openev/tools/Tool_ShapesGrid.py
Loading tools from /usr/bin/openev/tools/Tool_DriverList.py
[7]:
# check the value range of the input image
! gdalinfo -mm geodata/vegetation/SA_tree_mn_percentage_GFC2013.tif
Driver: GTiff/GeoTIFF
Files: geodata/vegetation/SA_tree_mn_percentage_GFC2013.tif
Size is 5880, 8400
Coordinate System is:
GEOGCRS["WGS 84",
    DATUM["World Geodetic System 1984",
        ELLIPSOID["WGS 84",6378137,298.257223563,
            LENGTHUNIT["metre",1]]],
    PRIMEM["Greenwich",0,
        ANGLEUNIT["degree",0.0174532925199433]],
    CS[ellipsoidal,2],
        AXIS["geodetic latitude (Lat)",north,
            ORDER[1],
            ANGLEUNIT["degree",0.0174532925199433]],
        AXIS["geodetic longitude (Lon)",east,
            ORDER[2],
            ANGLEUNIT["degree",0.0174532925199433]],
    ID["EPSG",4326]]
Data axis to CRS axis mapping: 2,1
Origin = (-83.000000000000000,14.000000000000000)
Pixel Size = (0.008333333333333,-0.008333333333333)
Metadata:
  AREA_OR_POINT=Area
  Author=giuseppe.amatulli@gmail.com using pktools
  Input dataset=Global Forest Change 2000-2012 (Hansen 2013)
  Input layer=Tree canopy cover for year 2000 (%)
  Offset=0
  Output=Mean of Tree cover (%)
  Scale=0.01
Image Structure Metadata:
  COMPRESSION=DEFLATE
  INTERLEAVE=BAND
Corner Coordinates:
Upper Left  ( -83.0000000,  14.0000000) ( 83d 0' 0.00"W, 14d 0' 0.00"N)
Lower Left  ( -83.0000000, -56.0000000) ( 83d 0' 0.00"W, 56d 0' 0.00"S)
Upper Right ( -34.0000000,  14.0000000) ( 34d 0' 0.00"W, 14d 0' 0.00"N)
Lower Right ( -34.0000000, -56.0000000) ( 34d 0' 0.00"W, 56d 0' 0.00"S)
Center      ( -58.5000000, -21.0000000) ( 58d30' 0.00"W, 21d 0' 0.00"S)
Band 1 Block=5880x1 Type=UInt16, ColorInterp=Gray
    Computed Min/Max=0.000,10000.000
Check which data type supports a range from -1 to 10000: Int16 covers it, whereas the input's UInt16 cannot store -1.
[15]:
%%bash
pksetmask -co COMPRESS=DEFLATE -co ZLEVEL=9 -ot Int16 \
-m  geodata/dem/SA_elevation_mn_GMTED2010_mn_msk.tif -msknodata -9999 -nodata -1 \
-i  geodata/vegetation/SA_tree_mn_percentage_GFC2013.tif -o geodata/vegetation/SA_tree_mn_percentage_GFC2013_msk.tif
0...10...20...30...40...50...60...70...80...90...100 - done.
[2]:
# open the file to be sure that the ocean is labeled as -1 no data
! /usr/bin/openev/bin/openev geodata/vegetation/SA_tree_mn_percentage_GFC2013_msk.tif
Default software rendering mode (use -h if accelerated video card installed).
Loading tools from /usr/bin/openev/tools/Tool_Export.py
Loading tools from /usr/bin/openev/tools/Tool_ShapesGrid.py
Loading tools from /usr/bin/openev/tools/Tool_DriverList.py

2nd exercise

20% grading value

Using the .tif files in geodata/landsat_ct/, search for a gdal command that is able to create a vector-tile file having a tile-polygon for each tile tif. Run the commands and visualize the vector-tile file in qgis.

[17]:
! ls geodata/landsat_ct/
071W_41N_med_B1.tif  072W_41N_med_B1.tif  073W_41N_med_B1.tif
071W_42N_med_B1.tif  072W_42N_med_B1.tif  073W_42N_med_B1.tif
072W_40N_med_B1.tif  073W_40N_med_B1.tif
[3]:
# open all the tiles
! /usr/bin/openev/bin/openev geodata/landsat_ct/*.tif
Default software rendering mode (use -h if accelerated video card installed).
Loading tools from /usr/bin/openev/tools/Tool_Export.py
Loading tools from /usr/bin/openev/tools/Tool_ShapesGrid.py
Loading tools from /usr/bin/openev/tools/Tool_DriverList.py

First solution This is the simplest and most elegant solution.

[7]:
%%bash
rm -f geodata/landsat_ct/alltif_tile.*
gdaltindex geodata/landsat_ct/alltif_tile.shp  geodata/landsat_ct/????_???_med_B1.tif   ### perfect 100%%
ls -l geodata/landsat_ct/
Creating new index file...
total 364
-rwxrwx--- 1 root vboxsf 26579 Oct 12 17:57 071W_41N_med_B1.tif
-rwxrwx--- 1 root vboxsf   361 Nov 24 10:21 071W_41N_med_B1.tif.aux.xml
-rwxrwx--- 1 root vboxsf 26877 Oct 12 17:57 071W_42N_med_B1.tif
-rwxrwx--- 1 root vboxsf   363 Nov 24 10:21 071W_42N_med_B1.tif.aux.xml
-rwxrwx--- 1 root vboxsf 27168 Oct 12 17:57 072W_40N_med_B1.tif
-rwxrwx--- 1 root vboxsf   361 Nov 24 10:21 072W_40N_med_B1.tif.aux.xml
-rwxrwx--- 1 root vboxsf 25833 Oct 12 17:57 072W_41N_med_B1.tif
-rwxrwx--- 1 root vboxsf   361 Nov 24 10:21 072W_41N_med_B1.tif.aux.xml
-rwxrwx--- 1 root vboxsf 24464 Oct 12 17:57 072W_42N_med_B1.tif
-rwxrwx--- 1 root vboxsf   361 Nov 24 10:21 072W_42N_med_B1.tif.aux.xml
-rwxrwx--- 1 root vboxsf 27736 Oct 12 17:57 073W_40N_med_B1.tif
-rwxrwx--- 1 root vboxsf   361 Nov 24 10:21 073W_40N_med_B1.tif.aux.xml
-rwxrwx--- 1 root vboxsf 26047 Oct 12 17:57 073W_41N_med_B1.tif
-rwxrwx--- 1 root vboxsf   362 Nov 24 10:21 073W_41N_med_B1.tif.aux.xml
-rwxrwx--- 1 root vboxsf 25851 Oct 12 17:57 073W_42N_med_B1.tif
-rwxrwx--- 1 root vboxsf   351 Nov 24 10:21 073W_42N_med_B1.tif.aux.xml
-rwxrwx--- 1 root vboxsf  2106 Nov 24  2022 alltif_tile.dbf
-rwxrwx--- 1 root vboxsf 98304 Nov 24 10:21 alltif_tile.gpkg
-rwxrwx--- 1 root vboxsf   145 Nov 24  2022 alltif_tile.prj
-rwxrwx--- 1 root vboxsf  1188 Nov 24  2022 alltif_tile.shp
-rwxrwx--- 1 root vboxsf   164 Nov 24  2022 alltif_tile.shx
[8]:
# open all the tif tiles and the vector-tile
! qgis geodata/landsat_ct/*.tif geodata/landsat_ct/alltif_tile.gpkg
        - 'VirtualXPath'        [XML Path Language - XPath]
        - 'VirtualXPath'        [XML Path Language - XPath]
        - 'VirtualXPath'        [XML Path Language - XPath]
        - 'VirtualXPath'        [XML Path Language - XPath]
        - 'VirtualXPath'        [XML Path Language - XPath]
        - 'VirtualXPath'        [XML Path Language - XPath]
        - 'VirtualXPath'        [XML Path Language - XPath]
        - 'VirtualXPath'        [XML Path Language - XPath]
        - 'VirtualXPath'        [XML Path Language - XPath]

Second solution

This solution is a bit more complex. Nevertheless, it is useful for seeing how different gdal/pktools commands can be combined to reach the same result.

[14]:
%%bash
# for each tile change all the pixel value to 1 or 2 or 3
n=1
for file in geodata/landsat_ct/????_???_med_B1.tif; do
filename=$(basename $file .tif)
pkgetmask -min -1 -max 999999999 -data $n -co COMPRESS=DEFLATE -co ZLEVEL=9 -ot Byte -i $file -o geodata/landsat_ct/${filename}_byte.tif
gdal_edit.py -a_nodata 0 geodata/landsat_ct/${filename}_byte.tif
n=$(expr $n + 1)
gdalinfo -mm geodata/landsat_ct/${filename}_byte.tif | grep Comp
done

# build a vrt in order to combine the created tif
gdalbuildvrt -overwrite   geodata/landsat_ct/all_byte.vrt   geodata/landsat_ct/*_byte.tif

# run polygonize
rm -f geodata/landsat_ct/alltif_tile_poligonize.gpkg
gdal_polygonize.py  geodata/landsat_ct/all_byte.vrt geodata/landsat_ct/alltif_tile_poligonize.gpkg
rm -f geodata/landsat_ct/all_byte.vrt geodata/landsat_ct/*_byte.tif
0...10...20...30...40...50...60...70...80...90...100 - done.
    Computed Min/Max=1.000,1.000
0...10...20...30...40...50...60...70...80...90...100 - done.
    Computed Min/Max=2.000,2.000
0...10...20...30...40...50...60...70...80...90...100 - done.
    Computed Min/Max=3.000,3.000
0...10...20...30...40...50...60...70...80...90...100 - done.
    Computed Min/Max=4.000,4.000
0...10...20...30...40...50...60...70...80...90...100 - done.
    Computed Min/Max=5.000,5.000
0...10...20...30...40...50...60...70...80...90...100 - done.
    Computed Min/Max=6.000,6.000
0...10...20...30...40...50...60...70...80...90...100 - done.
    Computed Min/Max=7.000,7.000
0...10...20...30...40...50...60...70...80...90...100 - done.
    Computed Min/Max=8.000,8.000
0...10...20...30...40...50...60...70...80...90...100 - done.
0...10...20...30...40...50...60...70...80...90...Creating output geodata/landsat_ct/alltif_tile_poligonize.gpkg of format GPKG.
100 - done.
[16]:
! qgis geodata/landsat_ct/????_???_med_B1.tif geodata/landsat_ct/alltif_tile_poligonize.gpkg
        - 'VirtualXPath'        [XML Path Language - XPath]
        - 'VirtualXPath'        [XML Path Language - XPath]
        - 'VirtualXPath'        [XML Path Language - XPath]
        - 'VirtualXPath'        [XML Path Language - XPath]
        - 'VirtualXPath'        [XML Path Language - XPath]
        - 'VirtualXPath'        [XML Path Language - XPath]
        - 'VirtualXPath'        [XML Path Language - XPath]
        - 'VirtualXPath'        [XML Path Language - XPath]
        - 'VirtualXPath'        [XML Path Language - XPath]
        - 'VirtualXPath'        [XML Path Language - XPath]

This solution also works, but you do not get the tile file names in the vector attribute table (gdaltindex stores them automatically in a location field).

3rd exercise

30% grading value

  • Open the file geodata/LST/LST_MOYDmax_month1.tif in qgis, create a vector file with 3 polygons, and save it in the LST folder. Any polygon shape is fine; just be sure the polygons lie “on top” of the raster.

  • Run a pktools command that extracts the zonal statistics for one .tif file and saves the results in the “GPKG -raster,vector- (rw+vs): GeoPackage” file format.

  • Build a for loop that extracts the zonal statistics for all the .tif files and saves them to GPKG.

[9]:
# open the tif file, draw the polygons, and save them as LST/polygons_zonal.gpkg
! qgis geodata/LST/LST_MOYDmax_month1.tif geodata/LST/polygons_zonal.gpkg
        - 'VirtualXPath'        [XML Path Language - XPath]
        - 'VirtualXPath'        [XML Path Language - XPath]
        - 'VirtualXPath'        [XML Path Language - XPath]
        - 'VirtualXPath'        [XML Path Language - XPath]
        - 'VirtualXPath'        [XML Path Language - XPath]
        - 'VirtualXPath'        [XML Path Language - XPath]
        - 'VirtualXPath'        [XML Path Language - XPath]
        - 'VirtualXPath'        [XML Path Language - XPath]
        - 'VirtualXPath'        [XML Path Language - XPath]
        - 'VirtualXPath'        [XML Path Language - XPath]
[29]:
! rm -f geodata/LST/polygons_zonal_month1.gpkg
! pkextractogr  -f "GPKG" -srcnodata -9999 -r mean \
-i geodata/LST/LST_MOYDmax_month1.tif -s  geodata/LST/polygons_zonal.gpkg -o geodata/LST/polygons_zonal_month1.gpkg
        - 'VirtualXPath'        [XML Path Language - XPath]
        - 'VirtualXPath'        [XML Path Language - XPath]
processing layer polygons_zonal
0...10...20...30...40...50...60...70...80...90...100 - done.
[30]:
# check the results
! ogrinfo -al -geom=NO geodata/LST/polygons_zonal_month1.gpkg
        - 'VirtualXPath'        [XML Path Language - XPath]
INFO: Open of `geodata/LST/polygons_zonal_month1.gpkg'
      using driver `GPKG' successful.

Layer name: polygons_zonal
Geometry: Polygon
Feature Count: 3
Extent: (28.159091, 0.336869) - (33.182828, 3.912879)
Layer SRS WKT:
GEOGCRS["WGS 84",
    DATUM["World Geodetic System 1984",
        ELLIPSOID["WGS 84",6378137,298.257223563,
            LENGTHUNIT["metre",1]]],
    PRIMEM["Greenwich",0,
        ANGLEUNIT["degree",0.0174532925199433]],
    CS[ellipsoidal,2],
        AXIS["geodetic latitude (Lat)",north,
            ORDER[1],
            ANGLEUNIT["degree",0.0174532925199433]],
        AXIS["geodetic longitude (Lon)",east,
            ORDER[2],
            ANGLEUNIT["degree",0.0174532925199433]],
    USAGE[
        SCOPE["Horizontal component of 3D system."],
        AREA["World."],
        BBOX[-90,-180,90,180]],
    ID["EPSG",4326]]
Data axis to CRS axis mapping: 2,1
FID Column = fid
Geometry Column = geom
ID: Integer (0.0)
mean: Real (0.0)
OGRFeature(polygons_zonal):1
  ID (Integer) = 2
  mean (Real) = 32.2190872990031

OGRFeature(polygons_zonal):2
  ID (Integer) = 1
  mean (Real) = 37.5875141960014

OGRFeature(polygons_zonal):3
  ID (Integer) = 3
  mean (Real) = 27.8825487457743

[31]:
# create a vrt and use it for the extraction
! gdalbuildvrt -overwrite -separate geodata/LST/LST_MOYDmax_allmonth.vrt geodata/LST/LST_MOYDmax_month?.tif geodata/LST/LST_MOYDmax_month??.tif
! rm -f geodata/LST/polygons_zonal_allmonth.gpkg
! pkextractogr  -f "GPKG" -srcnodata -9999 -r mean \
-i geodata/LST/LST_MOYDmax_allmonth.vrt -s  geodata/LST/polygons_zonal.gpkg  -o geodata/LST/polygons_zonal_allmonth.gpkg

0...10...20...30...40...50...60...70...80...90...100 - done.
        - 'VirtualXPath'        [XML Path Language - XPath]
        - 'VirtualXPath'        [XML Path Language - XPath]
processing layer polygons_zonal
0...10...20...30...40...50...60...70...80...90...100 - done.
[32]:
# check the results
! ogrinfo -al -geom=NO geodata/LST/polygons_zonal_allmonth.gpkg
        - 'VirtualXPath'        [XML Path Language - XPath]
INFO: Open of `geodata/LST/polygons_zonal_allmonth.gpkg'
      using driver `GPKG' successful.

Layer name: polygons_zonal
Geometry: Polygon
Feature Count: 3
Extent: (28.159091, 0.336869) - (33.182828, 3.912879)
Layer SRS WKT:
GEOGCRS["WGS 84",
    DATUM["World Geodetic System 1984",
        ELLIPSOID["WGS 84",6378137,298.257223563,
            LENGTHUNIT["metre",1]]],
    PRIMEM["Greenwich",0,
        ANGLEUNIT["degree",0.0174532925199433]],
    CS[ellipsoidal,2],
        AXIS["geodetic latitude (Lat)",north,
            ORDER[1],
            ANGLEUNIT["degree",0.0174532925199433]],
        AXIS["geodetic longitude (Lon)",east,
            ORDER[2],
            ANGLEUNIT["degree",0.0174532925199433]],
    USAGE[
        SCOPE["Horizontal component of 3D system."],
        AREA["World."],
        BBOX[-90,-180,90,180]],
    ID["EPSG",4326]]
Data axis to CRS axis mapping: 2,1
FID Column = fid
Geometry Column = geom
ID: Integer (0.0)
b0: Real (0.0)
b1: Real (0.0)
b2: Real (0.0)
b3: Real (0.0)
b4: Real (0.0)
b5: Real (0.0)
b6: Real (0.0)
b7: Real (0.0)
b8: Real (0.0)
b9: Real (0.0)
b10: Real (0.0)
b11: Real (0.0)
OGRFeature(polygons_zonal):1
  ID (Integer) = 2
  b0 (Real) = 32.2190872990031
  b1 (Real) = 34.2584177655869
  b2 (Real) = 33.1255212012574
  b3 (Real) = 29.7562015823403
  b4 (Real) = 27.3043843524003
  b5 (Real) = 26.0022204431177
  b6 (Real) = 25.9803342467279
  b7 (Real) = 26.0892614446006
  b8 (Real) = 26.5121187622381
  b9 (Real) = 26.9326012949042
  b10 (Real) = 27.8735073791314
  b11 (Real) = 29.8276078033663

OGRFeature(polygons_zonal):2
  ID (Integer) = 1
  b0 (Real) = 37.5875141960014
  b1 (Real) = 41.1516169535228
  b2 (Real) = 38.9246022332857
  b3 (Real) = 32.0599748302381
  b4 (Real) = 28.703743676277
  b5 (Real) = 27.9588523079267
  b6 (Real) = 27.9912325730149
  b7 (Real) = 28.0036972196164
  b8 (Real) = 29.0760217569295
  b9 (Real) = 28.6392086292018
  b10 (Real) = 29.1736903942795
  b11 (Real) = 32.6712335134043

OGRFeature(polygons_zonal):3
  ID (Integer) = 3
  b0 (Real) = 27.8825487457743
  b1 (Real) = 29.5533600442245
  b2 (Real) = 29.5085259041395
  b3 (Real) = 28.1868469851987
  b4 (Real) = 26.6657154694727
  b5 (Real) = 25.5369423150765
  b6 (Real) = 25.4000528157838
  b7 (Real) = 25.7769232047144
  b8 (Real) = 26.3030576657987
  b9 (Real) = 26.699380307982
  b10 (Real) = 26.3889540332556
  b11 (Real) = 26.7058148356776

This exercise can also be solved by building a for loop and extracting the zonal statistics for each file:

for file in geodata/LST/LST_MOYDmax_month?.tif geodata/LST/LST_MOYDmax_month??.tif ; do
filename=$(basename $file .tif)
rm -f geodata/LST/${filename}_zonal.gpkg
pkextractogr  -f "GPKG" -srcnodata -9999 -r mean \
-i $file -s  geodata/LST/polygons_zonal.gpkg -o geodata/LST/${filename}_zonal.gpkg
done

However, this is not the most efficient approach because it creates several files; with the vrt option you concentrate the information in a single file.

4th exercise

30% grading value

Using geodata/shp/Lepidocolaptes_lacrymiger_allpoints.csv, extract at each location the pixel values from ./geodata/cloud/SA_intra.tif, ./geodata/cloud/SA_meanannual.tif, ./geodata/dem/SA_elevation_mn_GMTED2010_mn.tif, and ./geodata/vegetation/SA_tree_mn_percentage_GFC2013.tif. The final result should be a txt file structured as follows:

Long Lat SA_intra SA_meanannual SA_elevation SA_tree
-76.18925 3.98125 9 200 97 3 …….
[50]:
! head  geodata/shp/Lepidocolaptes_lacrymiger_allpoints.csv
[25]:
# select only the lon and lat information
! awk -F , '{ if(NR>1) print $1,$2}'  geodata/shp/Lepidocolaptes_lacrymiger_allpoints.csv > geodata/shp/Lepidocolaptes_lacrymiger_allpoints_x_y.txt
! head geodata/shp/Lepidocolaptes_lacrymiger_allpoints_x_y.txt
-76.18925 3.98125
-76.18406 3.93442
-74.30256 4.60675
-74.30256 4.60675
-76.10394 4.74631
-76.13861 4.74536
-76.13919 4.7445
-76.10431 4.72342
-78.70159 -0.06506
-77.89315 -0.46052
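The awk command above drops the header line (NR>1) and prints only the first two comma-separated fields. A toy demonstration of the same idiom, on a made-up CSV file (the file name and values are illustrative, not part of the dataset):

```shell
# toy CSV with the same layout: a header line, then comma-separated lon,lat fields
printf 'longitude,latitude\n-76.18925,3.98125\n-76.18406,3.93442\n' > /tmp/points.csv
# skip the header (NR>1) and print fields 1 and 2 separated by a space
awk -F , '{ if (NR>1) print $1, $2 }' /tmp/points.csv
# prints:
# -76.18925 3.98125
# -76.18406 3.93442
```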

At this point you have a few options: use the vrt as input, or perform a for loop over the single files.

First option The first option is the most efficient: everything is kept in memory and awk takes care of the reformatting.
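The awk one-liner 'ORS=NR%4?FS:RS' used below reshapes a single column into rows of 4: while the record number is not a multiple of 4 the output record separator is a space (FS), and on every 4th record it becomes a newline (RS). A self-contained toy demonstration with dummy numbers:

```shell
# reshape 8 one-per-line values into rows of 4:
# ORS is set per record; NR%4 != 0 -> FS (space), NR%4 == 0 -> RS (newline)
printf '%s\n' 1 2 3 4 5 6 7 8 | awk 'ORS=NR%4?FS:RS'
# prints:
# 1 2 3 4
# 5 6 7 8
```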

[53]:
# build a vrt having one tif for each band of the vrt using the -separate option
! gdalbuildvrt -overwrite -separate  ./geodata/all_tif.vrt   ./geodata/cloud/SA_intra.tif ./geodata/cloud/SA_meanannual.tif ./geodata/dem/SA_elevation_mn_GMTED2010_mn.tif ./geodata/vegetation/SA_tree_mn_percentage_GFC2013.tif
0...10...20...30...40...50...60...70...80...90...100 - done.
[72]:
%%bash
# extract the value at each point, then reshape from 1 column to 4 columns using awk 'ORS=NR%4?FS:RS'.
# The number 4 is the number of bands in the vrt
gdallocationinfo -geoloc -wgs84 -valonly ./geodata/all_tif.vrt < geodata/shp/Lepidocolaptes_lacrymiger_allpoints_x_y.txt \
| awk 'ORS=NR%4?FS:RS' >  geodata/shp/Lepidocolaptes_lacrymiger_allpoints_extract.txt
head geodata/shp/Lepidocolaptes_lacrymiger_allpoints_extract.txt
223 8889 1253 6119
217 9337 1593 8371
85 9762 2574 7015
85 9762 2574 7015
392 8132 1705 8939
305 8571 1744 7383
305 8571 1744 7383
373 8599 1750 8104
191 9701 1669 9513
275 9267 1835 4400
[67]:
! echo "Long Lat SA_intra SA_meanannual SA_elevation SA_tree" > geodata/shp/Lepidocolaptes_lacrymiger_allpoints_x_y_extract.txt
! paste -d " " geodata/shp/Lepidocolaptes_lacrymiger_allpoints_x_y.txt geodata/shp/Lepidocolaptes_lacrymiger_allpoints_extract.txt >> geodata/shp/Lepidocolaptes_lacrymiger_allpoints_x_y_extract.txt
! head geodata/shp/Lepidocolaptes_lacrymiger_allpoints_x_y_extract.txt
Long Lat SA_intra SA_meanannual SA_elevation SA_tree
-76.18925 3.98125 223 8889 1253 6119
-76.18406 3.93442 217 9337 1593 8371
-74.30256 4.60675 85 9762 2574 7015
-74.30256 4.60675 85 9762 2574 7015
-76.10394 4.74631 392 8132 1705 8939
-76.13861 4.74536 305 8571 1744 7383
-76.13919 4.7445 305 8571 1744 7383
-76.10431 4.72342 373 8599 1750 8104
-78.70159 -0.06506 191 9701 1669 9513

Second option

[95]:
%%bash
for file in ./geodata/cloud/SA_intra.tif ./geodata/cloud/SA_meanannual.tif ./geodata/dem/SA_elevation_mn_GMTED2010_mn.tif ./geodata/vegetation/SA_tree_mn_percentage_GFC2013.tif ; do
filename=$(basename $file .tif)
gdallocationinfo -geoloc -wgs84 -valonly $file < geodata/shp/Lepidocolaptes_lacrymiger_allpoints_x_y.txt > geodata/shp/$filename.txt
done

At this point it is important to check that the files have the same number of lines and that there are no empty lines

[11]:
%%bash
head geodata/shp/SA_intra.txt
wc -l  geodata/shp/{SA_intra,SA_meanannual,SA_elevation_mn_GMTED2010_mn,SA_tree_mn_percentage_GFC2013}.txt
223
217
85
85
392
305
305
373
191
275
 3438 geodata/shp/SA_intra.txt
 3438 geodata/shp/SA_meanannual.txt
 3438 geodata/shp/SA_elevation_mn_GMTED2010_mn.txt
 3438 geodata/shp/SA_tree_mn_percentage_GFC2013.txt
13752 total
[97]:
! awk '{  if(NF==1) print  NF }' geodata/shp/SA_intra.txt | wc -l
! awk '{  if(NF==1) print  NF }' geodata/shp/SA_meanannual.txt | wc -l
! awk '{  if(NF==1) print  NF }' geodata/shp/SA_elevation_mn_GMTED2010_mn.txt | wc -l
! awk '{  if(NF==1) print  NF }' geodata/shp/SA_tree_mn_percentage_GFC2013.txt | wc -l
3438
3438
3438
3438
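The NF==1 counts above should match the total line count; a complementary check is to flag any empty line explicitly. A sketch on a toy file (the file name and values are made up):

```shell
# toy value file where the second line is empty
printf '223\n\n85\n' > /tmp/sample.txt
# report the line number of every line with zero fields
awk 'NF==0 { print "empty line at " NR }' /tmp/sample.txt
# prints: empty line at 2
```

If this prints nothing for each of the four extraction files, the paste in the next cell is safe.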
[98]:
%%bash
echo "Long Lat SA_intra SA_meanannual SA_elevation SA_tree" > geodata/shp/Lepidocolaptes_lacrymiger_allpoints_x_y_extract.txt
paste -d " " geodata/shp/Lepidocolaptes_lacrymiger_allpoints_x_y.txt geodata/shp/{SA_intra,SA_meanannual,SA_elevation_mn_GMTED2010_mn,SA_tree_mn_percentage_GFC2013}.txt >> geodata/shp/Lepidocolaptes_lacrymiger_allpoints_x_y_extract.txt
head geodata/shp/Lepidocolaptes_lacrymiger_allpoints_x_y_extract.txt
Long Lat SA_intra SA_meanannual SA_elevation SA_tree
-76.18925 3.98125 223 8889 1253 6119
-76.18406 3.93442 217 9337 1593 8371
-74.30256 4.60675 85 9762 2574 7015
-74.30256 4.60675 85 9762 2574 7015
-76.10394 4.74631 392 8132 1705 8939
-76.13861 4.74536 305 8571 1744 7383
-76.13919 4.7445 305 8571 1744 7383
-76.10431 4.72342 373 8599 1750 8104
-78.70159 -0.06506 191 9701 1669 9513

Third solution This solution is also very effective: each gdallocationinfo runs on the fly inside a process substitution, its output stays in memory, and the paste command merges the streams.

[28]:
%%bash
echo "Long Lat SA_intra SA_meanannual SA_elevation SA_tree" > geodata/shp/Lepidocolaptes_lacrymiger_allpoints_x_y_extract.txt
paste -d " " geodata/shp/Lepidocolaptes_lacrymiger_allpoints_x_y.txt <(gdallocationinfo -geoloc -wgs84 -valonly ./geodata/cloud/SA_intra.tif < geodata/shp/Lepidocolaptes_lacrymiger_allpoints_x_y.txt) \
<(gdallocationinfo -geoloc -wgs84 -valonly ./geodata/cloud/SA_meanannual.tif < geodata/shp/Lepidocolaptes_lacrymiger_allpoints_x_y.txt) \
<(gdallocationinfo -geoloc -wgs84 -valonly ./geodata/dem/SA_elevation_mn_GMTED2010_mn.tif < geodata/shp/Lepidocolaptes_lacrymiger_allpoints_x_y.txt) \
<(gdallocationinfo -geoloc -wgs84 -valonly ./geodata/vegetation/SA_tree_mn_percentage_GFC2013.tif < geodata/shp/Lepidocolaptes_lacrymiger_allpoints_x_y.txt) \
>> geodata/shp/Lepidocolaptes_lacrymiger_allpoints_x_y_extract.txt
head geodata/shp/Lepidocolaptes_lacrymiger_allpoints_x_y_extract.txt
Long Lat SA_intra SA_meanannual SA_elevation SA_tree
-76.18925 3.98125 223 8889 1253 6119
-76.18406 3.93442 217 9337 1593 8371
-74.30256 4.60675 85 9762 2574 7015
-74.30256 4.60675 85 9762 2574 7015
-76.10394 4.74631 392 8132 1705 8939
-76.13861 4.74536 305 8571 1744 7383
-76.13919 4.7445 305 8571 1744 7383
-76.10431 4.72342 373 8599 1750 8104
-78.70159 -0.06506 191 9701 1669 9513