{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "# SDM1 : Montane woodcreper - Gecomputation\n", "One of the first action in Species Distribution Model is to record presence/absence of the species and then search for raster or vector files that store information about environment variables, that define the ecological niche. We will first concentrate on the collection of the environment variables. " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Open the bash terminal, migrate in the directory, and open the jupter-notebook.ipynb\n", "\n", " cd /media/sf_LVM_shared/my_SE_data/exercise\n", " jupyter-notebook SDM1_MWood_gecomp.ipynb" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Crop enviromental layers" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "First select a dataset that we can use to define the overall extent of the specie - the study area.\n", "We will therefore crop geodata using the species range as an extent and increase of 3 degree in N-S and W-E direction." ] }, { "cell_type": "code", "execution_count": 1, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "-82 14 -59 -21\n", "Input file size is 5880, 8400\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "Input file size is 5880, 8400\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "Input file size is 5880, 8400\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "Input file size is 5880, 8400\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n" ] } ], "source": [ "%%bash\n", "cd /media/sf_LVM_shared/my_SE_data/exercise\n", "# test \n", "# define north south weast owest extent\n", "ulx=$(ogrinfo -al -so geodata/shp/cartodb-query.shp | grep Extent | awk '{ gsub(\"\\\\(\",\" \"); print int($2 -3)}')\n", "uly=$(ogrinfo -al -so geodata/shp/cartodb-query.shp | grep Extent | awk '{ gsub(\"\\\\)\",\" \"); print int($6 +3)}')\n", "lrx=$(ogrinfo -al -so geodata/shp/cartodb-query.shp | grep Extent | awk '{ gsub(\"\\\\(\",\" \"); print int($5 +3)}')\n", "lry=$(ogrinfo -al -so geodata/shp/cartodb-query.shp | grep Extent | awk '{ gsub(\"\\\\)\",\" \"); print int($3 -3)}')\n", "\n", "## Print the extent to confirm the lat long \n", "echo $ulx $uly $lrx $lry\n", "## croping\n", "gdal_translate -projwin $ulx $uly $lrx $lry -co COMPRESS=DEFLATE -co ZLEVEL=9 \\\n", "geodata/cloud/SA_intra.tif geodata/cloud/SA_intra_crop.tif\n", "\n", "gdal_translate -projwin $ulx $uly $lrx $lry -co COMPRESS=DEFLATE -co ZLEVEL=9 \\\n", "geodata/cloud/SA_meanannual.tif geodata/cloud/SA_meanannual_crop.tif\n", "\n", "gdal_translate -projwin $ulx $uly $lrx $lry -co COMPRESS=DEFLATE -co ZLEVEL=9 \\\n", "geodata/dem/SA_elevation_mn_GMTED2010_mn.tif geodata/dem/SA_elevation_mn_GMTED2010_mn_crop.tif\n", "\n", "gdal_translate -projwin $ulx $uly $lrx $lry -co COMPRESS=DEFLATE -co ZLEVEL=9 \\\n", "geodata/vegetation/SA_tree_mn_percentage_GFC2013.tif geodata/vegetation/SA_tree_mn_percentage_GFC2013_crop.tif\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Download climate layers" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Dowonload climate data from https://chelsa-climate.org/ . Using the /vsicurl/ option in gdal is possible to download only the raster extend of your study area.\n", "\n", "1) Download and process climate data (grade 20%)\n", "\n", "Proceed with the download also for tmin and prec. Build up the loop and also calculate mean and stdev. Consider that, pkstatprofile even giving the compression flag (-co COMPRESS=DEFLATE -co ZLEVEL=9) does not compress very well so insert a gdal_translate to perform the compression (and of course remove the intermediate files).\n", "\n" ] }, { "cell_type": "code", "execution_count": 2, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Input file size is 43200, 20880\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "Input file size is 43200, 20880\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "Input file size is 43200, 20880\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "Input file size is 43200, 20880\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "Input file size is 43200, 20880\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "Input file size is 43200, 20880\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "Input file size is 43200, 20880\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "Input file size is 43200, 20880\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "Input file size is 43200, 20880\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "Input file size is 43200, 20880\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "Input file size is 43200, 20880\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "Input file size is 43200, 20880\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "Input file size is 43200, 20880\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "Input file size is 43200, 20880\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "Input file size is 43200, 20880\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "Input file size is 43200, 20880\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "Input file size is 43200, 20880\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "Input file size is 43200, 20880\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "Input file size is 43200, 20880\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "Input file size is 43200, 20880\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "Input file size is 43200, 20880\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "Input file size is 43200, 20880\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "Input file size is 43200, 20880\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "Input file size is 43200, 20880\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n" ] } ], "source": [ "%%bash\n", "cd /media/sf_LVM_shared/my_SE_data/exercise\n", "# download temperature tmin and tmax from https://envicloud.wsl.ch/#/?prefix=chelsa%2Fchelsa_V1\n", "for var in tmin tmax ; do \n", "for MM in 01 02 03 04 05 06 07 08 09 10 11 12 ; do \n", "gdal_translate -projwin -82 14 -59 -21 -co COMPRESS=DEFLATE -co ZLEVEL=9 /vsicurl/https://os.zhdk.cloud.switch.ch/envicloud/chelsa/chelsa_V1/climatologies/${var}/CHELSA_${var}10_${MM}_1979-2013_V1.2_land.tif geodata/climate/CHELSA_${var}10_${MM}_1979-2013_V1.2_land_crop.tif \n", "done\n", "done" ] }, { "cell_type": "code", "execution_count": 1, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Input file size is 43200, 20880\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "Input file size is 43200, 20880\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "Input file size is 43200, 20880\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "Input file size is 43200, 20880\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "Input file size is 43200, 20880\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "Input file size is 43200, 20880\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "Input file size is 43200, 20880\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "Input file size is 43200, 20880\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "Input file size is 43200, 20880\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "Input file size is 43200, 20880\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "Input file size is 43200, 20880\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "Input file size is 43200, 20880\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n" ] } ], "source": [ "%%bash\n", "cd /media/sf_LVM_shared/my_SE_data/exercise\n", "# download precipitation from https://envicloud.wsl.ch/#/?prefix=chelsa%2Fchelsa_V1\n", "for MM in 01 02 03 04 05 06 07 08 09 10 11 12 ; do \n", "gdal_translate -projwin -82 14 -59 -21 -co COMPRESS=DEFLATE -co ZLEVEL=9 /vsicurl/https://os.zhdk.cloud.switch.ch/envicloud/chelsa/chelsa_V1/climatologies/prec/CHELSA_prec_${MM}_V1.2_land.tif geodata/climate/CHELSA_prec_${MM}_V1.2_land_crop.tif \n", "done" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Calculate annual mean and standard deviation with the monthly values." ] }, { "cell_type": "code", "execution_count": 5, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "Input file size is 2760, 4200\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "Input file size is 2760, 4200\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "Input file size is 2760, 4200\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "Input file size is 2760, 4200\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n" ] } ], "source": [ "%%bash\n", "cd /media/sf_LVM_shared/my_SE_data/exercise\n", "\n", "# calculate mean and stdev for tmax,\n", "for var in tmin tmax ; do\n", "gdalbuildvrt -overwrite -separate geodata/climate/CHELSA_${var}10_1979-2013_V1.2_land_crop.vrt geodata/climate/CHELSA_${var}10_*_1979-2013_V1.2_land_crop.tif\n", "\n", "pkstatprofile -co COMPRESS=DEFLATE -co ZLEVEL=9 -nodata -32768 -f mean -i geodata/climate/CHELSA_${var}10_1979-2013_V1.2_land_crop.vrt -o geodata/climate/CHELSA_${var}10_1979-2013_V1.2_land_crop_mean_tmp.tif \n", "pkstatprofile -co COMPRESS=DEFLATE -co ZLEVEL=9 -nodata -32768 -f stdev -i geodata/climate/CHELSA_${var}10_1979-2013_V1.2_land_crop.vrt -o geodata/climate/CHELSA_${var}10_1979-2013_V1.2_land_crop_stdev_tmp.tif\n", "\n", "gdal_translate -co COMPRESS=DEFLATE -co ZLEVEL=9 geodata/climate/CHELSA_${var}10_1979-2013_V1.2_land_crop_mean_tmp.tif geodata/climate/CHELSA_${var}10_1979-2013_V1.2_land_crop_mean.tif \n", "gdal_translate -co COMPRESS=DEFLATE -co ZLEVEL=9 geodata/climate/CHELSA_${var}10_1979-2013_V1.2_land_crop_stdev_tmp.tif geodata/climate/CHELSA_${var}10_1979-2013_V1.2_land_crop_stdev.tif \n", "rm -f geodata/climate/CHELSA_${var}10_1979-2013_V1.2_land_crop_stdev_tmp.tif geodata/climate/CHELSA_${var}10_1979-2013_V1.2_land_crop_mean_tmp.tif\n", "\n", "done" ] }, { "cell_type": "code", "execution_count": 6, "metadata": { "scrolled": true }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "Input file size is 2760, 4200\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "Input file size is 2760, 4200\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n" ] } ], "source": [ "%%bash\n", "cd /media/sf_LVM_shared/my_SE_data/exercise\n", "\n", "gdalbuildvrt -overwrite -separate geodata/climate/CHELSA_prec_V1.2_land_crop.vrt geodata/climate/CHELSA_prec_*V1.2_land_crop.tif\n", "\n", "pkstatprofile -co COMPRESS=DEFLATE -co ZLEVEL=9 -nodata -32768 -f mean -i geodata/climate/CHELSA_prec_V1.2_land_crop.vrt -o geodata/climate/CHELSA_prec_V1.2_land_crop_mean_tmp.tif \n", "pkstatprofile -co COMPRESS=DEFLATE -co ZLEVEL=9 -nodata -32768 -f stdev -i geodata/climate/CHELSA_prec_V1.2_land_crop.vrt -o geodata/climate/CHELSA_prec_V1.2_land_crop_stdev_tmp.tif \n", "\n", "gdal_translate -co COMPRESS=DEFLATE -co ZLEVEL=9 geodata/climate/CHELSA_prec_V1.2_land_crop_mean_tmp.tif geodata/climate/CHELSA_prec_V1.2_land_crop_mean.tif \n", "gdal_translate -co COMPRESS=DEFLATE -co ZLEVEL=9 geodata/climate/CHELSA_prec_V1.2_land_crop_stdev_tmp.tif geodata/climate/CHELSA_prec_V1.2_land_crop_stdev.tif \n", "\n", "rm geodata/climate/CHELSA_prec_V1.2_land_crop_mean_tmp.tif geodata/climate/CHELSA_prec_V1.2_land_crop_stdev_tmp.tif" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Building geomorphometric layers\n", "\n", "2) Geomorphometric layers (grade 20%)\n", "\n", "Using the GMTED2010 DEM (geodata/dem/SA_elevation_mn_GMTED2010_mn_crop.tif) calculate \"aspect\" \"slope\" \"Terrain Ruggedness Index\" using gdal.\n", "I have opt for the uploding of \"Multiple Attempts\", so in case you want upload another version of the homework now is possible - please do not send it by e-mail. " ] }, { "cell_type": "code", "execution_count": 10, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n" ] } ], "source": [ "%%bash\n", "cd /media/sf_LVM_shared/my_SE_data/exercise\n", "\n", "# To calculate the slope by gdaldem slope in lat long you need to setup corretly the scale flag. \n", "# The slope scale value is 111120 at the equator and 0 at the north/south pole. In order to get the slope correctly \n", "# you need first calculate the scale value in accordance to the latitudinal level of your dem \n", "# gdalinfo geodata/dem/SA_elevation_mn_GMTED2010_mn_crop.tif\n", "# Center ( -70.5000000, -3.5000000)\n", "# 111120 - (111120 /90 * 3.5) = 106798 \n", "\n", "gdaldem slope -co COMPRESS=DEFLATE -co ZLEVEL=9 -scale 106798 geodata/dem/SA_elevation_mn_GMTED2010_mn_crop.tif geodata/dem/SA_elevation_mn_GMTED2010_mn_crop_slope.tif\n", "gdaldem aspect -co COMPRESS=DEFLATE -co ZLEVEL=9 geodata/dem/SA_elevation_mn_GMTED2010_mn_crop.tif geodata/dem/SA_elevation_mn_GMTED2010_mn_crop_aspect.tif\n", "gdaldem TRI -co COMPRESS=DEFLATE -co ZLEVEL=9 geodata/dem/SA_elevation_mn_GMTED2010_mn_crop.tif geodata/dem/SA_elevation_mn_GMTED2010_mn_crop_tri.tif" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Masking\n", "Many of the layers that we have downloaded have the same pixel values for the sea and also for the land (e.g. percent of vegation = 0% is given to desert area and also to the see). We can use an uncilary layers (/media/sf_LVM_shared/my_SE_data/exercise/geodata/mask/msk_1km.tif) to mask all the other layers and give a common no-data value. " ] }, { "cell_type": "code", "execution_count": 8, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n", "0...10...20...30...40...50...60...70...80...90...100 - done.\n" ] } ], "source": [ "%%bash\n", "cd /media/sf_LVM_shared/my_SE_data/exercise\n", "\n", "pksetmask -ot Int16 -m geodata/mask/msk_1km.tif -msknodata 0 -nodata -9999 -i geodata/cloud/SA_meanannual_crop.tif -o geodata/cloud/SA_meanannual_crop_msk.tif\n", "pksetmask -ot Int16 -m geodata/mask/msk_1km.tif -msknodata 0 -nodata -9999 -i geodata/cloud/SA_intra_crop.tif -o geodata/cloud/SA_intra_crop_msk.tif\n", "\n", "pksetmask -ot Int16 -m geodata/mask/msk_1km.tif -msknodata 0 -nodata -9999 -i geodata/dem/SA_elevation_mn_GMTED2010_mn_crop.tif -o geodata/dem/SA_elevation_mn_GMTED2010_mn_crop_msk.tif\n", "pksetmask -ot Int16 -m geodata/mask/msk_1km.tif -msknodata 0 -nodata -9999 -i geodata/vegetation/SA_tree_mn_percentage_GFC2013_crop.tif -o geodata/vegetation/SA_tree_mn_percentage_GFC2013_crop_msk.tif" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [] } ], "metadata": { "kernelspec": { "display_name": "Python 3 (ipykernel)", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.8.10" } }, "nbformat": 4, "nbformat_minor": 4 }