
Notes on ibm-granite/granite-geospatial-biomass (Part 2)

Posted at 2025-01-04

I ran ibm-granite/granite-geospatial-biomass, up through the "Zero-shot for all biomes" step, on HLS Sentinel-2 Multi-spectral Instrument Surface Reflectance Daily Global 30m data around Kagoshima (hereafter, HLS data). Notes below.

Data preparation

The HLS data was downloaded from Earthdata Search.
For this run I selected HLS.S30.T52RFV.2021036T015851, which covers the Kagoshima area with 0-10% cloud cover (masking the clouds would probably be better; a sketch follows the commands below).
(Screenshot: Earthdata Search download page)
The following commands convert it into the same format as the ibm-granite/granite-geospatial-biomass sample data.

mkdir data/tmp
mkdir data/tile

# List the per-band GeoTIFFs in a stable (version-sorted) order
ls ./data/HLS.S30.T52RFV.2021036T015851.v2.0.*.tif | sort -V > data/tmp/HLS.S30.T52RFV.2021036T015851.v2.0.txt
cat data/tmp/HLS.S30.T52RFV.2021036T015851.v2.0.txt

# Stack the single-band files into one multi-band GeoTIFF (nodata = -9999)
gdal_merge.py -o data/tmp/HLS.S30.T52RFV.2021036T015851.v2.0.tif -a_nodata -9999 -separate `cat data/tmp/HLS.S30.T52RFV.2021036T015851.v2.0.txt`
gdalinfo data/tmp/HLS.S30.T52RFV.2021036T015851.v2.0.tif

# Set the CRS explicitly, reading the EPSG code from the HLS metadata
EPSG=`gdalinfo ./data/HLS.S30.T52RFV.2021036T015851.v2.0.B01.tif | grep HORIZONTAL_CS_CODE | sed 's/.*EPSG:\([0-9]*\)/\1/'`
echo ${EPSG}
gdal_edit.py -a_srs EPSG:${EPSG} data/tmp/HLS.S30.T52RFV.2021036T015851.v2.0.tif
gdalinfo data/tmp/HLS.S30.T52RFV.2021036T015851.v2.0.tif

# Convert the pixel type to Float32
gdal_translate -ot Float32 data/tmp/HLS.S30.T52RFV.2021036T015851.v2.0.tif data/tmp/HLS.S30.T52RFV.2021036T015851.v2.0.Float32.tif
gdalinfo data/tmp/HLS.S30.T52RFV.2021036T015851.v2.0.Float32.tif

# Cut the scene into 226x226-pixel tiles
gdal_retile.py -ps 226 226 -targetDir data/tile data/tmp/HLS.S30.T52RFV.2021036T015851.v2.0.Float32.tif
gdalinfo data/tile/HLS.S30.T52RFV.2021036T015851.v2.0.Float32_01_09.tif

# Copy the tile and the band-order list to Google Drive for use from Colab
mkdir ~/Google\ Drive/マイドライブ/HLSL30
cp -irp data/tile/HLS.S30.T52RFV.2021036T015851.v2.0.Float32_01_09.tif ~/Google\ Drive/マイドライブ/HLSL30/
cp -irp data/tmp/HLS.S30.T52RFV.2021036T015851.v2.0.txt ~/Google\ Drive/マイドライブ/HLSL30/
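As noted above, clouds are not masked here. A minimal sketch of what Fmask-based masking could look like, applied to the merged stack before retiling; it assumes rasterio is available and the usual HLS v2.0 Fmask bit layout (bit 1 = cloud, bit 3 = cloud shadow), so check the HLS user guide before relying on it.

import numpy as np
import rasterio

src_path = "data/tmp/HLS.S30.T52RFV.2021036T015851.v2.0.Float32.tif"
dst_path = "data/tmp/HLS.S30.T52RFV.2021036T015851.v2.0.Float32.masked.tif"
FMASK_BAND = 14   # Fmask is the 14th band in the stacking order shown below
NODATA = -9999.0

with rasterio.open(src_path) as src:
    data = src.read()                              # (bands, rows, cols)
    profile = src.profile
fmask = data[FMASK_BAND - 1].astype(np.uint8)

# Flag pixels where the cloud (bit 1) or cloud-shadow (bit 3) bit is set
bad = (((fmask >> 1) & 1) | ((fmask >> 3) & 1)).astype(bool)
data[:, bad] = NODATA                              # push them to nodata in every band

with rasterio.open(dst_path, "w", **profile) as dst:
    dst.write(data)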

See below for the order in which the bands are stacked.

$ cat data/tmp/HLS.S30.T52RFV.2021036T015851.v2.0.txt
./data/HLS.S30.T52RFV.2021036T015851.v2.0.B01.tif
./data/HLS.S30.T52RFV.2021036T015851.v2.0.B02.tif
./data/HLS.S30.T52RFV.2021036T015851.v2.0.B03.tif
./data/HLS.S30.T52RFV.2021036T015851.v2.0.B04.tif
./data/HLS.S30.T52RFV.2021036T015851.v2.0.B05.tif
./data/HLS.S30.T52RFV.2021036T015851.v2.0.B06.tif
./data/HLS.S30.T52RFV.2021036T015851.v2.0.B07.tif
./data/HLS.S30.T52RFV.2021036T015851.v2.0.B08.tif
./data/HLS.S30.T52RFV.2021036T015851.v2.0.B09.tif
./data/HLS.S30.T52RFV.2021036T015851.v2.0.B10.tif
./data/HLS.S30.T52RFV.2021036T015851.v2.0.B11.tif
./data/HLS.S30.T52RFV.2021036T015851.v2.0.B12.tif
./data/HLS.S30.T52RFV.2021036T015851.v2.0.B8A.tif
./data/HLS.S30.T52RFV.2021036T015851.v2.0.Fmask.tif
./data/HLS.S30.T52RFV.2021036T015851.v2.0.SAA.tif
./data/HLS.S30.T52RFV.2021036T015851.v2.0.SZA.tif
./data/HLS.S30.T52RFV.2021036T015851.v2.0.VAA.tif
./data/HLS.S30.T52RFV.2021036T015851.v2.0.VZA.tif
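Since the model config further down needs to know which stacked band is which, the same txt file can also be turned into the band list programmatically. A small sketch, assuming the file-name suffix (B02, Fmask, ...) is enough to identify each band; the suffix-to-name mapping follows the HLS S30 band table.

# Sketch: derive the band list for the model from the band-order txt file.
HLS_S30_TO_MODEL = {
    "B02": "BLUE",
    "B03": "GREEN",
    "B04": "RED",
    "B8A": "NIR_NARROW",
    "B11": "SWIR_1",
    "B12": "SWIR_2",
}

bands = []
with open("data/tmp/HLS.S30.T52RFV.2021036T015851.v2.0.txt") as f:
    for line in f:
        # e.g. ./data/HLS.S30.T52RFV.2021036T015851.v2.0.B02.tif -> "B02"
        suffix = line.strip().rsplit(".", 2)[-2]
        bands.append(HLS_S30_TO_MODEL.get(suffix, "-1"))

print(bands)       # should match predict_dataset_bands defined further below
print(len(bands))  # 18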

I use HLS.S30.T52RFV.2021036T015851.v2.0.Float32_01_09.tif, the tile covering the area around Sakurajima.
(Figure 1: the selected tile around Sakurajima)

The specification of HLS.S30.T52RFV.2021036T015851.v2.0.Float32_01_09.tif is shown below; it is almost the same as the data used by ibm-granite/granite-geospatial-biomass.

$ gdalinfo ./data/tile/HLS.S30.T52RFV.2021036T015851.v2.0.Float32_01_09.tif
Driver: GTiff/GeoTIFF
Files: ./data/tile/HLS.S30.T52RFV.2021036T015851.v2.0.Float32_01_09.tif
Size is 226, 226
Coordinate System is:
PROJCRS["WGS 84 / UTM zone 52N",
    BASEGEOGCRS["WGS 84",
        DATUM["World Geodetic System 1984",
            ELLIPSOID["WGS 84",6378137,298.257223563,
                LENGTHUNIT["metre",1]]],
        PRIMEM["Greenwich",0,
            ANGLEUNIT["degree",0.0174532925199433]],
        ID["EPSG",4326]],
    CONVERSION["UTM zone 52N",
        METHOD["Transverse Mercator",
            ID["EPSG",9807]],
        PARAMETER["Latitude of natural origin",0,
            ANGLEUNIT["degree",0.0174532925199433],
            ID["EPSG",8801]],
        PARAMETER["Longitude of natural origin",129,
            ANGLEUNIT["degree",0.0174532925199433],
            ID["EPSG",8802]],
        PARAMETER["Scale factor at natural origin",0.9996,
            SCALEUNIT["unity",1],
            ID["EPSG",8805]],
        PARAMETER["False easting",500000,
            LENGTHUNIT["metre",1],
            ID["EPSG",8806]],
        PARAMETER["False northing",0,
            LENGTHUNIT["metre",1],
            ID["EPSG",8807]]],
    CS[Cartesian,2],
        AXIS["(E)",east,
            ORDER[1],
            LENGTHUNIT["metre",1]],
        AXIS["(N)",north,
            ORDER[2],
            LENGTHUNIT["metre",1]],
    USAGE[
        SCOPE["Navigation and medium accuracy spatial referencing."],
        AREA["Between 126°E and 132°E, northern hemisphere between equator and 84°N, onshore and offshore. China. Indonesia. Japan. North Korea. Russian Federation. South Korea."],
        BBOX[0,126,84,132]],
    ID["EPSG",32652]]
Data axis to CRS axis mapping: 1,2
Origin = (654240.000000000000000,3500040.000000000000000)
Pixel Size = (30.000000000000000,-30.000000000000000)
Metadata:
  AREA_OR_POINT=Area
Image Structure Metadata:
  INTERLEAVE=PIXEL
Corner Coordinates:
Upper Left  (  654240.000, 3500040.000) (130d37'34.54"E, 31d37'30.68"N)
Lower Left  (  654240.000, 3493260.000) (130d37'30.71"E, 31d33'50.55"N)
Upper Right (  661020.000, 3500040.000) (130d41'51.80"E, 31d37'27.33"N)
Lower Right (  661020.000, 3493260.000) (130d41'47.80"E, 31d33'47.21"N)
Center      (  657630.000, 3496650.000) (130d39'41.21"E, 31d35'38.96"N)
Band 1 Block=226x1 Type=Float32, ColorInterp=Gray
  NoData Value=-9999
Band 2 Block=226x1 Type=Float32, ColorInterp=Undefined
  NoData Value=-9999
Band 3 Block=226x1 Type=Float32, ColorInterp=Undefined
  NoData Value=-9999
Band 4 Block=226x1 Type=Float32, ColorInterp=Undefined
  NoData Value=-9999
Band 5 Block=226x1 Type=Float32, ColorInterp=Undefined
  NoData Value=-9999
Band 6 Block=226x1 Type=Float32, ColorInterp=Undefined
  NoData Value=-9999
Band 7 Block=226x1 Type=Float32, ColorInterp=Undefined
  NoData Value=-9999
Band 8 Block=226x1 Type=Float32, ColorInterp=Undefined
  NoData Value=-9999
Band 9 Block=226x1 Type=Float32, ColorInterp=Undefined
  NoData Value=-9999
Band 10 Block=226x1 Type=Float32, ColorInterp=Undefined
  NoData Value=-9999
Band 11 Block=226x1 Type=Float32, ColorInterp=Undefined
  NoData Value=-9999
Band 12 Block=226x1 Type=Float32, ColorInterp=Undefined
  NoData Value=-9999
Band 13 Block=226x1 Type=Float32, ColorInterp=Undefined
  NoData Value=-9999
Band 14 Block=226x1 Type=Float32, ColorInterp=Undefined
  NoData Value=-9999
Band 15 Block=226x1 Type=Float32, ColorInterp=Undefined
  NoData Value=-9999
Band 16 Block=226x1 Type=Float32, ColorInterp=Undefined
  NoData Value=-9999
Band 17 Block=226x1 Type=Float32, ColorInterp=Undefined
  NoData Value=-9999
Band 18 Block=226x1 Type=Float32, ColorInterp=Undefined
  NoData Value=-9999
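As a quick cross-check of the points that matter to the model (226x226 pixels, 18 Float32 bands, nodata -9999, EPSG:32652), a small sketch with rasterio; it only mirrors the gdalinfo output above.

import rasterio

tile = "data/tile/HLS.S30.T52RFV.2021036T015851.v2.0.Float32_01_09.tif"
with rasterio.open(tile) as src:
    assert (src.width, src.height) == (226, 226)
    assert src.count == 18
    assert all(dt == "float32" for dt in src.dtypes)
    assert all(nd == -9999 for nd in src.nodatas)
    print(src.crs)   # EPSG:32652 (WGS 84 / UTM zone 52N)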

Run ibm-granite/granite-geospatial-biomass on HLS.S30.T52RFV.2021036T015851.v2.0.Float32_01_09.tif.

# -*- coding: utf-8 -*-
# Commented out IPython magic to ensure Python compatibility.
# %load_ext autoreload
# %autoreload 2

from google.colab import drive
drive.mount('/content/drive')

# Clone the ibm-granite/granite-geospatial-biomass GitHub
!git clone https://github.com/ibm-granite/granite-geospatial-biomass.git

# Commented out IPython magic to ensure Python compatibility.
# Change directory. Move inside the tsfm repo.
# %cd granite-geospatial-biomass/

# Install the package
!pip install -e .

# Commented out IPython magic to ensure Python compatibility.
# Change directory. Standardize so below paths work with local notebook and colab.
# %cd notebooks

# Download dataset from Zenodo
#dataset_path = '../granite-geospatial-biomass-datasets.tar.gz'
#!wget "https://zenodo.org/records/12356481/files/granite-geospatial-biomass-datasets.tar.gz?download=1" -O {dataset_path}

#%mv ../granite-geospatial-biomass-datasets.tar.gz /content/drive/MyDrive/

#%ls /content/drive/MyDrive/granite-geospatial-biomass-datasets.tar.gz

# Unpack compressed dataset
#dataset_path = '/content/drive/MyDrive/granite-geospatial-biomass-datasets.tar.gz'
#target_directory = '../'
#!tar -xvf {dataset_path} --directory {target_directory}

# Commented out IPython magic to ensure Python compatibility.
# %pwd

# Standard
import os
import shutil
import glob
import matplotlib.pyplot as plt
import yaml
import pandas as pd
import numpy as np

# Third Party
import rioxarray as rio
from lightning.pytorch import Trainer
from terratorch.cli_tools import LightningInferenceModel
from huggingface_hub import hf_hub_download

# First Party
from utils.binwise_rmse import calc_binwise_rmse
from utils.dataset_scalers import calc_mean_std
from utils.plotting import plot_rgb_agb_gedi

ckpt_path = hf_hub_download(repo_id="ibm-granite/granite-geospatial-biomass", filename="biomass_model.ckpt")

# Commented out IPython magic to ensure Python compatibility.
# %ls /content/drive/MyDrive/HLSL30/

# Commented out IPython magic to ensure Python compatibility.
# %cat /content/drive/MyDrive/HLSL30/HLS.S30.T52RFV.2021036T015851.v2.0.txt

# https://lpdaac.usgs.gov/products/hlss30v002/
UNUSED_BAND = "-1"
predict_dataset_bands = [
    UNUSED_BAND,   # B01
    "BLUE",        # B02
    "GREEN",       # B03
    "RED",         # B04
    UNUSED_BAND,   # B05
    UNUSED_BAND,   # B06
    UNUSED_BAND,   # B07
    UNUSED_BAND,   # B08
    UNUSED_BAND,   # B09
    UNUSED_BAND,   # B10
    "SWIR_1",      # B11
    "SWIR_2",      # B12
    "NIR_NARROW",  # B8A
    UNUSED_BAND,   # Fmask
    UNUSED_BAND,   # SAA
    UNUSED_BAND,   # SZA
    UNUSED_BAND,   # VAA
    UNUSED_BAND]   # VZA
print(len(predict_dataset_bands))

### Provide all necessary files, paths, and hyperparameters

# Path to configuration file which contains all hyperparameters
config_path = '../configs/config.yaml'

# Path to directory with geotiff test images
#predict_input_dir = '../granite-geospatial-biomass-datasets/all_ecos_datasplit/test_images/'
predict_input_dir = '/content/drive/MyDrive/HLSL30/'

# List to define the bands in the input images. As currently configured, the model looks for the following
# HLS bands: BLUE, GREEN, RED, NIR_NARROW, SWIR_1, SWIR_2
# The lines below name all the bands in the input, so the ones above can be extracted. We use -1 as a placeholder for bands we don't care about.
#UNUSED_BAND = "-1"
#predict_dataset_bands = [UNUSED_BAND,"BLUE","GREEN","RED","NIR_NARROW","SWIR_1","SWIR_2",UNUSED_BAND,UNUSED_BAND,UNUSED_BAND,UNUSED_BAND]

# Path to directory with GEDI test labels - these will be used for plotting and evaluation further below
#test_label_dir = '../granite-geospatial-biomass-datasets/all_ecos_datasplit/test_labels/'

# Commented out IPython magic to ensure Python compatibility.
# %cat ../configs/config.yaml

# Commented out IPython magic to ensure Python compatibility.
# %mkdir inference_images
# %cp -irp /content/drive/MyDrive/HLSL30/HLS.S30.T52RFV.2021036T015851.v2.0.Float32_01_09.tif inference_images
# %ls inference_images

# create subset of images for inference
#images_for_inference = ['T10SFF_144_tile_img.tif', 'T33TUM_34_tile_img.tif', 'T47PRR_79_tile_img.tif']

#labels_for_inference = ['T10SFF_144_tile_label.tif', 'T33TUM_34_tile_label.tif', 'T47PRR_79_tile_label.tif']

#if not os.path.isdir("inference_images"):
#    os.mkdir("inference_images")
#    for inference_image in images_for_inference:
#        shutil.copy(os.path.join(predict_input_dir, inference_image), os.path.join("inference_images", inference_image))

#if not os.path.isdir("inference_labels"):
#    os.mkdir("inference_labels")
#    for inference_label in labels_for_inference:
#        shutil.copy(os.path.join(test_label_dir, inference_label), os.path.join("inference_labels", inference_label))

# Commented out IPython magic to ensure Python compatibility.
# %ls inference_images

model = LightningInferenceModel.from_config(config_path, ckpt_path, predict_dataset_bands)

inference_results, input_file_names = model.inference_on_dir("inference_images")
inference_results
inference_results.shape
input_file_names
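
# The predictions come back as bare tensors, so to inspect them in a GIS the
# georeferencing of the input tile has to be reattached. A minimal sketch using
# rioxarray (imported above as rio); it assumes each prediction has the same
# height/width as its 226x226 input tile, and the output names are made up here.
for pred, name in zip(inference_results.numpy(), input_file_names):
    tile_path = os.path.join("inference_images", os.path.basename(name))
    src = rio.open_rasterio(tile_path)          # input tile with CRS/transform
    out = src.isel(band=0).copy(data=pred)      # keep coords, swap in prediction values
    out.rio.to_raster("pred_" + os.path.basename(name))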

# Visualize the inference results
i = 0
tmp = inference_results.numpy()[i]
plt.imshow(tmp, cmap='Greens')
plt.title(label=input_file_names[i])
plt.colorbar()

#i = 1
#tmp = inference_results.numpy()[i]
#plt.imshow(tmp, cmap='Greens')
#plt.title(label=input_file_names[i])
#plt.colorbar()

#i = 2
#tmp = inference_results.numpy()[i]
#plt.imshow(tmp, cmap='Greens')
#plt.title(label=input_file_names[i])
#plt.colorbar()

# Save the results
import torch
torch.save(inference_results, "inference_results.pt")

# Commented out IPython magic to ensure Python compatibility.
# %ls inference_results.pt

with open('input_file_names.txt', 'w') as f:
    for file_name in input_file_names:
        f.write(file_name + '\n')

# Commented out IPython magic to ensure Python compatibility.
# %cat input_file_names.txt

# Commented out IPython magic to ensure Python compatibility.
# %mv inference_results.pt input_file_names.txt /content/drive/MyDrive/
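
# To pick these results up again in a later session, the two files on Drive can
# be loaded back like this (a small sketch; paths assume the %mv above).
reloaded_results = torch.load("/content/drive/MyDrive/inference_results.pt")
with open("/content/drive/MyDrive/input_file_names.txt") as f:
    reloaded_names = [line.strip() for line in f]
print(reloaded_results.shape, len(reloaded_names))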

# Commented out IPython magic to ensure Python compatibility.
# %cp -irp ./inference_images /content/drive/MyDrive/

# Commented out IPython magic to ensure Python compatibility.
# %ls /content/drive/MyDrive

Inference results

(Figure: predicted above-ground biomass for HLS.S30.T52RFV.2021036T015851.v2.0.Float32_01_09.tif)

Below is an RGB composite of the original data.
(Figure: RGB composite of the input tile)
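The RGB composite was made outside the notebook; for reference, here is a sketch of how it could be plotted from the tile, assuming the usual HLS reflectance scale factor of 0.0001 and a simple linear stretch.

import numpy as np
import rioxarray as rio
import matplotlib.pyplot as plt

tile = rio.open_rasterio(
    "/content/drive/MyDrive/HLSL30/HLS.S30.T52RFV.2021036T015851.v2.0.Float32_01_09.tif"
)
rgb = tile.sel(band=[4, 3, 2]).values * 0.0001   # B04 (red), B03 (green), B02 (blue)
rgb = np.clip(rgb / 0.3, 0, 1)                   # simple linear stretch for display
plt.imshow(np.transpose(rgb, (1, 2, 0)))
plt.title("HLS.S30.T52RFV.2021036T015851 tile 01_09 (RGB)")
plt.show()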

Around Sakurajima there is a volcanic plume, so that area may also need to be masked...
