Compare the results of two global discharge models with in situ information

Notebook prepared by Ben Maathuis, Willem Nieuwenhuis and Bas Retsios. ITC, University of Twente, Enschede, The Netherlands.

Through this notebook you will retrieve and process data from GEOGLOWS and GLoFAS and compare the model results on daily discharge with in situ information from a gauging station. The Rhine river enters the Netherlands at the town of Lobith, which is taken as the common location for which both model and in situ data are available.

Download the sample data (the in situ discharge information from 1990 to mid-2025). When executing the notebook, the sample data is expected in a sub-folder of the notebook directory named 'Qmodel_insitu', containing a single CSV file named 'Q_Lobith_1990_2025_insitu.csv'.

You will use observations linked to a stream segment (in the case of GEOGLOWS) and observations derived from yearly temporally aggregated GRIB files (in the case of GLoFAS), in conjunction with data derived from a CSV file (the in situ observations). Once the data is retrieved using the various libraries, all data is further processed with Pandas and visual comparisons are made with Matplotlib.

Links for review of the model / data sources are provided below.

For GEOGLOWS

See for further information:

  • geoglows website ECMWF: https://geoglows.ecmwf.int/ - webapp: https://hydroviewer.geoglows.org/
  • geoglows model: https://geoglows.readthedocs.io/en/latest/
  • geoglows use in notebook: https://colab.research.google.com/drive/19PiUTU2noCvNGr6r-1i9cv0YMduTxATs?usp=sharing#scrollTo=CGU2lke5miuR

For GLoFAS

See for further information:

  • the GloFAS system: https://www.globalfloods.eu/
  • the data will be retrieved in grib2 format, for further information on data download see: https://ewds.climate.copernicus.eu/datasets/cems-glofas-historical?tab=download

For In Situ

See for further information:

  • online discharge info for the Netherlands: https://waterdata.wrij.nl/index.php?wat=kaart-esri
  • online discharge used: https://waterdata.wrij.nl/index.php?wat=download&deeplink=0&lokid=36393&tsid=16979042
  • note: the in situ data of the Lobith gauging station - mean daily discharge in m³/s - has already been retrieved and is provided as sample data (in the file 'Q_Lobith_1990_2025_insitu.csv')

Process GEOGLOWS

You can install the GEOGLOWS API client via the package management system pip by running the command !pip install geoglows - see the first line in the code field below; uncomment the line if the GEOGLOWS package is not yet installed.

Now you are ready to start retrieving data from GEOGLOWS.

Select the segment of the stream network to get the 'StreamID' code:

  • https://geoglows.ecmwf.int/
  • StreamID = 230546613 (used here: representing the location of the Rhine river entering the Netherlands - at the in situ station Lobith)
In [1]:
#uncomment the line below to install the geoglows package if not done before
#!pip install geoglows
In [2]:
import os
from osgeo import gdal, osr
import ilwis
import geoglows
import numpy as np
import pandas as pd
from datetime import date
from datetime import datetime, timedelta
import matplotlib.pyplot as plt
import matplotlib.dates as mdates
import warnings
warnings.filterwarnings('ignore')
In [3]:
#set sample data folder
#use the Python function os.getcwd() to retrieve the current folder used and add the appropriate sub-folder
work_dir = os.getcwd() + r'/Qmodel_insitu'

#set the working directory for ILWISPy
ilwis.setWorkingCatalog(work_dir)
print(ilwis.version())
print(work_dir)
1.0 build 20250513
d:\jupyter\notebook_scripts\Special\geoglows/Qmodel_insitu
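As a side note, `os.path.join` is a slightly more portable way to build the same sample-data path, since it picks the separator for the current platform; a minimal sketch:

```python
import os

# Build the path to the 'Qmodel_insitu' sub-folder of the current
# working directory, letting os.path.join choose the separator.
work_dir = os.path.join(os.getcwd(), "Qmodel_insitu")
print(work_dir)
```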

For Historical Simulation:

geoglows.data.retrospective(*args, **kwargs)

  • Retrieves the retrospective simulation of streamflow for a given river_id from s3 buckets

Parameters:

  • river_id (int) – the ID of a stream, should be a 9 digit integer

Keyword Arguments:

  • format (str) – the format to return the data, either ‘df’ or ‘xarray’. default is ‘df’

  • storage_options (dict) – options to pass to the xarray open_dataset function

  • resolution (str) – resolution of data to retrieve: hourly, daily, monthly, or yearly. default hourly

Returns:

  • pd.DataFrame or xr.Dataset
In [4]:
StreamID = 230546613 #Rhine river entering the Netherlands, near Lobith
In [5]:
df_geoglows = geoglows.data.retrospective(StreamID, resolution = 'daily')
In [6]:
df_geoglows
Out[6]:
river_id 230546613
time
1940-01-01 00:00:00+00:00 1586.661987
1940-01-02 00:00:00+00:00 1631.991943
1940-01-03 00:00:00+00:00 1681.213989
1940-01-04 00:00:00+00:00 1686.869995
1940-01-05 00:00:00+00:00 1640.525024
... ...
2025-06-12 00:00:00+00:00 1495.007202
2025-06-13 00:00:00+00:00 1577.915649
2025-06-14 00:00:00+00:00 1595.414185
2025-06-15 00:00:00+00:00 1493.749512
2025-06-16 00:00:00+00:00 1517.984985

31214 rows × 1 columns

In [7]:
#using the hydroviewer of geoglows providing some interactive functionality
hydroviewer_figure = geoglows.plots.retrospective(df_geoglows)
hydroviewer_figure.show()

We continue working with Pandas

In [8]:
#check the dataframe
df_geoglows.head()
Out[8]:
river_id                     230546613
time                                  
1940-01-01 00:00:00+00:00  1586.661987
1940-01-02 00:00:00+00:00  1631.991943
1940-01-03 00:00:00+00:00  1681.213989
1940-01-04 00:00:00+00:00  1686.869995
1940-01-05 00:00:00+00:00  1640.525024
In [9]:
#Reset index first to bring 'time' back into a column
df_geoglows = df_geoglows.reset_index()

#Rename columns
df_geoglows = df_geoglows.rename(columns={ (StreamID): 'discharge'})
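Note that the GEOGLOWS timestamps are timezone-aware (UTC). A hedged sketch (on a small stand-in frame, not the full df_geoglows) of dropping the offset, which can be useful before comparing with a timezone-naive in situ series:

```python
import pandas as pd

# Stand-in for a tz-aware 'time' column like the one in df_geoglows.
df = pd.DataFrame({"time": pd.date_range("1940-01-01", periods=3, freq="D", tz="UTC")})

# tz_localize(None) drops the UTC offset, leaving naive timestamps.
df["time"] = df["time"].dt.tz_localize(None)
print(df["time"].dtype)  # → datetime64[ns]
```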
In [10]:
df_geoglows
Out[10]:
river_id time discharge
0 1940-01-01 00:00:00+00:00 1586.661987
1 1940-01-02 00:00:00+00:00 1631.991943
2 1940-01-03 00:00:00+00:00 1681.213989
3 1940-01-04 00:00:00+00:00 1686.869995
4 1940-01-05 00:00:00+00:00 1640.525024
... ... ...
31209 2025-06-12 00:00:00+00:00 1495.007202
31210 2025-06-13 00:00:00+00:00 1577.915649
31211 2025-06-14 00:00:00+00:00 1595.414185
31212 2025-06-15 00:00:00+00:00 1493.749512
31213 2025-06-16 00:00:00+00:00 1517.984985

31214 rows × 2 columns

In [11]:
# Selecting a certain time range
mask = (df_geoglows['time'] >= '2022-01-01') & (df_geoglows['time'] <= '2024-12-31')
df_geoglows_sel = df_geoglows.loc[mask]
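An alternative to the boolean mask, sketched here on a small stand-in frame: with 'time' as the index, pandas label slicing selects the same window (both endpoints inclusive).

```python
import pandas as pd

# Stand-in frame with a tz-aware 'time' column, like df_geoglows.
df = pd.DataFrame({
    "time": pd.date_range("2021-12-30", periods=5, freq="D", tz="UTC"),
    "discharge": [1500.0, 1520.0, 1490.0, 1510.0, 1530.0],
})

# Label-based slicing on a DatetimeIndex; both endpoints are inclusive.
sel = df.set_index("time").loc["2022-01-01":"2024-12-31"]
print(len(sel))  # → 3 (the three 2022 dates fall inside the window)
```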
In [12]:
#create the plot
plt.figure(figsize =(12, 6))
plt.plot(df_geoglows_sel['time'], df_geoglows_sel['discharge'],label='Q-Lobith - GEOGLOWS')
plt.xlabel('Date')
plt.ylabel('Discharge (m³/s)')
plt.title('GEOGLOWS model discharge Rhine, Lobith - The Netherlands')
#plt.xticks(rotation=45)
plt.legend()
plt.grid()
plt.tight_layout()
plt.show();
In [13]:
#save file
df_geoglows.to_csv(work_dir+'/GEOGLOWS_Lobith.csv')
df_geoglows_sel.to_csv(work_dir+'/GEOGLOWS_Lobith_2022_2024.csv')

Process GLoFAS

Retrieve the data from the Copernicus CEMS Early Warning Data Store

See also: https://ewds.climate.copernicus.eu/how-to-api

Log in to the data store (register first if needed); if you click on your name you get the details of your user profile. Copy and paste the personal access token into a text file according to the two-line example below:

    url: https://ewds.climate.copernicus.eu/api
    key: <PERSONAL-ACCESS-TOKEN>

Save this text file at the following location and filename: C:\Users\<USERNAME>\.cdsapirc
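The same two-line credentials file can also be prepared from Python; a minimal sketch (the write is left commented out so the snippet has no side effects, and <PERSONAL-ACCESS-TOKEN> must be replaced with your own token):

```python
import os

# Compose the two-line .cdsapirc content used by the CDS API client.
rc_content = "\n".join([
    "url: https://ewds.climate.copernicus.eu/api",
    "key: <PERSONAL-ACCESS-TOKEN>",
]) + "\n"

# Default location: the .cdsapirc file in the user's home directory.
rc_path = os.path.join(os.path.expanduser("~"), ".cdsapirc")

# Uncomment to actually write the file:
# with open(rc_path, "w") as f:
#     f.write(rc_content)
print(rc_path)
```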

Install and import the cdsapi

Install the cdsapi: the CDS API client is a Python-based library; Python 3 is supported.

You can install the CDS API client via the package management system pip by running the following command:

  • !pip install cdsapi - see code field below

Now you are ready to start retrieving data from the CDS

Link to data portal: https://ewds.climate.copernicus.eu/datasets/cems-glofas-historical?tab=download

A limitation of the data download is that only up to 500 layers (days) can be retrieved per download request, so here a yearly data download of the area of interest is used. From the link above, generate a data request, review the resulting API request and compare it with the code fields below.

In [14]:
#Uncomment the line below to install the cdsapi if not already done before
#!pip install cdsapi
In [15]:
import cdsapi
In [16]:
year = '2022'
dataset = "cems-glofas-historical"
request = {
    "system_version": ["version_4_0"],
    "hydrological_model": ["lisflood"],
    "product_type": ["consolidated"],
    "variable": ["river_discharge_in_the_last_24_hours"],
    "hyear": [ year ],
    "hmonth": [
        "01","02","03","04","05","06","07","08","09","10","11","12"
    ],
    "hday": [
        "01","02","03","04","05","06","07","08","09","10",
        "11","12","13","14","15","16","17","18","19","20",
        "21","22","23","24","25","26","27","28","29","30","31"
    ],
    "data_format": "grib2",
    "download_format": "unarchived",
    "area": [51.75, 5.95, 51.95, 6.12] # North, West, South, East - Lobith
}
client = cdsapi.Client()
client.retrieve(dataset, request, (work_dir) +'/' +'historical_' + year + '_lobith.grib')
2025-06-27 11:15:49,605 WARNING [2025-06-23T00:00:00] Scheduled System Session affecting Service reliability - 30 June 2025. Please follow status [here](https://status.ecmwf.int/) or in our [forum](https://forum.ecmwf.int/t/scheduled-maintenance-of-the-cloud-infrastructure-on-30-june-2025/13598)
2025-06-27 11:15:49,606 INFO [2025-06-16T00:00:00] CC-BY licence to replace Licence to use Copernicus Products on 02 July 2025. More information available [here](https://forum.ecmwf.int/t/cc-by-licence-to-replace-licence-to-use-copernicus-products-on-02-july-2025/13464)
2025-06-27 11:15:49,607 INFO [2024-09-26T00:00:00] Watch our [Forum]( https://forum.ecmwf.int/) for Announcements, news and other discussed topics.
2025-06-27 11:15:49,803 INFO [2024-02-01T00:00:00] Please note that accessing this dataset via CDS for time-critical operation is not advised or supported
2025-06-27 11:15:49,804 INFO [2024-02-01T00:00:00] Please note we suggest checking the list of known issues on the GloFAS wiki
[here](https://confluence.ecmwf.int/display/CEMS/GloFAS+-+Known+Issues)
before downloading the dataset.
2025-06-27 11:15:49,806 INFO Request ID is 8312e147-32e7-472b-8f44-d8e56b9cb587
2025-06-27 11:15:49,892 INFO status has been updated to accepted
2025-06-27 11:16:22,742 INFO status has been updated to successful
457742acaa4e634edbe5c21dfb595d0f.grib:   0%|          | 0.00/99.8k [00:00<?, ?B/s]
Out[16]:
'd:\\jupyter\\notebook_scripts\\Special\\geoglows/Qmodel_insitu/historical_2022_lobith.grib'
In [17]:
year = '2023'
dataset = "cems-glofas-historical"
request = {
    "system_version": ["version_4_0"],
    "hydrological_model": ["lisflood"],
    "product_type": ["consolidated"],
    "variable": ["river_discharge_in_the_last_24_hours"],
    "hyear": [ year ],
    "hmonth": [
        "01","02","03","04","05","06","07","08","09","10","11","12"
    ],
    "hday": [
        "01","02","03","04","05","06","07","08","09","10",
        "11","12","13","14","15","16","17","18","19","20",
        "21","22","23","24","25","26","27","28","29","30","31"
    ],
    "data_format": "grib2",
    "download_format": "unarchived",
    "area": [51.75, 5.95, 51.95, 6.12] # North, West, South, East - Lobith
}
client = cdsapi.Client()
client.retrieve(dataset, request, (work_dir) +'/' +'historical_' + year + '_lobith.grib')
2025-06-27 11:16:23,820 WARNING [2025-06-23T00:00:00] Scheduled System Session affecting Service reliability - 30 June 2025. Please follow status [here](https://status.ecmwf.int/) or in our [forum](https://forum.ecmwf.int/t/scheduled-maintenance-of-the-cloud-infrastructure-on-30-june-2025/13598)
2025-06-27 11:16:23,821 INFO [2025-06-16T00:00:00] CC-BY licence to replace Licence to use Copernicus Products on 02 July 2025. More information available [here](https://forum.ecmwf.int/t/cc-by-licence-to-replace-licence-to-use-copernicus-products-on-02-july-2025/13464)
2025-06-27 11:16:23,821 INFO [2024-09-26T00:00:00] Watch our [Forum]( https://forum.ecmwf.int/) for Announcements, news and other discussed topics.
2025-06-27 11:16:24,277 INFO [2024-02-01T00:00:00] Please note that accessing this dataset via CDS for time-critical operation is not advised or supported
2025-06-27 11:16:24,278 INFO [2024-02-01T00:00:00] Please note we suggest checking the list of known issues on the GloFAS wiki
[here](https://confluence.ecmwf.int/display/CEMS/GloFAS+-+Known+Issues)
before downloading the dataset.
2025-06-27 11:16:24,279 INFO Request ID is 93542fff-6189-44f2-b183-dea9c1824938
2025-06-27 11:16:24,334 INFO status has been updated to accepted
2025-06-27 11:16:57,098 INFO status has been updated to running
2025-06-27 11:17:14,250 INFO status has been updated to successful
f2f227c85498998d5f6a1865a473a664.grib:   0%|          | 0.00/99.8k [00:00<?, ?B/s]
Out[17]:
'd:\\jupyter\\notebook_scripts\\Special\\geoglows/Qmodel_insitu/historical_2023_lobith.grib'
In [18]:
year = '2024'
dataset = "cems-glofas-historical"
request = {
    "system_version": ["version_4_0"],
    "hydrological_model": ["lisflood"],
    "product_type": ["consolidated"],
    "variable": ["river_discharge_in_the_last_24_hours"],
    "hyear": [ year ],
    "hmonth": [
        "01","02","03","04","05","06","07","08","09","10","11","12"
    ],
    "hday": [
        "01","02","03","04","05","06","07","08","09","10",
        "11","12","13","14","15","16","17","18","19","20",
        "21","22","23","24","25","26","27","28","29","30","31"
    ],
    "data_format": "grib2",
    "download_format": "unarchived",
    "area": [51.75, 5.95, 51.95, 6.12] # North, West, South, East - Lobith
}
client = cdsapi.Client()
client.retrieve(dataset, request, (work_dir) +'/' +'historical_' + year + '_lobith.grib')
2025-06-27 11:17:15,070 WARNING [2025-06-23T00:00:00] Scheduled System Session affecting Service reliability - 30 June 2025. Please follow status [here](https://status.ecmwf.int/) or in our [forum](https://forum.ecmwf.int/t/scheduled-maintenance-of-the-cloud-infrastructure-on-30-june-2025/13598)
2025-06-27 11:17:15,072 INFO [2025-06-16T00:00:00] CC-BY licence to replace Licence to use Copernicus Products on 02 July 2025. More information available [here](https://forum.ecmwf.int/t/cc-by-licence-to-replace-licence-to-use-copernicus-products-on-02-july-2025/13464)
2025-06-27 11:17:15,073 INFO [2024-09-26T00:00:00] Watch our [Forum]( https://forum.ecmwf.int/) for Announcements, news and other discussed topics.
2025-06-27 11:17:15,377 INFO [2024-02-01T00:00:00] Please note that accessing this dataset via CDS for time-critical operation is not advised or supported
2025-06-27 11:17:15,378 INFO [2024-02-01T00:00:00] Please note we suggest checking the list of known issues on the GloFAS wiki
[here](https://confluence.ecmwf.int/display/CEMS/GloFAS+-+Known+Issues)
before downloading the dataset.
2025-06-27 11:17:15,379 INFO Request ID is 46125b2a-4469-4c9d-8ee0-f3e4d9885a84
2025-06-27 11:17:15,480 INFO status has been updated to accepted
2025-06-27 11:17:24,322 INFO status has been updated to running
2025-06-27 11:17:29,645 INFO status has been updated to successful
e309e3cf766b92c3f1f2ddcd4d5ba5d9.grib:   0%|          | 0.00/100k [00:00<?, ?B/s]
Out[18]:
'd:\\jupyter\\notebook_scripts\\Special\\geoglows/Qmodel_insitu/historical_2024_lobith.grib'
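The three near-identical request cells above could be collapsed into a loop; a minimal sketch that builds one request dict per year using the same field values as in this notebook (the actual retrieval loop is commented out, as it needs a configured cdsapi client and the work_dir defined earlier):

```python
def build_request(year):
    """Build a cems-glofas-historical request for one year,
    using the same settings as the cells above."""
    return {
        "system_version": ["version_4_0"],
        "hydrological_model": ["lisflood"],
        "product_type": ["consolidated"],
        "variable": ["river_discharge_in_the_last_24_hours"],
        "hyear": [year],
        "hmonth": [f"{m:02d}" for m in range(1, 13)],   # "01".."12"
        "hday": [f"{d:02d}" for d in range(1, 32)],     # "01".."31"
        "data_format": "grib2",
        "download_format": "unarchived",
        "area": [51.75, 5.95, 51.95, 6.12],  # Lobith
    }

req = build_request("2022")
print(req["hyear"])

# client = cdsapi.Client()
# for year in ["2022", "2023", "2024"]:
#     client.retrieve("cems-glofas-historical", build_request(year),
#                     work_dir + "/historical_" + year + "_lobith.grib")
```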

Check the location of the river to retrieve discharge values from

Quick review of the data retrieved, including the metadata

In [19]:
#select and load the dataset
grib_content = work_dir +'/'+'historical_2022_lobith.grib'
glofas_2022 = ilwis.RasterCoverage(grib_content)
print(glofas_2022.size())

#retrieve the first layer of the dataset
area_retrieved = ilwis.do('selection',glofas_2022,"rasterbands(0)")
print(area_retrieved.size())
Size(3, 4, 365)
Size(3, 4, 1)
In [20]:
#convert ilwis array to numpy array
area_retrieved_2np = np.fromiter(iter(area_retrieved), np.float64, area_retrieved.size().linearSize()) 
area_retrieved_2np = area_retrieved_2np.reshape((area_retrieved.size().ysize, area_retrieved.size().xsize))

# Plot the 2D array using matplotlib
plt.imshow(area_retrieved_2np, cmap='jet')
plt.title('First Layer of Raster Stack')
plt.colorbar(label='Pixel values')
plt.axis('off')
plt.show()

Checking the metadata - using the REF_TIME

Each layer has a (GRIB) reference time; execute the field below to check the order of the REF_TIME. What can you conclude? Note that the operation reads the per-band GRIB_IDS metadata through GDAL.

In [21]:
dataset = gdal.Open(grib_content)
for i in range(1, min(dataset.RasterCount, 60) + 1):  # Only check first 60 bands
    band = dataset.GetRasterBand(i)
    metadata = band.GetMetadata()
    print(f"Band {i} GRIB_IDS:\n{metadata.get('GRIB_IDS', 'N/A')}\n")
Band 1 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-02-01T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 2 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-02-02T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 3 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-02-03T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 4 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-02-04T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 5 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-02-05T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 6 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-02-06T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 7 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-02-07T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 8 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-02-08T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 9 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-02-09T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 10 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-02-10T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 11 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-02-11T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 12 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-02-12T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 13 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-02-13T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 14 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-02-14T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 15 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-02-15T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 16 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-02-16T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 17 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-02-17T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 18 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-02-18T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 19 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-02-19T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 20 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-02-20T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 21 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-02-21T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 22 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-02-22T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 23 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-02-23T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 24 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-02-24T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 25 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-02-25T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 26 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-02-26T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 27 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-02-27T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 28 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-02-28T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 29 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-04-01T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 30 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-04-02T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 31 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-04-03T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 32 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-04-04T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 33 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-04-05T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 34 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-04-06T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 35 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-04-07T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 36 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-04-08T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 37 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-04-09T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 38 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-04-10T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 39 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-04-11T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 40 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-04-12T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 41 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-04-13T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 42 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-04-14T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 43 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-04-15T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 44 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-04-16T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 45 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-04-17T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 46 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-04-18T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 47 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-04-19T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 48 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-04-20T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 49 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-04-21T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 50 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-04-22T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 51 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-04-23T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 52 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-04-24T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 53 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-04-25T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 54 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-04-26T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 55 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-04-27T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 56 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-04-28T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 57 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-04-29T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 58 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-04-30T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 59 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-06-01T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Band 60 GRIB_IDS:
CENTER=98(ECMWF) SUBCENTER=0 MASTER_TABLE=27 LOCAL_TABLE=0 SIGNF_REF_TIME=1(Start_of_Forecast) REF_TIME=2022-06-02T00:00:00Z PROD_STATUS=0(Operational) TYPE=0(Analysis)

Extract the Unix timestamps and convert these into readable datetime strings
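Once collected, the conversion step can be sketched with pandas; the sample values below are taken from the band listing above (bands 1, 2 and 149), illustrating that the bands are not in chronological order:

```python
import pandas as pd

# GRIB_REF_TIME is seconds since 1970-01-01 (UTC); pandas converts
# such values to readable timestamps in one call.
ref_times = ["1643673600", "1643760000", "1640995200"]
stamps = pd.to_datetime(pd.Series(ref_times).astype("int64"), unit="s", utc=True)
print(stamps.dt.strftime("%Y-%m-%d").tolist())
# → ['2022-02-01', '2022-02-02', '2022-01-01']
```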

In [22]:
#extract the Unix time (seconds since Jan 01 1970, UTC)
ref_times = []

print("Band | GRIB_REF_TIME")
print("-" * 30)

for i in range(1, dataset.RasterCount + 1):
    band = dataset.GetRasterBand(i)
    metadata = band.GetMetadata()
    ref_time = metadata.get('GRIB_REF_TIME')
    ref_times.append(ref_time)
    print(f"{i:4} | {ref_time}")
Band | GRIB_REF_TIME
------------------------------
   1 | 1643673600
   2 | 1643760000
   3 | 1643846400
   4 | 1643932800
   5 | 1644019200
   6 | 1644105600
   7 | 1644192000
   8 | 1644278400
   9 | 1644364800
  10 | 1644451200
  11 | 1644537600
  12 | 1644624000
  13 | 1644710400
  14 | 1644796800
  15 | 1644883200
  16 | 1644969600
  17 | 1645056000
  18 | 1645142400
  19 | 1645228800
  20 | 1645315200
  21 | 1645401600
  22 | 1645488000
  23 | 1645574400
  24 | 1645660800
  25 | 1645747200
  26 | 1645833600
  27 | 1645920000
  28 | 1646006400
  29 | 1648771200
  30 | 1648857600
  31 | 1648944000
  32 | 1649030400
  33 | 1649116800
  34 | 1649203200
  35 | 1649289600
  36 | 1649376000
  37 | 1649462400
  38 | 1649548800
  39 | 1649635200
  40 | 1649721600
  41 | 1649808000
  42 | 1649894400
  43 | 1649980800
  44 | 1650067200
  45 | 1650153600
  46 | 1650240000
  47 | 1650326400
  48 | 1650412800
  49 | 1650499200
  50 | 1650585600
  51 | 1650672000
  52 | 1650758400
  53 | 1650844800
  54 | 1650931200
  55 | 1651017600
  56 | 1651104000
  57 | 1651190400
  58 | 1651276800
  59 | 1654041600
  60 | 1654128000
  61 | 1654214400
  62 | 1654300800
  63 | 1654387200
 ... (remaining epoch reference times truncated)
 365 | 1672444800
In [23]:
# Convert and print human-readable dates
print("Band | GRIB_REF_TIME (UTC)")
print("-" * 35)

for i, ref_time in enumerate(ref_times, start=1):
    if ref_time:
        dt = datetime.utcfromtimestamp(int(ref_time.split()[0])).isoformat() + 'Z'
    else:
        dt = "N/A"
    print(f"{i:4} | {dt}")
Band | GRIB_REF_TIME (UTC)
-----------------------------------
   1 | 2022-02-01T00:00:00Z
 ...
  28 | 2022-02-28T00:00:00Z
  29 | 2022-04-01T00:00:00Z
 ...
  58 | 2022-04-30T00:00:00Z
  59 | 2022-06-01T00:00:00Z
 ...
  88 | 2022-06-30T00:00:00Z
  89 | 2022-09-01T00:00:00Z
 ...
 118 | 2022-09-30T00:00:00Z
 119 | 2022-11-01T00:00:00Z
 ...
 148 | 2022-11-30T00:00:00Z
 149 | 2022-01-01T00:00:00Z
 ...
 179 | 2022-01-31T00:00:00Z
 180 | 2022-03-01T00:00:00Z
 ...
 210 | 2022-03-31T00:00:00Z
 211 | 2022-05-01T00:00:00Z
 ...
 241 | 2022-05-31T00:00:00Z
 242 | 2022-07-01T00:00:00Z
 ...
 272 | 2022-07-31T00:00:00Z
 273 | 2022-08-01T00:00:00Z
 ...
 303 | 2022-08-31T00:00:00Z
 304 | 2022-10-01T00:00:00Z
 ...
 334 | 2022-10-31T00:00:00Z
 335 | 2022-12-01T00:00:00Z
 ...
 365 | 2022-12-31T00:00:00Z

Process Historical 2022 - 2024 data using GDAL¶

Note from the inventory above that the bands are not stored chronologically: February is listed first, then the months having 30 days, and finally the months having 31 days! When importing the data layers from the GRIB file we will therefore use the reference time to restore the appropriate temporal ordering.
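The re-ordering step can be illustrated with a minimal sketch using synthetic epoch-second reference times (not read from a GRIB file):

```python
from datetime import datetime, timezone

# synthetic GRIB_REF_TIME values (epoch seconds), deliberately out of order:
# 1 March, 1 February, 1 January 2022
ref_times = [1646092800, 1643673600, 1640995200]
values = [30.0, 20.0, 10.0]

# convert epoch seconds to UTC datetimes
dates = [datetime.fromtimestamp(t, tz=timezone.utc) for t in ref_times]

# zip dates with values and sort chronologically, as done for the GRIB bands below
ordered = sorted(zip(dates, values))
print([v for _, v in ordered])  # -> [10.0, 20.0, 30.0]
```

Sorting the (date, value) pairs puts the discharge values back in calendar order, regardless of the band order in the file.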

In [24]:
# read and process the 2024 GRIB raster data
#optionally uncomment the 'print' commands to see the output
historical_grib_file_2024 = work_dir +'/'+'historical_2024_lobith.grib'
glofas_historical_2024 = ilwis.RasterCoverage(historical_grib_file_2024)
dataset = gdal.Open(historical_grib_file_2024)

#create a 3D stack using the 'GRIB reference time' to order the grib layers
date_historical = []
for i in range(1, dataset.RasterCount + 1):
    date = dataset.GetRasterBand(i).GetMetadata()['GRIB_REF_TIME']
    date = date.split()[0]
    date_historical.append(date)
date_historical = [datetime.fromtimestamp(int(date)) for date in date_historical]
#date_historical

l = 1
c = 2 
z = (glofas_historical_2024.size().zsize)
#print(z)

Q_2024 = []
for n in range(0,(z)):
    point_value = (glofas_historical_2024.pix2value(ilwis.Pixel(c,l,n)))
    Q_2024.append(point_value)

#print('Values extracted for selected location:', Q_2024)

data = sorted(zip(date_historical, Q_2024))
df2024= pd.DataFrame(data, columns=['date', 'discharge'])
date_historical = np.array(df2024.date)
Q_2024 = np.array(df2024.discharge)
#print(date_historical)

fig = plt.figure(figsize =(14, 7))
plt.plot(date_historical, Q_2024, label='Discharge')
plt.xlabel('Time')
plt.ylabel('Discharge (m³/s)')
plt.title('Discharge Rhine - at Lobith (the Netherlands)')
plt.legend()
Out[24]:
<matplotlib.legend.Legend at 0x22cd07242d0>
In [25]:
# read and process the 2023 GRIB raster data
#optionally uncomment the 'print' commands to see the output
historical_grib_file_2023 = work_dir +'/'+'historical_2023_lobith.grib'
glofas_historical_2023 = ilwis.RasterCoverage(historical_grib_file_2023)
dataset = gdal.Open(historical_grib_file_2023)

#create a 3D stack using the 'GRIB reference time' to order the grib layers
date_historical = []
for i in range(1, dataset.RasterCount + 1):
    date = dataset.GetRasterBand(i).GetMetadata()['GRIB_REF_TIME']
    date = date.split()[0]
    date_historical.append(date)
date_historical = [datetime.fromtimestamp(int(date)) for date in date_historical]
#date_historical

l = 1
c = 2 
z = (glofas_historical_2023.size().zsize)
#print(z)

Q_2023 = []
for n in range(0,(z)):
    point_value = (glofas_historical_2023.pix2value(ilwis.Pixel(c,l,n)))
    Q_2023.append(point_value)

#print('Values extracted for selected location:', Q_2023)

data = sorted(zip(date_historical, Q_2023))
df2023= pd.DataFrame(data, columns=['date', 'discharge'])
date_historical = np.array(df2023.date)
Q_2023 = np.array(df2023.discharge)
#print(date_historical)

fig = plt.figure(figsize =(14, 7))
plt.plot(date_historical, Q_2023, label='Discharge')
plt.xlabel('Time')
plt.ylabel('Discharge (m³/s)')
plt.title('Discharge Rhine - at Lobith (the Netherlands)')
plt.legend()
Out[25]:
<matplotlib.legend.Legend at 0x22cd093b250>
In [26]:
# read and process the 2022 GRIB raster data
#optionally uncomment the 'print' commands to see the output
historical_grib_file_2022 = work_dir +'/'+'historical_2022_lobith.grib'
glofas_historical_2022 = ilwis.RasterCoverage(historical_grib_file_2022)
dataset = gdal.Open(historical_grib_file_2022)

#create a 3D stack using the 'GRIB reference time' to order the grib layers
date_historical = []
for i in range(1, dataset.RasterCount + 1):
    date = dataset.GetRasterBand(i).GetMetadata()['GRIB_REF_TIME']
    date = date.split()[0]
    date_historical.append(date)
date_historical = [datetime.fromtimestamp(int(date)) for date in date_historical]
#date_historical

l = 1
c = 2 
z = (glofas_historical_2022.size().zsize)
#print(z)

Q_2022 = []
for n in range(0,(z)):
    point_value = (glofas_historical_2022.pix2value(ilwis.Pixel(c,l,n)))
    Q_2022.append(point_value)

#print('Values extracted for selected location:', Q_2022)

data = sorted(zip(date_historical, Q_2022))
df2022= pd.DataFrame(data, columns=['date', 'discharge'])
date_historical = np.array(df2022.date)
Q_2022 = np.array(df2022.discharge)
#print(date_historical)

fig = plt.figure(figsize =(14, 7))
plt.plot(date_historical, Q_2022, label='Discharge')
plt.xlabel('Time')
plt.ylabel('Discharge (m³/s)')
plt.title('Discharge Rhine - at Lobith (the Netherlands)')
plt.legend();
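The three cells above repeat the same extraction steps per year. The pure sorting / frame-building part could be collapsed into one helper; a sketch is shown below (the ilwis/gdal reading is unchanged and omitted here, and the helper name is only illustrative):

```python
from datetime import datetime
import pandas as pd

def build_year_frame(epoch_seconds, discharges):
    """Pair epoch-second GRIB reference times with extracted discharge
    values and return a chronologically sorted DataFrame."""
    dates = [datetime.fromtimestamp(int(s)) for s in epoch_seconds]
    data = sorted(zip(dates, discharges))
    return pd.DataFrame(data, columns=['date', 'discharge'])

# tiny synthetic example: two bands stored out of order
df = build_year_frame([1646092800, 1640995200], [2.0, 1.0])
print(df['discharge'].tolist())  # -> [1.0, 2.0]
```

With such a helper, each year would only need the ilwis `pix2value` loop plus one call to `build_year_frame`.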

Plot all years / series together¶

In [27]:
GLoFAS_all = pd.concat([df2022, df2023, df2024], ignore_index=True)
print(len(GLoFAS_all))
1096
In [28]:
fig = plt.figure(figsize =(12, 6))
plt.plot(GLoFAS_all['date'], GLoFAS_all['discharge'], label='Q-Lobith - GLoFAS')
plt.xlabel('Date')
plt.ylabel('Discharge (m³/s)')
plt.title('GLoFAS model discharge Rhine, Lobith - The Netherlands')
#plt.xticks(rotation=45)
plt.legend()
plt.grid()
plt.tight_layout()
plt.show();
In [29]:
#save file
GLoFAS_all.to_csv(work_dir+'/Glofas_Lobith_2022_2024.csv')
GLoFAS_all
Out[29]:
date discharge
0 2022-01-01 01:00:00 3996.890625
1 2022-01-02 01:00:00 4223.968750
2 2022-01-03 01:00:00 4137.421875
3 2022-01-04 01:00:00 3950.281250
4 2022-01-05 01:00:00 4246.953125
... ... ...
1091 2024-12-27 01:00:00 3610.601562
1092 2024-12-28 01:00:00 3307.554688
1093 2024-12-29 01:00:00 2959.820312
1094 2024-12-30 01:00:00 2644.265625
1095 2024-12-31 01:00:00 2380.515625

1096 rows × 2 columns
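When reading a file saved this way back in, the unnamed index column and the date column need to be restored explicitly. A minimal round-trip sketch (using an in-memory buffer instead of the actual file):

```python
import io
import pandas as pd

# a small frame saved the same way GLoFAS_all is saved above
df = pd.DataFrame({'date': pd.to_datetime(['2022-01-01', '2022-01-02']),
                   'discharge': [3996.9, 4224.0]})
buf = io.StringIO()
df.to_csv(buf)
buf.seek(0)

# index_col=0 recovers the row index; parse_dates restores datetimes
back = pd.read_csv(buf, index_col=0, parse_dates=['date'])
print(back['date'].dtype)  # datetime64[ns]
```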

Processing the In Situ Daily Mean Discharge¶

In [30]:
# data source: WaterInformatie @ Waterschap Rijn en IJssel - https://waterdata.wrij.nl/
# file available in your sample data folder
#inspect the file using e.g. notepad and check the content
file = work_dir+'/Q_Lobith_1990_2025_insitu.csv'
In [31]:
# Load the CSV file
df_gauge = pd.read_csv(
    file,                         # note name above
    sep=",",                      # Semicolon as delimiter
    parse_dates=["Date"]          # Parse datetime column
)
In [32]:
#perform some additional data frame modifications
start_date = pd.to_datetime('1990-01-01')
df_gauge['date_new'] = start_date + pd.to_timedelta(df_gauge.index, unit='D')
df_gauge['discharge_m3s'] = df_gauge['Discharge'] / 1000   # convert from l/s to m³/s
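Rebuilding the dates from the row index, as above, assumes the series contains no gaps. An alternative sketch is to parse the day-first 'Date' strings of the file directly:

```python
import pandas as pd

# the in situ file stores day-first timestamps like '01/01/1990 23:00'
raw = pd.Series(['01/01/1990 23:00', '02/01/1990 23:00'])

# parse with an explicit day-first format and drop the time component
dates = pd.to_datetime(raw, format='%d/%m/%Y %H:%M').dt.normalize()
print(dates.tolist())
```

This yields 1990-01-01 and 1990-01-02, and would stay correct even if some daily records were missing from the file.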
In [33]:
#check the dataframe content
df_gauge
Out[33]:
Date Discharge date_new discharge_m3s
0 01/01/1990 23:00 1867000.0 1990-01-01 1867.000
1 02/01/1990 23:00 1734000.0 1990-01-02 1734.000
2 03/01/1990 23:00 1612000.0 1990-01-03 1612.000
3 04/01/1990 23:00 1504000.0 1990-01-04 1504.000
4 05/01/1990 23:00 1416500.0 1990-01-05 1416.500
... ... ... ... ...
12946 12/06/2025 23:00 1907251.0 2025-06-12 1907.251
12947 13/06/2025 23:00 1765162.0 2025-06-13 1765.162
12948 14/06/2025 23:00 1660813.0 2025-06-14 1660.813
12949 15/06/2025 23:00 1612323.0 2025-06-15 1612.323
12950 16/06/2025 23:00 1590411.0 2025-06-16 1590.411

12951 rows × 4 columns

In [34]:
# Selecting a certain time range
mask = (df_gauge['date_new'] >= '2022-01-01') & (df_gauge['date_new'] <= '2024-12-31')
df_gauge_sel = df_gauge.loc[mask]
In [35]:
# Plot using matplotlib
plt.figure(figsize=(12, 6))
plt.plot(df_gauge_sel["date_new"], df_gauge_sel["discharge_m3s"], label="Q-Lobith - In Situ")
plt.xlabel("Date")
plt.ylabel("Discharge (m³/s)")
plt.title("In Situ discharge measurements Rhine, Lobith - The Netherlands")
#plt.xticks(rotation=45)
plt.legend()
plt.grid()
plt.tight_layout()
plt.show();

Plot of the 3 discharge time series for the years 2022, 2023 and 2024¶

In [36]:
# Plot using matplotlib
plt.figure(figsize=(12, 6))
plt.plot(df_gauge_sel["date_new"], df_gauge_sel["discharge_m3s"], label='Q-Lobith - In-situ')
plt.plot(df_geoglows_sel['time'], df_geoglows_sel['discharge'],label='Q-Lobith - GEOGLOWS')
plt.plot(GLoFAS_all['date'], GLoFAS_all['discharge'],label='Q-Lobith - GLoFAS')
plt.xlabel("Date")
plt.ylabel("Discharge (m³/s)")
plt.title("Discharge comparison global discharge model output versus in situ observation for the Rhine river at Lobith, the Netherlands")
plt.grid(True)
plt.legend()
plt.tight_layout()
plt.show()

What can visually be observed when inspecting the figure above?

Some statistics¶

Create a new dataframe with the discharge values for insitu, geoglows and glofas

In [37]:
df1 = df_gauge_sel.reset_index(drop=True)
df2 = df_geoglows_sel.reset_index(drop=True)
df3 = GLoFAS_all.reset_index(drop=True)
In [38]:
new_df = pd.DataFrame({'insitu': df1['discharge_m3s'],'geoglows': df2['discharge'],'glofas': df3['discharge']})
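Combining the three series by row position, as above, only works because all frames cover exactly the same 1096 days in the same order. A more robust sketch is to merge on the date itself (column names here are illustrative):

```python
import pandas as pd

left = pd.DataFrame({'date': pd.to_datetime(['2022-01-01', '2022-01-02']),
                     'insitu': [3503.8, 3642.6]})
right = pd.DataFrame({'date': pd.to_datetime(['2022-01-02', '2022-01-01']),
                      'glofas': [4224.0, 3996.9]})

# an inner merge aligns on matching dates regardless of row order
merged = pd.merge(left, right, on='date', how='inner')
print(merged['glofas'].tolist())  # -> [3996.9, 4224.0]
```

Merging would also silently drop any dates missing from one of the sources instead of misaligning the rows.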
In [39]:
new_df
Out[39]:
insitu geoglows glofas
0 3503.803 1051.514038 3996.890625
1 3642.563 1127.447021 4223.968750
2 3559.721 1183.495972 4137.421875
3 3538.527 1230.483032 3950.281250
4 4164.402 1351.469971 4246.953125
... ... ... ...
1091 3337.286 1606.383057 3610.601562
1092 3027.616 1661.037964 3307.554688
1093 2758.890 1775.723999 2959.820312
1094 2542.590 1900.836060 2644.265625
1095 2355.328 2017.777954 2380.515625

1096 rows × 3 columns

In [40]:
# get correlation coefficients
corr_insitu_geoglows = new_df['insitu'].corr(new_df['geoglows'])
corr_insitu_glofas = new_df['insitu'].corr(new_df['glofas'])

# Print results
print("Correlation (insitu & geoglows):", corr_insitu_geoglows)
print("Correlation (insitu & glofas):", corr_insitu_glofas)
Correlation (insitu & geoglows): 0.6092627501640251
Correlation (insitu & glofas): 0.9482473774593965
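Correlation alone ignores systematic bias. A hedged sketch of two additional scores commonly used in hydrology, the mean bias and the Nash-Sutcliffe efficiency (NSE), computed on synthetic data here; applying them to `new_df` is left as an extension:

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is perfect, 0 is no better than the mean of obs."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

obs = np.array([1800.0, 2200.0, 2600.0])
sim = np.array([1700.0, 2100.0, 2500.0])   # constant low bias of 100 m³/s

bias = np.mean(sim - obs)
rmse = np.sqrt(np.mean((sim - obs) ** 2))
print(bias, rmse, nse(obs, sim))  # -> -100.0 100.0 0.90625
```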

Compare the in situ observations and the GEOGLOW model output¶

For GLoFAS only 3 years of data are processed in this notebook, but the in situ and GEOGLOWS records share a longer common time period. Now let's inspect about 35 years of daily discharge data.

In [41]:
# Selecting a certain time range
mask = (df_geoglows['time'] >= '1990-01-01') & (df_geoglows['time'] <= '2024-12-31')
df_geoglows_sel1 = df_geoglows.loc[mask]

mask = (df_gauge['date_new'] >= '1990-01-01') & (df_gauge['date_new'] <= '2024-12-31')
df_gauge_sel1 = df_gauge.loc[mask]
In [42]:
# Plot using matplotlib
plt.figure(figsize=(12, 6))
plt.plot(df_gauge_sel1["date_new"], df_gauge_sel1["discharge_m3s"], label='Q-Lobith - In-situ')
plt.plot(df_geoglows_sel1['time'], df_geoglows_sel1['discharge'],label='Q-Lobith - GEOGLOWS')
plt.xlabel("Date")
plt.ylabel("Discharge (m³/s)")
plt.title("Discharge comparison global discharge model output versus in situ observation for the Rhine river at Lobith, the Netherlands ")
plt.grid(True)
plt.legend()
plt.tight_layout()
plt.show();

Correlation coefficient over a longer time range for the in situ and GEOGLOWS discharge comparison¶

Above we compared only a short GEOGLOWS time series against the in situ measurements; since the common data series starts in 1990, we can also check whether the correlation changes over a longer time period.

In [43]:
# Selecting a longer common time range from the insitu dataframe
mask = (df_gauge['date_new'] >= '1990-01-01') & (df_gauge['date_new'] <= '2024-12-31')
df_gauge_sel2 = df_gauge.loc[mask]

# Selecting a longer common time range from the geoglows dataframe
mask = (df_geoglows['time'] >= '1990-01-01') & (df_geoglows['time'] <= '2024-12-31')
df_geoglows_sel2 = df_geoglows.loc[mask]
In [44]:
df1_long = df_gauge_sel2.reset_index(drop=True)
df2_long = df_geoglows_sel2.reset_index(drop=True)
In [45]:
new_df_long = pd.DataFrame({'insitu': df1_long['discharge_m3s'],'geoglows': df2_long['discharge']})
In [46]:
new_df_long
Out[46]:
insitu geoglows
0 1867.000 1236.641968
1 1734.000 1214.713013
2 1612.000 1165.150024
3 1504.000 1053.722046
4 1416.500 904.718994
... ... ...
12779 3337.286 1606.383057
12780 3027.616 1661.037964
12781 2758.890 1775.723999
12782 2542.590 1900.836060
12783 2355.328 2017.777954

12784 rows × 2 columns

In [47]:
# get correlation coefficients
corr_insitu_geoglows_long = new_df_long['insitu'].corr(new_df_long['geoglows'])

# Print results
print("Correlation (insitu & geoglows):", corr_insitu_geoglows_long)
Correlation (insitu & geoglows): 0.5523272658138597
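Daily series are noisy, and the correlation may look different at coarser time scales. A sketch of aggregating a daily series to monthly means with pandas (synthetic data; the same `resample` call would apply to a date-indexed version of the frames above):

```python
import numpy as np
import pandas as pd

# synthetic daily discharge for January and February 2022
idx = pd.date_range('2022-01-01', '2022-02-28', freq='D')
daily = pd.Series(np.arange(len(idx), dtype=float), index=idx)

# 'MS' labels each monthly mean with the first day of the month
monthly = daily.resample('MS').mean()
print(len(monthly))  # -> 2
```

Computing the correlation on such monthly means would smooth out day-to-day timing errors and often raises the score.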

Optionally select another GLoFAS time range and conduct a further temporal comparison yourself¶