SWOT Simulated North American Continent Hydrology Dataset Exploration in the Cloud
Accessing and Visualizing SWOT Simulated Datasets
Requirement:
This tutorial can only be run in an AWS cloud instance running in us-west-2: NASA Earthdata Cloud data in S3 can be directly accessed via the earthaccess python library, and this access is limited to requests made from within the US West (Oregon) (code: us-west-2) AWS region.
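To confirm that the notebook is actually running in us-west-2 before proceeding, one can query the EC2 instance metadata service. This is a minimal sketch, assuming IMDSv1 is reachable from the instance (some managed environments block or tokenize this endpoint):

import requests

# Query the EC2 instance metadata service for the region this instance runs in
# (assumes IMDSv1; IMDSv2 would additionally require a session token header)
region = requests.get(
    "http://169.254.169.254/latest/meta-data/placement/region", timeout=2
).text
assert region == "us-west-2", f"Direct S3 access requires us-west-2, got {region}"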
Learning Objectives:
- Access all 5 products of SWOT HR sample data (archived in NASA Earthdata Cloud) within the AWS cloud, without downloading to local machine
- Visualize accessed data
SWOT Simulated Level 2 North America Continent KaRIn High Rate Version 1 Datasets:
- River Vector Shapefile - SWOT_SIMULATED_NA_CONTINENT_L2_HR_RIVERSP_V1
- Lake Vector Shapefile - SWOT_SIMULATED_NA_CONTINENT_L2_HR_LAKESP_V1
- Water Mask Pixel Cloud NetCDF - SWOT_SIMULATED_NA_CONTINENT_L2_HR_PIXC_V1
- Water Mask Pixel Cloud Vector Attribute NetCDF - SWOT_SIMULATED_NA_CONTINENT_L2_HR_PIXCVEC_V1
- Raster NetCDF - SWOT_SIMULATED_NA_CONTINENT_L2_HR_RASTER_V1
Notebook Author: Cassie Nickles, NASA PO.DAAC (Aug 2022)
Libraries Needed

import glob
import os
import requests
import s3fs
import fiona
import netCDF4 as nc
import h5netcdf
import xarray as xr
import pandas as pd
import geopandas as gpd
import numpy as np
import matplotlib.pyplot as plt
import hvplot.xarray
import earthaccess
from earthaccess import Auth, DataCollections, DataGranules, Store
Earthdata Login
An Earthdata Login account is required to access and discover data, including restricted data, from the NASA Earthdata system. Please visit https://urs.earthdata.nasa.gov to register and manage your Earthdata Login account; it is free to create and only takes a moment to set up. We use earthaccess to authenticate your login credentials below.
#auth = earthaccess.login(strategy="interactive", persist=True) #if you do not have a netrc created, this line will do so with your credentials
auth = earthaccess.login(strategy="netrc")
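For reference, earthaccess reads stored credentials from a netrc file in your home directory (~/.netrc on Linux/macOS, ~/_netrc on Windows). An Earthdata Login entry looks like the following (placeholder credentials):

machine urs.earthdata.nasa.gov
    login your_earthdata_username
    password your_earthdata_password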
Set up an s3fs session for Direct Access
s3fs sessions are used for authenticated access to S3 buckets and allow for typical file-system-style operations. Below we create a session by passing in the data access information.
fs_s3 = earthaccess.get_s3fs_session(daac='PODAAC', provider='POCLOUD')
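The returned fs_s3 object supports familiar file-system operations such as listing, inspecting, and opening objects. A quick illustration (the prefix below is hypothetical; real paths come from the earthaccess searches in the sections that follow):

# Illustrative only - list objects under a hypothetical PO.DAAC prefix
# fs_s3.ls('podaac-ops-cumulus-protected/SWOT_SIMULATED_NA_CONTINENT_L2_HR_RIVERSP_V1')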
Single File Access
The s3 access link can be found using an earthaccess data search. Since this collection consists of Reach and Node files, we need to extract only the granule for the Reach file, which we do by filtering for the 'Reach' title in the data link.
Alternatively, Earthdata Search can be used to manually search for a single file.
1. River Vector Shapefiles
#retrieves granule from the day we want
river_results = earthaccess.search_data(short_name = 'SWOT_SIMULATED_NA_CONTINENT_L2_HR_RIVERSP_V1', temporal = ('2022-08-22 19:24:41', '2022-08-22 19:30:37'))

#finds the s3 link of the one granule we want (The collection contains both Reaches and Nodes, but here we want only the Reach)
river_data_urls = []
for g in river_results:
    for l in earthaccess.results.DataGranule.data_links(g, access='direct'):
        if "Reach" in l:
            river_data_urls.append(l)
print(river_data_urls[0])
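Note that this link-filtering loop can often be avoided: newer releases of earthaccess accept a granule_name wildcard in the search itself. This is an assumption about the installed version, so treat it as an optional shortcut (the loop above works regardless):

# Hypothetical shortcut, assuming an earthaccess version that supports granule_name
# river_results = earthaccess.search_data(
#     short_name='SWOT_SIMULATED_NA_CONTINENT_L2_HR_RIVERSP_V1',
#     granule_name='*Reach*',
#     temporal=('2022-08-22 19:24:41', '2022-08-22 19:30:37'))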
The native format for this data is a .zip file, and we want the .shp file within it, so we will create a Fiona AWS session using the credentials from the s3fs session set up above to read the shapefiles directly out of the .zip files. The alternative would be to download the data to the cloud environment and extract the .zip file there; a sketch of that route follows the plot below.
fiona_session = fiona.session.AWSSession(
    aws_access_key_id=fs_s3.storage_options["key"],
    aws_secret_access_key=fs_s3.storage_options["secret"],
    aws_session_token=fs_s3.storage_options["token"]
)
# We use the zip+ prefix so fiona knows that we are operating on a zip file
river_shp_url = f"zip+{river_data_urls[0]}"

with fiona.Env(session=fiona_session):
    SWOT_HR_shp1 = gpd.read_file(river_shp_url)
#view the attribute table
SWOT_HR_shp1
fig, ax = plt.subplots(figsize=(11,7))
SWOT_HR_shp1.plot(ax=ax, color='black')
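As noted above, the alternative to streaming the zipped shapefile is to download and extract it in the cloud environment first. A minimal sketch of that route (the ./data folder is a hypothetical scratch location):

import zipfile

# Download a granule to local scratch space, extract it, and read the Reach shapefile.
# Note: the collection mixes Reach and Node granules; filter results as above if needed.
files = earthaccess.download([river_results[0]], local_path='./data')
with zipfile.ZipFile(files[0]) as zf:
    zf.extractall('./data')
SWOT_HR_shp1_alt = gpd.read_file(glob.glob('./data/*Reach*.shp')[0])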
2. Lake Vector Shapefiles
The lake vector shapefiles can be accessed in the same way as the river shapefiles above.
lake_results = earthaccess.search_data(short_name = 'SWOT_SIMULATED_NA_CONTINENT_L2_HR_LAKESP_V1', temporal = ('2022-08-22 19:24:18', '2022-08-22 19:30:50'))
#find the s3 link of the desired granule (This collection has three options: Obs, Unassigned, and Prior - we want Obs)
lake_data_urls = []
for g in lake_results:
    for l in earthaccess.results.DataGranule.data_links(g, access='direct'):
        if "Obs" in l:
            lake_data_urls.append(l)
print(lake_data_urls[0])
As with the river data above, the granule is a zipped shapefile, so we again create a Fiona AWS session to read the .shp file directly from the .zip.
fiona_session = fiona.session.AWSSession(
    aws_access_key_id=fs_s3.storage_options["key"],
    aws_secret_access_key=fs_s3.storage_options["secret"],
    aws_session_token=fs_s3.storage_options["token"]
)
# We use the zip+ prefix so fiona knows that we are operating on a zip file
lake_shp_url = f"zip+{lake_data_urls[0]}"

with fiona.Env(session=fiona_session):
    SWOT_HR_shp2 = gpd.read_file(lake_shp_url)
#view the attribute table
SWOT_HR_shp2
fig, ax = plt.subplots(figsize=(7,12))
SWOT_HR_shp2.plot(ax=ax, color='black')
3. Water Mask Pixel Cloud NetCDF
Accessing the remaining files is different from the shp files above. We do not need to unzip these files because they are stored as native netCDF files in the cloud. For the rest of the products, we will open them via xarray.
watermask_results = earthaccess.search_data(short_name = 'SWOT_SIMULATED_NA_CONTINENT_L2_HR_PIXC_V1', temporal = ('2022-08-22 19:29:00', '2022-08-22 19:29:11'), point = ('-90', '35'))
The pixel cloud netCDF files contain three groups: "pixel_cloud", "tvp", and "noise". In order to access the coordinates and variables within the file, a group must be specified when calling xarray's open_dataset.
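To check which groups a granule actually contains before committing to one, the file handle can be inspected directly. A small sketch using the h5netcdf library imported earlier:

# Open the granule as a file-like object and list its top-level netCDF groups
with h5netcdf.File(earthaccess.open([watermask_results[0]])[0], 'r') as f:
    print(list(f.groups))  # should include pixel_cloud, tvp, and noise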
ds_PIXC = xr.open_mfdataset(earthaccess.open([watermask_results[0]]), group = 'pixel_cloud', engine='h5netcdf')
ds_PIXC
plt.scatter(x=ds_PIXC.longitude, y=ds_PIXC.latitude, c=ds_PIXC.height)
plt.colorbar().set_label('Height (m)')
4. Water Mask Pixel Cloud Vector Attribute NetCDF
vector_results = earthaccess.search_data(short_name = 'SWOT_SIMULATED_NA_CONTINENT_L2_HR_PIXCVEC_V1', temporal = ('2022-08-22 19:29:00', '2022-08-22 19:29:11'), point = ('-90', '35'))
ds_PIXCVEC = xr.open_mfdataset(earthaccess.open([vector_results[0]]), decode_cf=False, engine='h5netcdf')
ds_PIXCVEC
pixcvec_htvals = ds_PIXCVEC.height_vectorproc.compute()
pixcvec_latvals = ds_PIXCVEC.latitude_vectorproc.compute()
pixcvec_lonvals = ds_PIXCVEC.longitude_vectorproc.compute()
#Before plotting, we set all fill values to nan so that the graph shows up better spatially
pixcvec_htvals[pixcvec_htvals > 15000] = np.nan
pixcvec_latvals[pixcvec_latvals > 80] = np.nan
pixcvec_lonvals[pixcvec_lonvals > 180] = np.nan
plt.scatter(x=pixcvec_lonvals, y=pixcvec_latvals, c=pixcvec_htvals)
plt.colorbar().set_label('Height (m)')
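These manual masks are only needed because the dataset was opened with decode_cf=False. A sketch of the alternative, assuming the simulated files carry standard CF _FillValue attributes: letting xarray decode CF metadata on open masks fill values to NaN automatically.

# decode_cf defaults to True, so fill values are replaced with NaN on read
ds_PIXCVEC_cf = xr.open_mfdataset(earthaccess.open([vector_results[0]]), engine='h5netcdf')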
5. Raster NetCDF
raster_results = earthaccess.search_data(short_name = 'SWOT_SIMULATED_NA_CONTINENT_L2_HR_RASTER_V1', temporal = ('2022-08-22 19:28:50', '2022-08-22 19:29:11'), point = ('-90', '35'))
#this collection has 100m and 250m granules, but we only want 100m
raster_data = []
for g in raster_results:
    for l in earthaccess.results.DataGranule.data_links(g, access='direct'):
        if "100m" in l:
            raster_data.append(l)
print(raster_data)
ds_raster = xr.open_mfdataset(earthaccess.open([raster_data[0]], provider = 'POCLOUD'), engine='h5netcdf')
ds_raster
It’s easy to analyze and plot the data with packages such as hvplot!

ds_raster.wse.hvplot.image(y='y', x='x')
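hvplot renders an interactive image; for a quick static figure, xarray's built-in matplotlib plotting works too:

# Static alternative using xarray's matplotlib wrapper
ds_raster.wse.plot()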