Merged
2 changes: 1 addition & 1 deletion README.md
@@ -2,7 +2,7 @@

This repository contains code for running SAR processing pipelines on the NCI and AWS. Currently, this codebase supports two pipelines for generating Sentinel-1 Normalised Radar Backscatter (NRB).

-* [isce3_rtc (Sentinel-1 IW) that can be run locally and on AWS](docs/pipelines/aws_isce3_rtc.md)
+* [isce3_rtc (Sentinel-1 IW) that can be run locally and on AWS](docs/pipelines/isce3_rtc.md)
* [pyroSAR-GAMMA (Sentinel-1 IW/EW) that can be run on the NCI](docs/pipelines/pyrosar_gamma.md)

For more information see [Pipelines](docs/pipelines/README.md) or the specific workflow docs for usage examples and running tests.
4 changes: 2 additions & 2 deletions docs/pipelines/isce3_rtc.md
@@ -122,7 +122,7 @@ At runtime, the script [run_isce3_rtc_pipeline.sh](../../scripts/run_isce3_rtc_p
--make_existing_products=false
--skip_upload_to_s3=false
--scene_data_source=("AUS_COP_HUB" "ASF" "CDSE") # order of preference
--orbit_data_source=("ASF" "CDSE") # order of preference
--orbit_data_source=("AUS_COP_HUB" "ASF" "CDSE") # order of preference
--skip_validate_stac=false
# Required inputs for linking RTC_S1_STATIC to RTC_S1
# Assumes that a RTC_S1_STATIC products exist for all RTC_S1 bursts being processed
@@ -147,7 +147,7 @@ At runtime, the script [run_isce3_rtc_pipeline.sh](../../scripts/run_isce3_rtc_p
- **WARNING** - Passing this flag will create duplicate files and overwrite existing metadata, which may affect downstream workflows.
- `skip_upload_to_s3` -> Make the products, but skip uploading them to AWS S3.
- `scene_data_source` -> Where to download the scene SLC file. Can be single string or a list of preferences separated by a space. Supported values are any of `AUS_COP_HUB`, `ASF` or `CDSE`. The default is (`AUS_COP_HUB` `ASF` `CDSE`).
-- `orbit_data_source` -> Where to download the orbit files. Can be single string or a list of preferences separated by a space. Can be any of `ASF` or `CDSE`. The default is (`ASF` `CDSE`).
+- `orbit_data_source` -> Where to download the orbit files. Can be single string or a list of preferences separated by a space. Can be any of `AUS_COP_HUB`, `ASF` or `CDSE`. The default is (`AUS_COP_HUB` `ASF` `CDSE`).
- `skip_validate_stac` -> To skip validation of the created STAC doc within the code. If this is not set and the stac is invalid, products will not be uploaded. By default we want to validate the stac.
- `link_static_layers` -> Flag to link RTC_S1_STATIC to RTC_S1
- `linked_static_layers_s3_bucket` -> bucket where RTC_S1_STATIC stored
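The `scene_data_source` / `orbit_data_source` preference lists described above amount to a first-success fallback over the listed providers. The sketch below is illustrative only — `download_with_preferences` and the fetcher callables are hypothetical stand-ins, not the pipeline's actual download API:

```python
# Illustrative sketch of preference-list fallback: try each source in
# order and return the first successful result.

def download_with_preferences(sources, fetchers):
    """Try each source in order; return (source, result) for the first success."""
    errors = {}
    for source in sources:
        try:
            return source, fetchers[source]()
        except Exception as exc:  # a real implementation would catch narrower errors
            errors[source] = exc
    raise RuntimeError(f"all sources failed: {errors}")

def _aus_cop_hub():
    raise IOError("AUS_COP_HUB unavailable in this sketch")

fetchers = {
    "AUS_COP_HUB": _aus_cop_hub,
    "ASF": lambda: "S1A_scene.SAFE",
    "CDSE": lambda: "S1A_scene.SAFE",
}

# AUS_COP_HUB fails here, so the fallback lands on ASF.
source, path = download_with_preferences(["AUS_COP_HUB", "ASF", "CDSE"], fetchers)
print(source, path)  # ASF S1A_scene.SAFE
```

The key design point is that later sources are only contacted when every earlier source has raised, so the list order is the preference order.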
4 changes: 2 additions & 2 deletions pixi.lock

Some generated files are not rendered by default.

4 changes: 2 additions & 2 deletions pyproject.toml
@@ -96,9 +96,9 @@ test-isce3-rtc = "pytest tests/sar_pipeline/isce3_rtc -o log_cli=true --capture=
test-isce3-rtc-full-docker-run = "pytest tests/sar_pipeline/isce3_rtc/test_full_docker_build_and_run.py -o log_cli=true --capture=tee-sys --log-cli-level=INFO -v -s"
test-isce3-rtc-cli-make-rtc-opera-stac-and-upload-bursts = "pytest tests/sar_pipeline/isce3_rtc/test_cli_make_rtc_opera_stac_and_upload_bursts.py -o log_cli=true --capture=tee-sys --log-cli-level=INFO -v -s"
# test queries from all providers (ASF, CDSE, AUS_COP_HUB). pygssearch conda environment is required for the AUS_COP_HUB test
-test-scene-data-source-queries="pixi run install-pygssearch-env && pytest tests/sar_pipeline/test_scenes.py -o log_cli=true --capture=tee-sys --log-cli-level=INFO -v -s"
+test-scene-data-source-queries="pixi run install-pygssearch-env && export PYGSSEARCH_CONDA_ENV='$(conda info --base)/envs/pygssearch-env' && pytest tests/sar_pipeline/test_scenes.py -o log_cli=true --capture=tee-sys --log-cli-level=INFO -v -s"
# test downloads from all providers. pygssearch conda environment is required for the AUS_COP_HUB test
-test-isce3-rtc-downloads= "pixi run install-pygssearch-env && pytest tests/sar_pipeline/isce3_rtc/test_downloads.py -o log_cli=true --capture=tee-sys --log-cli-level=INFO -v -s"
+test-isce3-rtc-downloads= "pixi run install-pygssearch-env && export PYGSSEARCH_CONDA_ENV='$(conda info --base)/envs/pygssearch-env' && pytest tests/sar_pipeline/isce3_rtc/test_downloads.py -o log_cli=true --capture=tee-sys --log-cli-level=INFO -v -s"
# nci specific tests that should be run locally on the nci
test-nci-filesystem = "pytest tests/filesystem"
lint = "black sar_pipeline"
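One detail worth noting in the `PYGSSEARCH_CONDA_ENV` export above: in a POSIX shell, whether `$(conda info --base)` runs at assignment time depends on the quoting around it (pixi's own task shell may handle this differently). A conda-free sketch of both behaviours, with `/opt/conda` standing in for the real base path:

```python
# Demonstrates when command substitution runs under single vs double
# quotes in a POSIX shell; requires an `sh` on PATH.
import subprocess

def sh(cmd: str) -> str:
    return subprocess.run(["sh", "-c", cmd], capture_output=True, text=True).stdout

# Single quotes: the substitution text is stored literally, unexpanded.
literal = sh("export P='$(echo /opt/conda)/envs/pygssearch-env'; printf %s \"$P\"")
# Double quotes: the substitution runs when the variable is assigned.
expanded = sh('export P="$(echo /opt/conda)/envs/pygssearch-env"; printf %s "$P"')

print(literal)   # $(echo /opt/conda)/envs/pygssearch-env
print(expanded)  # /opt/conda/envs/pygssearch-env
```

If the consumer of the variable expands the stored text itself, the single-quoted form works; otherwise the double-quoted form gives a concrete path immediately.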
27 changes: 16 additions & 11 deletions sar_pipeline/pipelines/isce3_rtc/cli.py
@@ -15,7 +15,7 @@
     NonSingleSceneResultError,
 )
 from sar_pipeline.preparation.downloads.orbits import (
-    download_orbits,
+    download_orbits_from_preference_list,
     VALID_ORBIT_DATA_SOURCES,
 )
 from sar_pipeline.pipelines.isce3_rtc.utils.burst_utils import (
@@ -347,6 +347,18 @@ def get_data_for_scene_and_make_run_config(
     all_scene_burst_info = get_burst_info_for_scene_from_cdse(scene)
     logger.info(f"{len(all_scene_burst_info)} burst ids found for scene from CDSE API")
 
+    # write the geometries to a geojson. Useful for debugging if needed
+    if save_burst_geometries:
+        _burst_id_list = all_scene_burst_info.keys()
+        _burst_geoms_list = [
+            all_scene_burst_info[b]["geometry"] for b in _burst_id_list
+        ]
+        write_burst_geometries_to_geojson(
+            _burst_id_list,
+            _burst_geoms_list,
+            out_folder / f"{scene}_burst_geoms.json",
+        )
+
     # Limit the bursts to be processed if a list has been provided
     if burst_id_list:
         logger.info(f"List of bursts to process provided")
@@ -420,24 +432,17 @@ def get_data_for_scene_and_make_run_config(
 
     # # download the orbits
     logger.info(f"Downloading Orbits for scene : {scene}")
-    ORBIT_PATH = download_orbits(
+    ORBIT_PATH = download_orbits_from_preference_list(
         scene_safe_file=scene + ".SAFE",
-        save_dir=orbit_folder,
-        source=orbit_data_sources,
+        download_folder=orbit_folder,
+        orbit_data_source_preferences=orbit_data_sources,
     )
     logger.info(f"File downloaded to : {ORBIT_PATH}")
 
     # get the shape of the area covering the bursts to be processed
     burst_geoms_to_process = [
         all_scene_burst_info[id_]["geometry"] for id_ in burst_id_list_to_process
     ]
-    # write the geometries to a geojson. Useful for debugging if needed
-    if save_burst_geometries:
-        write_burst_geometries_to_geojson(
-            burst_id_list_to_process,
-            burst_geoms_to_process,
-            out_folder / f"{scene}_burst_geoms.json",
-        )
 
     # download the DEM
     dem_folder = download_folder / "dem" / dem_type
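The debug GeoJSON produced by `write_burst_geometries_to_geojson` (now written for all bursts of the scene, before the burst list is filtered) can be sketched roughly as below. The FeatureCollection layout, property names, and the toy burst id are assumptions for illustration, not the helper's actual implementation:

```python
import json
import tempfile
from pathlib import Path

def write_burst_geoms_sketch(burst_ids, geometries, out_path: Path) -> None:
    """Write one GeoJSON Feature per burst id (layout is an assumption)."""
    features = [
        {"type": "Feature", "properties": {"burst_id": bid}, "geometry": geom}
        for bid, geom in zip(burst_ids, geometries)
    ]
    out_path.write_text(json.dumps({"type": "FeatureCollection", "features": features}))

# Toy polygon standing in for a real burst footprint.
geom = {"type": "Polygon", "coordinates": [[[0, 0], [1, 0], [1, 1], [0, 0]]]}
out = Path(tempfile.mkdtemp()) / "scene_burst_geoms.json"
write_burst_geoms_sketch(["t070_149815_iw2"], [geom], out)

doc = json.loads(out.read_text())
print(len(doc["features"]), doc["features"][0]["properties"]["burst_id"])
```

A file in this shape loads directly in common GIS tools, which is what makes it handy for the debugging use the comment describes.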