diff --git a/CHANGELOG.md b/CHANGELOG.md
index f838cc64a..dfedc955f 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -40,13 +40,12 @@ Attention: The newest changes should be on top -->
### Fixed
--
+- BUG: Migrate Forecasts to UCAR THREDDS [#943](https://github.com/RocketPy-Team/RocketPy/pull/943)
## [v1.12.0] - 2026-03-08
### Added
-
- ENH: Air brakes controller functions now support 8-parameter signature [#854](https://github.com/RocketPy-Team/RocketPy/pull/854)
- TST: Add acceptance tests for 3DOF flight simulation based on Bella Lui rocket [#914](https://github.com/RocketPy-Team/RocketPy/pull/914)
- ENH: Add background map auto download functionality to Monte Carlo plots [#896](https://github.com/RocketPy-Team/RocketPy/pull/896)
diff --git a/docs/user/environment/1-atm-models/ensemble.rst b/docs/user/environment/1-atm-models/ensemble.rst
index 97c247f68..504cbfe60 100644
--- a/docs/user/environment/1-atm-models/ensemble.rst
+++ b/docs/user/environment/1-atm-models/ensemble.rst
@@ -1,3 +1,5 @@
+.. _ensemble_atmosphere:
+
Ensemble
========
@@ -21,7 +23,21 @@ Ensemble Forecast
Global Ensemble Forecast System (GEFS)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-The ``GEFS`` model is a global ensemble forecast model ...
+.. danger::
+
+ **GEFS shortcut unavailable**: ``file="GEFS"`` is currently disabled in
+ RocketPy because NOMADS OPeNDAP is deactivated for this endpoint.
+
+.. note::
+
+ If you have a GEFS-compatible NetCDF or OPeNDAP dataset from another
+ provider (or a local copy), you can still load it explicitly by passing the
+ dataset path/URL in ``file`` and a compatible mapping in ``dictionary``.
+
+
+The ``GEFS`` model is a global ensemble forecast system useful for uncertainty
+analysis, but RocketPy's automatic ``file="GEFS"`` shortcut is temporarily
+disabled.
.. code-block:: python
@@ -71,20 +87,16 @@ CMC Ensemble
resulted in a change of the model's endpoint. Efforts are underway to \
restore access to the CMC Ensemble model as swiftly as possible.
-.. code-block:: python
+At the moment, there is no built-in ``file="CMC"`` shortcut in
+``Environment.set_atmospheric_model``.
- env_cmc = Environment(
- date=date_info,
- latitude=-21.960641,
- longitude=-47.482122,
- elevation=640,
- )
- env_cmc.set_atmospheric_model(type="Ensemble", file="CMC")
- env_cmc.all_info()
+If you have a CMC-compatible NetCDF or OPeNDAP dataset, load it explicitly by
+passing the dataset path/URL in ``file`` and a matching mapping dictionary in
+``dictionary``.
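As a sketch of what such a mapping looks like, the dictionary below extends the canonical mapping used elsewhere in this guide with the ensemble member key. The variable names (``"ens"``, ``"tmpprs"``, and so on) follow NOMADS-style naming and are only illustrative; check your dataset's variable list and adjust them accordingly.

```python
# Sketch: a mapping for an ensemble dataset loaded explicitly.
# Variable names ("ens", "tmpprs", ...) are illustrative NOMADS-style
# names; read the actual names from your dataset and adjust.
cmc_like_mapping = {
    "time": "time",
    "latitude": "lat",
    "longitude": "lon",
    "level": "lev",
    "ensemble": "ens",  # required for ensemble member selection
    "temperature": "tmpprs",
    "surface_geopotential_height": "hgtsfc",  # optional
    "geopotential_height": "hgtprs",
    "geopotential": None,
    "u_wind": "ugrdprs",
    "v_wind": "vgrdprs",
}

# Hypothetical usage with an explicit dataset path or OPeNDAP URL:
# env_cmc.set_atmospheric_model(
#     type="Ensemble",
#     file="path/to/cmc_ensemble.nc",
#     dictionary=cmc_like_mapping,
# )
```

The ``set_atmospheric_model`` call is shown commented out because it requires a live, compatible dataset; the mapping itself is plain data you can adapt freely.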
Ensemble Reanalysis
-------------------
Ensemble reanalyses are also possible with RocketPy. See the
-:ref:`reanalysis_ensemble` section for more information.
+:ref:`reanalysis_ensemble` section for more information.
\ No newline at end of file
diff --git a/docs/user/environment/1-atm-models/forecast.rst b/docs/user/environment/1-atm-models/forecast.rst
index c88c71ff2..ac91504e0 100644
--- a/docs/user/environment/1-atm-models/forecast.rst
+++ b/docs/user/environment/1-atm-models/forecast.rst
@@ -24,7 +24,7 @@ Global Forecast System (GFS)
Using the latest forecast from GFS is simple.
Set the atmospheric model to ``forecast`` and specify that GFS is the file you want.
-Note that since data is downloaded from the NOMADS server, this line of code can
+Note that since data is downloaded from a remote OPeNDAP server, this line of code can
take longer than usual.
.. jupyter-execute::
@@ -111,36 +111,15 @@ The same coordinates for SpacePort America will be used.
High Resolution Window (HIRESW)
-------------------------------
-The High Resolution Window (HIRESW) model is a sophisticated weather forecasting
-system that operates at a high spatial resolution of approximately 3 km.
-It utilizes two main dynamical cores: the Advanced Research WRF (WRF-ARW) and
-the Finite Volume Cubed Sphere (FV3), each designed to enhance the accuracy of
-weather predictions.
+.. danger::
-You can easily set up HIRESW in RocketPy by specifying the date, latitude, and
-longitude of your location. Let's use SpacePort America as an example.
+ **HIRESW shortcut unavailable**: ``file="HIRESW"`` is currently disabled in
+ RocketPy because NOMADS OPeNDAP is deactivated for this endpoint.
-.. jupyter-execute::
-
- env_hiresw = Environment(
- date=tomorrow,
- latitude=32.988528,
- longitude=-106.975056,
- )
+If you have a HIRESW-compatible dataset from another provider (or a local copy),
+you can still load it explicitly by passing the path/URL in ``file`` and an
+appropriate mapping in ``dictionary``.
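As a sketch, a mapping for a projected-grid dataset might look like the one below. Every variable name here is hypothetical (loosely modeled on THREDDS-style naming): read the real names from your dataset's variable list before using it.

```python
# Sketch of a mapping for a projected-grid (Lambert conformal) dataset.
# All variable names are hypothetical; adjust them to match the
# variable list of the dataset you are actually loading.
hiresw_like_mapping = {
    "time": "time",
    "latitude": "y",   # projected y coordinate
    "longitude": "x",  # projected x coordinate
    "level": "isobaric",
    "temperature": "Temperature_isobaric",
    "geopotential_height": "Geopotential_height_isobaric",
    "geopotential": None,
    "u_wind": "u-component_of_wind_isobaric",
    "v_wind": "v-component_of_wind_isobaric",
    "projection": "LambertConformal_Projection",  # grid projection variable
}

# Hypothetical usage with an explicit dataset path or URL:
# env_hiresw.set_atmospheric_model(
#     type="Forecast",
#     file="path/to/hiresw.nc",
#     dictionary=hiresw_like_mapping,
# )
```

Note the ``"projection"`` key: it marks the grid as projected, so RocketPy can convert your launch coordinates into the dataset's x/y space.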
- env_hiresw.set_atmospheric_model(
- type="Forecast",
- file="HIRESW",
- dictionary="HIRESW",
- )
-
- env_hiresw.plots.atmospheric_model()
-
-.. note::
-
- The HRES model is updated every 12 hours, providing forecasts with a \
- resolution of 3 km. The model can predict weather conditions up to 48 hours \
- in advance. RocketPy uses the CONUS domain with ARW core.
Using Windy Atmosphere
@@ -248,6 +227,5 @@ Also, the servers may be down or may face high traffic.
.. seealso::
- To see a complete list of available models on the NOAA's NOMADS server, visit
- `NOMADS `_.
-
+ To browse available NCEP model collections on UCAR THREDDS, visit
+ `THREDDS NCEP Catalog `_.
\ No newline at end of file
diff --git a/docs/user/environment/1-atm-models/soundings.rst b/docs/user/environment/1-atm-models/soundings.rst
index 9a276477e..279750df5 100644
--- a/docs/user/environment/1-atm-models/soundings.rst
+++ b/docs/user/environment/1-atm-models/soundings.rst
@@ -57,31 +57,22 @@ This service allows users to download virtual soundings from numerical weather
prediction models such as GFS, RAP, and NAM, and also real soundings from the
Integrated Global Radiosonde Archive (IGRA).
-These options can be retrieved as a text file in GSD format.
-By generating such a file through the link above, the file's URL can be used to
-import the atmospheric data into RocketPy.
-
-We will use the same sounding station as we did for the Wyoming Soundings.
+These options can be retrieved as a text file in GSD format. However,
+RocketPy no longer provides a dedicated ``set_atmospheric_model`` type for
+NOAA RUC Soundings.
.. note::
Select ROABs as the initial data source, specify the station through its \
WMO-ID, and opt for the ASCII (GSD format) button.
-Initialize a new Environment instance:
-
-.. code-block:: python
+If you need to use RUC-sounding-like data in RocketPy, convert it to one of the
+supported workflows:
- url = r"https://rucsoundings.noaa.gov/get_raobs.cgi?data_source=RAOB&latest=latest&start_year=2019&start_month_name=Feb&start_mday=5&start_hour=12&start_min=0&n_hrs=1.0&fcst_len=shortest&airport=83779&text=Ascii%20text%20%28GSD%20format%29&hydrometeors=false&start=latest"
-
- env = Environment()
- env.set_atmospheric_model(type="NOAARucSounding", file=url)
- env.plots.atmospheric_model()
+- Use :ref:`custom_atmosphere` after parsing the text data.
+- Use :ref:`reanalysis` or :ref:`forecast` with NetCDF/OPeNDAP sources.
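The first option can be sketched as follows. The column layout below is deliberately simplified (real GSD files carry more columns, header lines, and scaled units), so treat this as a parsing template rather than a GSD reader: it converts pressure, height, temperature, and wind direction/speed rows into the profile arrays a custom atmosphere accepts.

```python
import math

# Sketch: convert sounding-style text into profile arrays for a custom
# atmosphere. The columns (pressure hPa, height m, temperature degC,
# wind direction deg, wind speed m/s) are simplified for illustration;
# real GSD files have more columns and header lines to skip.
raw = """\
1000  111  25.0  180  5.0
 850 1457  15.2  200  8.0
 700 3012   5.1  220 12.0
"""

pressure_profile = []     # [(height_m, pressure_Pa), ...]
temperature_profile = []  # [(height_m, temperature_K), ...]
wind_u_profile = []       # [(height_m, u_m_s), ...]
wind_v_profile = []       # [(height_m, v_m_s), ...]

for line in raw.splitlines():
    p_hpa, h_m, t_c, wdir, wspd = (float(x) for x in line.split())
    # Meteorological convention: direction is where the wind comes FROM.
    u = -wspd * math.sin(math.radians(wdir))
    v = -wspd * math.cos(math.radians(wdir))
    pressure_profile.append((h_m, p_hpa * 100.0))    # hPa -> Pa
    temperature_profile.append((h_m, t_c + 273.15))  # degC -> K
    wind_u_profile.append((h_m, u))
    wind_v_profile.append((h_m, v))

# These arrays can then be passed to a custom atmosphere, e.g.:
# env.set_atmospheric_model(type="custom_atmosphere",
#     pressure=pressure_profile, temperature=temperature_profile,
#     wind_u=wind_u_profile, wind_v=wind_v_profile)
```

The commented call at the end shows where the parsed arrays plug into the custom-atmosphere workflow described above.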
.. note::
The leading `r` in the URL string is used to indicate a raw string, which \
- is useful when dealing with backslashes in URLs.
-
-
-
+ is useful when dealing with backslashes in URLs.
\ No newline at end of file
diff --git a/docs/user/environment/1-atm-models/standard_atmosphere.rst b/docs/user/environment/1-atm-models/standard_atmosphere.rst
index 0c125dfd8..d6c1de782 100644
--- a/docs/user/environment/1-atm-models/standard_atmosphere.rst
+++ b/docs/user/environment/1-atm-models/standard_atmosphere.rst
@@ -1,3 +1,5 @@
+.. _standard_atmosphere:
+
Standard Atmosphere
===================
@@ -29,4 +31,4 @@ The International Standard Atmosphere can also be reset at any time by using the
.. jupyter-execute::
- env.set_atmospheric_model(type="standard_atmosphere")
+ env.set_atmospheric_model(type="standard_atmosphere")
\ No newline at end of file
diff --git a/docs/user/environment/3-further/other_apis.rst b/docs/user/environment/3-further/other_apis.rst
index c70fd58f7..01d4b9a30 100644
--- a/docs/user/environment/3-further/other_apis.rst
+++ b/docs/user/environment/3-further/other_apis.rst
@@ -1,3 +1,5 @@
+.. _environment_other_apis:
+
Connecting to other APIs
========================
@@ -25,14 +27,19 @@ the following dimensions and variables:
- Latitude
- Longitude
- Pressure Levels
+- Temperature (as a function of Time, Pressure Levels, Latitude and Longitude)
- Geopotential Height (as a function of Time, Pressure Levels, Latitude and Longitude)
+- Geopotential, accepted as an alternative to Geopotential Height (as a function of Time, Pressure Levels, Latitude and Longitude)
- Surface Geopotential Height (as a function of Time, Latitude and Longitude)
+ (optional)
- Wind - U Component (as a function of Time, Pressure Levels, Latitude and Longitude)
- Wind - V Component (as a function of Time, Pressure Levels, Latitude and Longitude)
+Some projected grids also require a ``projection`` key in the mapping.
+
-For example, let's imagine we want to use the HIRESW model from this endpoint:
-`https://nomads.ncep.noaa.gov/dods/hiresw/ `_
+For example, let's imagine we want to use a forecast model available via an
+OPeNDAP endpoint.
-Looking through the variable list in the link above, we find the following correspondence:
+Inspecting the dataset's variable list, we find the following correspondence:
@@ -72,15 +79,85 @@ Therefore, we can create an environment like this:
dictionary=name_mapping,
)
+Built-in mapping dictionaries
+-----------------------------
+
+Instead of a custom dictionary, you can pass a built-in mapping name in the
+``dictionary`` argument. Common options include:
+
+- ``"ECMWF"``
+- ``"ECMWF_v0"``
+- ``"NOAA"``
+- ``"GFS"``
+- ``"NAM"``
+- ``"RAP"``
+- ``"HIRESW"`` (mapping available; latest-model shortcut currently disabled)
+- ``"GEFS"`` (mapping available; latest-model shortcut currently disabled)
+- ``"MERRA2"``
+- ``"CMC"`` (for compatible datasets loaded explicitly)
+
+What a mapping name means
+^^^^^^^^^^^^^^^^^^^^^^^^^
+
+- Base mapping names (for example ``"GFS"``, ``"NAM"`` and ``"RAP"``) map
+ RocketPy weather keys to the current default variable naming used by the
+ corresponding provider datasets.
+- These defaults are aligned with current shortcut workflows (for example,
+ THREDDS-backed latest model sources) and may use projected coordinates
+ (``x``/``y`` plus ``projection``) depending on the model.
+
+Legacy mapping names
+^^^^^^^^^^^^^^^^^^^^
+
+If you are loading archived or older NOMADS-style datasets, use the explicit
+legacy aliases:
+
+- ``"GFS_LEGACY"``
+- ``"NAM_LEGACY"``
+- ``"NOAA_LEGACY"``
+- ``"RAP_LEGACY"``
+- ``"CMC_LEGACY"``
+- ``"GEFS_LEGACY"``
+- ``"HIRESW_LEGACY"``
+- ``"MERRA2_LEGACY"``
+
+Legacy aliases primarily cover older variable naming patterns such as
+``lev``, ``tmpprs``, ``hgtprs``, ``ugrdprs`` and ``vgrdprs``.
+
+.. note::
+
+ Mapping names are case-insensitive. For example,
+ ``"gfs_legacy"`` and ``"GFS_LEGACY"`` are equivalent.
+
+For custom dictionaries, the canonical structure is:
+
+.. code-block:: python
+
+ mapping = {
+ "time": "time",
+ "latitude": "lat",
+ "longitude": "lon",
+ "level": "lev",
+ "temperature": "tmpprs",
+ "surface_geopotential_height": "hgtsfc", # optional
+ "geopotential_height": "hgtprs", # or geopotential
+ "geopotential": None,
+ "u_wind": "ugrdprs",
+ "v_wind": "vgrdprs",
+ }
+
+.. important::
+
+   Ensemble datasets require an additional key for member selection:
+   ``"ensemble"``, mapped to the dataset's ensemble dimension name (for
+   example ``"ens"``).
+
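The compatibility rules above can be condensed into a small checker. This is a simplified, standalone sketch of the validation the ``Environment`` class performs internally (the real implementation works on ``netCDF4`` variables; here ``variables`` is just a set of names):

```python
# Required keys: each must map to a variable that exists in the dataset.
REQUIRED_KEYS = ("time", "latitude", "longitude", "level",
                 "temperature", "u_wind", "v_wind")

def mapping_matches_variables(mapping, variables):
    """Return True if `mapping` can read a dataset exposing `variables`.

    Simplified sketch: every required key must map to an existing
    variable, a declared projection variable must exist, and at least one
    of geopotential height / geopotential must be present.
    """
    for key in REQUIRED_KEYS:
        name = mapping.get(key)
        if name is None or name not in variables:
            return False
    projection = mapping.get("projection")
    if projection is not None and projection not in variables:
        return False
    height = mapping.get("geopotential_height")
    geopot = mapping.get("geopotential")
    return (height in variables) or (geopot in variables)

# Example: a NOMADS-style mapping against a matching variable set.
dataset_variables = {"time", "lat", "lon", "lev",
                     "tmpprs", "hgtprs", "ugrdprs", "vgrdprs"}
mapping = {
    "time": "time", "latitude": "lat", "longitude": "lon", "level": "lev",
    "temperature": "tmpprs", "geopotential_height": "hgtprs",
    "geopotential": None, "u_wind": "ugrdprs", "v_wind": "vgrdprs",
}
```

Running the checker against your own dataset's variable names is a quick way to catch mapping mistakes before a long download starts.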
.. caution::
- Notice the ``file`` argument were suppressed in the code above. This is because \
- the URL depends on the date you are running the simulation. For example, as \
- it for now, a possible link could be: https://nomads.ncep.noaa.gov/dods/hiresw/hiresw20240803/hiresw_conusfv3_12z \
- (for the 3rd of August, 2024, at 12:00 UTC). \
- You should replace the date in the URL with the date you are running the simulation. \
- Different models may have different URL structures, so be sure to check the \
- documentation of the model you are using.
+ The ``file`` argument was intentionally omitted in the example above. This is
+ because the URL depends on the provider, dataset, and date you are running
+ the simulation. Build the endpoint according to the provider specification
+ and always validate that the target service is active before running your
+ simulation workflow.
Without OPeNDAP protocol
@@ -94,4 +171,3 @@ Environment class, for example:
- `Meteomatics `_: `#545 `_
- `Open-Meteo `_: `#520 `_
-
diff --git a/rocketpy/environment/environment.py b/rocketpy/environment/environment.py
index 6743b06ae..39441ecae 100644
--- a/rocketpy/environment/environment.py
+++ b/rocketpy/environment/environment.py
@@ -27,6 +27,7 @@
find_latitude_index,
find_longitude_index,
find_time_index,
+ geodesic_to_lambert_conformal,
geodesic_to_utm,
get_elevation_data_from_dataset,
get_final_date_from_time_array,
@@ -138,15 +139,15 @@ class Environment:
Environment.atmospheric_model_type : string
Describes the atmospheric model which is being used. Can only assume the
following values: ``standard_atmosphere``, ``custom_atmosphere``,
- ``wyoming_sounding``, ``Forecast``, ``Reanalysis``,
- ``Ensemble``.
+ ``wyoming_sounding``, ``windy``, ``forecast``, ``reanalysis``,
+ ``ensemble``.
Environment.atmospheric_model_file : string
Address of the file used for the atmospheric model being used. Only
- defined for ``wyoming_sounding``, ``Forecast``,
- ``Reanalysis``, ``Ensemble``
+ defined for ``wyoming_sounding``, ``windy``, ``forecast``,
+ ``reanalysis``, ``ensemble``
Environment.atmospheric_model_dict : dictionary
Dictionary used to properly interpret ``netCDF`` and ``OPeNDAP`` files.
- Only defined for ``Forecast``, ``Reanalysis``, ``Ensemble``.
+ Only defined for ``forecast``, ``reanalysis``, ``ensemble``.
Environment.atmospheric_model_init_date : datetime
Datetime object instance of first available date in ``netCDF``
and ``OPeNDAP`` files when using ``Forecast``, ``Reanalysis`` or
@@ -295,21 +296,21 @@ def __init__(
- :attr:`Environment.datetime_date`: UTC time of launch.
- Must be given if a Forecast, Reanalysis
- or Ensemble, will be set as an atmospheric model.
+ Must be given if a ``windy``, ``forecast``, ``reanalysis``
+ or ``ensemble`` atmospheric model will be used.
Default is None.
See :meth:`Environment.set_date` for more information.
latitude : float, optional
Latitude in degrees (ranging from -90 to 90) of rocket
- launch location. Must be given if a Forecast, Reanalysis
- or Ensemble will be used as an atmospheric model or if
+ launch location. Must be given if a ``windy``, ``forecast``,
+ ``reanalysis`` or ``ensemble`` atmospheric model will be used or if
Open-Elevation will be used to compute elevation. Positive
values correspond to the North. Default value is 0, which
corresponds to the equator.
longitude : float, optional
Longitude in degrees (ranging from -180 to 180) of rocket
- launch location. Must be given if a Forecast, Reanalysis
- or Ensemble will be used as an atmospheric model or if
+ launch location. Must be given if a ``windy``, ``forecast``,
+ ``reanalysis`` or ``ensemble`` atmospheric model will be used or if
Open-Elevation will be used to compute elevation. Positive
values correspond to the East. Default value is 0, which
corresponds to the Greenwich Meridian.
@@ -605,13 +606,81 @@ def __set_earth_rotation_vector(self):
# Validators (used to verify an attribute is being set correctly.)
+ @staticmethod
+ def __dictionary_matches_dataset(dictionary, dataset):
+ """Check whether a mapping dictionary is compatible with a dataset."""
+ variables = dataset.variables
+ required_keys = (
+ "time",
+ "latitude",
+ "longitude",
+ "level",
+ "temperature",
+ "u_wind",
+ "v_wind",
+ )
+
+ for key in required_keys:
+ variable_name = dictionary.get(key)
+ if variable_name is None or variable_name not in variables:
+ return False
+
+ projection_name = dictionary.get("projection")
+ if projection_name is not None and projection_name not in variables:
+ return False
+
+ geopotential_height_name = dictionary.get("geopotential_height")
+ geopotential_name = dictionary.get("geopotential")
+ has_geopotential_height = (
+ geopotential_height_name is not None
+ and geopotential_height_name in variables
+ )
+ has_geopotential = (
+ geopotential_name is not None and geopotential_name in variables
+ )
+
+ return has_geopotential_height or has_geopotential
+
+ def __resolve_dictionary_for_dataset(self, dictionary, dataset):
+ """Resolve a compatible mapping dictionary for the loaded dataset.
+
+ If the provided mapping is incompatible with the dataset variables,
+ this method tries built-in mappings and falls back to the first
+ compatible one.
+ """
+ if self.__dictionary_matches_dataset(dictionary, dataset):
+ return dictionary
+
+ for model_name, candidate in self.__weather_model_map.all_dictionaries.items():
+ if self.__dictionary_matches_dataset(candidate, dataset):
+ warnings.warn(
+ "Provided weather mapping does not match dataset variables. "
+ f"Falling back to built-in mapping '{model_name}'."
+ )
+ return candidate
+
+ return dictionary
+
def __validate_dictionary(self, file, dictionary):
# removed CMC until it is fixed.
- available_models = ["GFS", "NAM", "RAP", "HIRESW", "GEFS", "ERA5", "MERRA2"]
+ available_models = [
+ "GFS",
+ "NAM",
+ "RAP",
+ "HIRESW",
+ "GEFS",
+ "ERA5",
+ "MERRA2",
+ ]
if isinstance(dictionary, str):
dictionary = self.__weather_model_map.get(dictionary)
- elif file in available_models:
- dictionary = self.__weather_model_map.get(file)
+ elif isinstance(file, str):
+ matching_model = next(
+ (model for model in available_models if model.lower() == file.lower()),
+ None,
+ )
+ if matching_model is not None:
+ dictionary = self.__weather_model_map.get(matching_model)
if not isinstance(dictionary, dict):
raise TypeError(
"Please specify a dictionary or choose a valid model from the "
@@ -1045,171 +1114,41 @@ def set_atmospheric_model( # pylint: disable=too-many-statements
wind_u=0,
wind_v=0,
):
- """Defines an atmospheric model for the Environment. Supported
- functionality includes using data from the `International Standard
- Atmosphere`, importing data from weather reanalysis, forecasts and
- ensemble forecasts, importing data from upper air soundings and
- inputting data as custom functions, arrays or csv files.
+ """Define the atmospheric model for this Environment.
Parameters
----------
type : string
- One of the following options:
-
- - ``standard_atmosphere``: sets pressure and temperature profiles
- corresponding to the International Standard Atmosphere defined by
- ISO 2533 and ranging from -2 km to 80 km of altitude above sea
- level. Note that the wind profiles are set to zero when this type
- is chosen.
-
- - ``wyoming_sounding``: sets pressure, temperature, wind-u
- and wind-v profiles and surface elevation obtained from
- an upper air sounding given by the file parameter through
- an URL. This URL should point to a data webpage given by
- selecting plot type as text: list, a station and a time at
- `weather.uwyo`_.
- An example of a valid link would be:
-
- http://weather.uwyo.edu/cgi-bin/sounding?region=samer&TYPE=TEXT%3ALIST&YEAR=2019&MONTH=02&FROM=0200&TO=0200&STNM=82599
-
- .. _weather.uwyo: http://weather.uwyo.edu/upperair/sounding.html
-
- - ``windy_atmosphere``: sets pressure, temperature, wind-u and
- wind-v profiles and surface elevation obtained from the Windy API.
- See file argument to specify the model as either ``ECMWF``,
- ``GFS`` or ``ICON``.
-
- - ``Forecast``: sets pressure, temperature, wind-u and wind-v
- profiles and surface elevation obtained from a weather forecast
- file in ``netCDF`` format or from an ``OPeNDAP`` URL, both given
- through the file parameter. When this type is chosen, the date
- and location of the launch should already have been set through
- the date and location parameters when initializing the
- Environment. The ``netCDF`` and ``OPeNDAP`` datasets must contain
- at least geopotential height or geopotential, temperature, wind-u
- and wind-v profiles as a function of pressure levels. If surface
- geopotential or geopotential height is given, elevation is also
- set. Otherwise, elevation is not changed. Profiles are
- interpolated bi-linearly using supplied latitude and longitude.
- The date used is the nearest one to the date supplied.
- Furthermore, a dictionary must be supplied through the dictionary
- parameter in order for the dataset to be accurately read. Lastly,
- the dataset must use a rectangular grid sorted in either ascending
- or descending order of latitude and longitude.
-
- - ``Reanalysis``: sets pressure, temperature, wind-u and wind-v
- profiles and surface elevation obtained from a weather forecast
- file in ``netCDF`` format or from an ``OPeNDAP`` URL, both given
- through the file parameter. When this type is chosen, the date and
- location of the launch should already have been set through the
- date and location parameters when initializing the Environment.
- The ``netCDF`` and ``OPeNDAP`` datasets must contain at least
- geopotential height or geopotential, temperature, wind-u and
- wind-v profiles as a function of pressure levels. If surface
- geopotential or geopotential height is given, elevation is also
- set. Otherwise, elevation is not changed. Profiles are
- interpolated bi-linearly using supplied latitude and longitude.
- The date used is the nearest one to the date supplied.
- Furthermore, a dictionary must be supplied through the dictionary
- parameter in order for the dataset to be accurately read. Lastly,
- the dataset must use a rectangular grid sorted in either ascending
- or descending order of latitude and longitude.
-
- - ``Ensemble``: sets pressure, temperature, wind-u and wind-v
- profiles and surface elevation obtained from a weather forecast
- file in ``netCDF`` format or from an ``OPeNDAP`` URL, both given
- through the file parameter. When this type is chosen, the date and
- location of the launch should already have been set through the
- date and location parameters when initializing the Environment.
- The ``netCDF`` and ``OPeNDAP`` datasets must contain at least
- geopotential height or geopotential, temperature, wind-u and
- wind-v profiles as a function of pressure levels. If surface
- geopotential or geopotential height is given, elevation is also
- set. Otherwise, elevation is not changed. Profiles are
- interpolated bi-linearly using supplied latitude and longitude.
- The date used is the nearest one to the date supplied.
- Furthermore, a dictionary must be supplied through the dictionary
- parameter in order for the dataset to be accurately read. Lastly,
- the dataset must use a rectangular grid sorted in either ascending
- or descending order of latitude and longitude. By default the
- first ensemble forecast is activated.
-
- .. seealso::
-
- To activate other ensemble forecasts see
- :meth:`rocketpy.Environment.select_ensemble_member`.
-
- - ``custom_atmosphere``: sets pressure, temperature, wind-u and
- wind-v profiles given though the pressure, temperature, wind-u and
- wind-v parameters of this method. If pressure or temperature is
- not given, it will default to the `International Standard
- Atmosphere`. If the wind components are not given, it will default
- to 0.
-
- file : string, optional
- String that must be given when type is either ``wyoming_sounding``,
- ``Forecast``, ``Reanalysis``, ``Ensemble`` or ``Windy``. It
- specifies the location of the data given, either through a local
- file address or a URL. If type is ``Forecast``, this parameter can
- also be either ``GFS``, ``FV3``, ``RAP`` or ``NAM`` for latest of
- these forecasts.
-
- .. note::
-
- Time reference for the Forecasts are:
-
- - ``GFS``: `Global` - 0.25deg resolution - Updates every 6
- hours, forecast for 81 points spaced by 3 hours
- - ``RAP``: `Regional USA` - 0.19deg resolution - Updates hourly,
- forecast for 40 points spaced hourly
- - ``NAM``: `Regional CONUS Nest` - 5 km resolution - Updates
- every 6 hours, forecast for 21 points spaced by 3 hours
-
- If type is ``Ensemble``, this parameter can also be ``GEFS``
- for the latest of this ensemble.
-
- .. note::
-
- Time referece for the Ensembles are:
-
- - GEFS: Global, bias-corrected, 0.5deg resolution, 21 forecast
- members, Updates every 6 hours, forecast for 65 points spaced
- by 4 hours
- - CMC (currently not available): Global, 0.5deg resolution, 21 \
- forecast members, Updates every 12 hours, forecast for 65 \
- points spaced by 4 hours
-
- If type is ``Windy``, this parameter can be either ``GFS``,
- ``ECMWF``, ``ICON`` or ``ICONEU``. Default in this case is ``ECMWF``.
- dictionary : dictionary, string, optional
- Dictionary that must be given when type is either ``Forecast``,
- ``Reanalysis`` or ``Ensemble``. It specifies the dictionary to be
- used when reading ``netCDF`` and ``OPeNDAP`` files, allowing the
- correct retrieval of data. Acceptable values include ``ECMWF``,
- ``NOAA``, ``UCAR`` and ``MERRA2`` for default dictionaries which can generally
- be used to read datasets from these institutes. Alternatively, a
- dictionary structure can also be given, specifying the short names
- used for time, latitude, longitude, pressure levels, temperature
- profile, geopotential or geopotential height profile, wind-u and
- wind-v profiles in the dataset given in the file parameter.
- Additionally, ensemble dictionaries must have the ensemble as well.
- An example is the following dictionary, used for ``NOAA``:
-
- .. code-block:: python
-
- dictionary = {
- "time": "time",
- "latitude": "lat",
- "longitude": "lon",
- "level": "lev",
- "ensemble": "ens",
- "temperature": "tmpprs",
- "surface_geopotential_height": "hgtsfc",
- "geopotential_height": "hgtprs",
- "geopotential": None,
- "u_wind": "ugrdprs",
- "v_wind": "vgrdprs",
- }
+ Atmospheric model selector (case-insensitive). Accepted values are
+ ``"standard_atmosphere"``, ``"wyoming_sounding"``, ``"windy"``,
+ ``"forecast"``, ``"reanalysis"``, ``"ensemble"`` and
+ ``"custom_atmosphere"``.
+ file : string | netCDF4.Dataset, optional
+ Data source or model shortcut. Meaning depends on ``type``:
+
+ - ``"standard_atmosphere"`` and ``"custom_atmosphere"``: ignored.
+ - ``"wyoming_sounding"``: URL of the sounding text page.
+ - ``"windy"``: one of ``"ECMWF"``, ``"GFS"``, ``"ICON"`` or
+ ``"ICONEU"``.
+ - ``"forecast"``: local path, OPeNDAP URL, open
+ ``netCDF4.Dataset``, or one of ``"GFS"``, ``"NAM"`` or ``"RAP"``
+ for the latest available forecast.
+ - ``"reanalysis"``: local path, OPeNDAP URL, or open
+ ``netCDF4.Dataset``.
+ - ``"ensemble"``: local path, OPeNDAP URL, open
+ ``netCDF4.Dataset``, or ``"GEFS"`` for the latest available
+ forecast.
+ dictionary : dict | str, optional
+ Variable-name mapping for ``"forecast"``, ``"reanalysis"`` and
+ ``"ensemble"``. It may be a custom dictionary or a built-in
+ mapping name (for example: ``"ECMWF"``, ``"ECMWF_v0"``,
+ ``"NOAA"``, ``"GFS"``, ``"NAM"``, ``"RAP"``, ``"HIRESW"``,
+ ``"GEFS"``, ``"MERRA2"`` or ``"CMC"``).
+
+ If ``dictionary`` is omitted and ``file`` is one of RocketPy's
+ latest-model shortcuts, the matching built-in mapping is selected
+ automatically. For ensemble datasets, the mapping must include the
+ ensemble dimension key (typically ``"ensemble"``).
pressure : float, string, array, callable, optional
This defines the atmospheric pressure profile.
@@ -1272,6 +1211,36 @@ def set_atmospheric_model( # pylint: disable=too-many-statements
Returns
-------
None
+
+ Raises
+ ------
+ ValueError
+ If ``type`` is unknown, if required launch date/time information is
+ missing for date-dependent models, if Windy model names are invalid,
+ or if required atmospheric variables cannot be read from the input
+ dataset.
+ TypeError
+ If ``dictionary`` is invalid for ``"forecast"``, ``"reanalysis"``
+ or ``"ensemble"``.
+ KeyError
+ If a built-in mapping name passed in ``dictionary`` is unknown.
+
+ See Also
+ --------
+ :ref:`atmospheric_models`
+ Overview of all atmospheric-model workflows in the user guide.
+ :ref:`forecast`
+ Forecast and Windy usage details, including latest-model shortcuts.
+ :ref:`reanalysis`
+ Reanalysis and MERRA-2 examples.
+ :ref:`soundings`
+ Wyoming sounding workflow and RUC migration notes.
+ :ref:`custom_atmosphere`
+ Defining pressure, temperature and wind profiles directly.
+ :ref:`ensemble_atmosphere`
+ Ensemble forecasts and member-selection workflow.
+ :ref:`environment_other_apis`
+ Building custom mapping dictionaries for NetCDF/OPeNDAP APIs.
"""
# Save atmospheric model type
self.atmospheric_model_type = type
@@ -1287,6 +1256,36 @@ def set_atmospheric_model( # pylint: disable=too-many-statements
case "windy":
self.process_windy_atmosphere(file)
case "forecast" | "reanalysis" | "ensemble":
+ if isinstance(file, str):
+ shortcut_map = self.__atm_type_file_to_function_map.get(type, {})
+ matching_shortcut = next(
+ (
+ shortcut
+ for shortcut in shortcut_map
+ if shortcut.lower() == file.lower()
+ ),
+ None,
+ )
+ if matching_shortcut is not None:
+ file = matching_shortcut
+
+ if isinstance(file, str):
+ file_upper = file.upper()
+ if type == "forecast" and file_upper == "HIRESW":
+ raise ValueError(
+ "The HIRESW latest-model shortcut is currently "
+ "unavailable because NOMADS OPeNDAP is deactivated. "
+ "Please use another forecast source or provide a "
+ "compatible dataset path/URL explicitly."
+ )
+ if type == "ensemble" and file_upper == "GEFS":
+ raise ValueError(
+ "The GEFS latest-model shortcut is currently "
+ "unavailable because NOMADS OPeNDAP is deactivated. "
+ "Please use another ensemble source or provide a "
+ "compatible dataset path/URL explicitly."
+ )
+
dictionary = self.__validate_dictionary(file, dictionary)
try:
fetch_function = self.__atm_type_file_to_function_map[type][file]
@@ -1471,6 +1470,12 @@ def process_windy_atmosphere(self, model="ECMWF"): # pylint: disable=too-many-s
``ECMWF`` for the `ECMWF-HRES` model, ``GFS`` for the `GFS` model,
``ICON`` for the `ICON-Global` model or ``ICONEU`` for the `ICON-EU`
model.
+
+ Raises
+ ------
+ ValueError
+ If ``model`` is not one of ``ECMWF``, ``GFS``, ``ICON`` or
+ ``ICONEU``.
"""
if model.lower() not in ["ecmwf", "gfs", "icon", "iconeu"]:
@@ -1728,6 +1733,13 @@ def process_forecast_reanalysis(self, file, dictionary): # pylint: disable=too-
Returns
-------
None
+
+ Raises
+ ------
+ ValueError
+ If launch date/time was not set before loading date-dependent data,
+ or if required geopotential/geopotential-height, temperature,
+ wind-u, or wind-v variables cannot be read from the dataset.
"""
# Check if date, lat and lon are known
self.__validate_datetime()
@@ -1735,20 +1747,34 @@ def process_forecast_reanalysis(self, file, dictionary): # pylint: disable=too-
# Read weather file
if isinstance(file, str):
data = netCDF4.Dataset(file)
- if dictionary["time"] not in data.variables.keys():
- dictionary = self.__weather_model_map.get("ECMWF_v0")
else:
data = file
+ dictionary = self.__resolve_dictionary_for_dataset(dictionary, data)
+
# Get time, latitude and longitude data from file
time_array = data.variables[dictionary["time"]]
- lon_list = data.variables[dictionary["longitude"]][:].tolist()
- lat_list = data.variables[dictionary["latitude"]][:].tolist()
+ lon_array = data.variables[dictionary["longitude"]]
+ lat_array = data.variables[dictionary["latitude"]]
+
+ # Some THREDDS datasets use projected x/y coordinates.
+ if dictionary.get("projection") is not None:
+ projection_variable = data.variables[dictionary["projection"]]
+ x_units = getattr(lon_array, "units", "m")
+ target_lon, target_lat = geodesic_to_lambert_conformal(
+ self.latitude,
+ self.longitude,
+ projection_variable,
+ x_units=x_units,
+ )
+ else:
+ target_lon = self.longitude
+ target_lat = self.latitude
# Find time, latitude and longitude indexes
time_index = find_time_index(self.datetime_date, time_array)
- lon, lon_index = find_longitude_index(self.longitude, lon_list)
- _, lat_index = find_latitude_index(self.latitude, lat_list)
+ lon, lon_index = find_longitude_index(target_lon, lon_array)
+ _, lat_index = find_latitude_index(target_lat, lat_array)
# Get pressure level data from file
levels = get_pressure_levels_from_file(data, dictionary)
@@ -1806,9 +1832,9 @@ def process_forecast_reanalysis(self, file, dictionary): # pylint: disable=too-
) from e
# Prepare for bilinear interpolation
- x, y = self.latitude, lon
- x1, y1 = lat_list[lat_index - 1], lon_list[lon_index - 1]
- x2, y2 = lat_list[lat_index], lon_list[lon_index]
+ x, y = target_lat, lon
+ x1, y1 = float(lat_array[lat_index - 1]), float(lon_array[lon_index - 1])
+ x2, y2 = float(lat_array[lat_index]), float(lon_array[lon_index])
# Determine properties in lat, lon
height = bilinear_interpolation(
@@ -1860,6 +1886,17 @@ def process_forecast_reanalysis(self, file, dictionary): # pylint: disable=too-
wind_vs[:, 1, 1],
)
+ # Some datasets expose different level counts between fields
+ # (e.g., temperature on isobaric1 and geopotential on isobaric).
+ min_profile_length = min(
+ len(levels), len(height), len(temper), len(wind_u), len(wind_v)
+ )
+ levels = levels[:min_profile_length]
+ height = height[:min_profile_length]
+ temper = temper[:min_profile_length]
+ wind_u = wind_u[:min_profile_length]
+ wind_v = wind_v[:min_profile_length]
+
# Determine wind speed, heading and direction
wind_speed = calculate_wind_speed(wind_u, wind_v)
wind_heading = calculate_wind_heading(wind_u, wind_v)
@@ -1917,14 +1954,14 @@ def process_forecast_reanalysis(self, file, dictionary): # pylint: disable=too-
)
else:
self.atmospheric_model_interval = 0
- self.atmospheric_model_init_lat = lat_list[0]
- self.atmospheric_model_end_lat = lat_list[-1]
- self.atmospheric_model_init_lon = lon_list[0]
- self.atmospheric_model_end_lon = lon_list[-1]
+ self.atmospheric_model_init_lat = float(lat_array[0])
+ self.atmospheric_model_end_lat = float(lat_array[len(lat_array) - 1])
+ self.atmospheric_model_init_lon = float(lon_array[0])
+ self.atmospheric_model_end_lon = float(lon_array[len(lon_array) - 1])
# Save debugging data
- self.lat_array = lat_list
- self.lon_array = lon_list
+ self.lat_array = [x1, x2]
+ self.lon_array = [y1, y2]
self.lon_index = lon_index
self.lat_index = lat_index
self.geopotentials = geopotentials
@@ -1932,7 +1969,10 @@ def process_forecast_reanalysis(self, file, dictionary): # pylint: disable=too-
self.wind_vs = wind_vs
self.levels = levels
self.temperatures = temperatures
- self.time_array = time_array[:].tolist()
+ self.time_array = [
+ float(time_array[0]),
+ float(time_array[time_array.shape[0] - 1]),
+ ]
self.height = height
# Close weather data
@@ -1994,6 +2034,13 @@ def process_ensemble(self, file, dictionary): # pylint: disable=too-many-locals
--------
See the :class:``rocketpy.environment.weather_model_mapping`` for some
dictionary examples.
+
+ Raises
+ ------
+ ValueError
+ If launch date/time was not set before loading date-dependent data,
+ or if required geopotential/geopotential-height, temperature,
+ wind-u, or wind-v variables cannot be read from the dataset.
"""
# Check if date, lat and lon are known
self.__validate_datetime()
@@ -2004,23 +2051,40 @@ def process_ensemble(self, file, dictionary): # pylint: disable=too-many-locals
else:
data = file
+ dictionary = self.__resolve_dictionary_for_dataset(dictionary, data)
+
# Get time, latitude and longitude data from file
time_array = data.variables[dictionary["time"]]
- lon_list = data.variables[dictionary["longitude"]][:].tolist()
- lat_list = data.variables[dictionary["latitude"]][:].tolist()
+ lon_array = data.variables[dictionary["longitude"]]
+ lat_array = data.variables[dictionary["latitude"]]
+
+ # Some THREDDS datasets use projected x/y coordinates.
+ if dictionary.get("projection") is not None:
+ projection_variable = data.variables[dictionary["projection"]]
+ x_units = getattr(lon_array, "units", "m")
+ target_lon, target_lat = geodesic_to_lambert_conformal(
+ self.latitude,
+ self.longitude,
+ projection_variable,
+ x_units=x_units,
+ )
+ else:
+ target_lon = self.longitude
+ target_lat = self.latitude
# Find time, latitude and longitude indexes
time_index = find_time_index(self.datetime_date, time_array)
- lon, lon_index = find_longitude_index(self.longitude, lon_list)
- _, lat_index = find_latitude_index(self.latitude, lat_list)
+ lon, lon_index = find_longitude_index(target_lon, lon_array)
+ _, lat_index = find_latitude_index(target_lat, lat_array)
# Get ensemble data from file
+ has_ensemble_dimension = True
try:
num_members = len(data.variables[dictionary["ensemble"]][:])
- except KeyError as e:
- raise ValueError(
- "Unable to read ensemble data from file. Check file and dictionary."
- ) from e
+ except KeyError:
+ has_ensemble_dimension = False
+ num_members = 1
# Get pressure level data from file
levels = get_pressure_levels_from_file(data, dictionary)
@@ -2079,10 +2143,16 @@ def process_ensemble(self, file, dictionary): # pylint: disable=too-many-locals
"Unable to read wind-v component. Check file and dictionary."
) from e
+ if not has_ensemble_dimension:
+ geopotentials = np.expand_dims(geopotentials, axis=0)
+ temperatures = np.expand_dims(temperatures, axis=0)
+ wind_us = np.expand_dims(wind_us, axis=0)
+ wind_vs = np.expand_dims(wind_vs, axis=0)
+
# Prepare for bilinear interpolation
- x, y = self.latitude, lon
- x1, y1 = lat_list[lat_index - 1], lon_list[lon_index - 1]
- x2, y2 = lat_list[lat_index], lon_list[lon_index]
+ x, y = target_lat, lon
+ x1, y1 = float(lat_array[lat_index - 1]), float(lon_array[lon_index - 1])
+ x2, y2 = float(lat_array[lat_index]), float(lon_array[lon_index])
# Determine properties in lat, lon
height = bilinear_interpolation(
@@ -2134,6 +2204,19 @@ def process_ensemble(self, file, dictionary): # pylint: disable=too-many-locals
wind_vs[:, :, 1, 1],
)
+ min_profile_length = min(
+ len(levels),
+ height.shape[1],
+ temper.shape[1],
+ wind_u.shape[1],
+ wind_v.shape[1],
+ )
+ levels = levels[:min_profile_length]
+ height = height[:, :min_profile_length]
+ temper = temper[:, :min_profile_length]
+ wind_u = wind_u[:, :min_profile_length]
+ wind_v = wind_v[:, :min_profile_length]
+
# Determine wind speed, heading and direction
wind_speed = calculate_wind_speed(wind_u, wind_v)
wind_heading = calculate_wind_heading(wind_u, wind_v)
@@ -2166,14 +2249,14 @@ def process_ensemble(self, file, dictionary): # pylint: disable=too-many-locals
self.atmospheric_model_init_date = get_initial_date_from_time_array(time_array)
self.atmospheric_model_end_date = get_final_date_from_time_array(time_array)
self.atmospheric_model_interval = get_interval_date_from_time_array(time_array)
- self.atmospheric_model_init_lat = lat_list[0]
- self.atmospheric_model_end_lat = lat_list[-1]
- self.atmospheric_model_init_lon = lon_list[0]
- self.atmospheric_model_end_lon = lon_list[-1]
+ self.atmospheric_model_init_lat = float(lat_array[0])
+ self.atmospheric_model_end_lat = float(lat_array[len(lat_array) - 1])
+ self.atmospheric_model_init_lon = float(lon_array[0])
+ self.atmospheric_model_end_lon = float(lon_array[len(lon_array) - 1])
# Save debugging data
- self.lat_array = lat_list
- self.lon_array = lon_list
+ self.lat_array = [x1, x2]
+ self.lon_array = [y1, y2]
self.lon_index = lon_index
self.lat_index = lat_index
self.geopotentials = geopotentials
@@ -2181,7 +2264,10 @@ def process_ensemble(self, file, dictionary): # pylint: disable=too-many-locals
self.wind_vs = wind_vs
self.levels = levels
self.temperatures = temperatures
- self.time_array = time_array[:].tolist()
+ self.time_array = [
+ float(time_array[0]),
+ float(time_array[time_array.shape[0] - 1]),
+ ]
self.height = height
# Close weather data
diff --git a/rocketpy/environment/fetchers.py b/rocketpy/environment/fetchers.py
index d5ac2a1df..589159f1c 100644
--- a/rocketpy/environment/fetchers.py
+++ b/rocketpy/environment/fetchers.py
@@ -113,33 +113,18 @@ def fetch_gfs_file_return_dataset(max_attempts=10, base_delay=2):
RuntimeError
If unable to load the latest weather data for GFS.
"""
- time_attempt = datetime.now(tz=timezone.utc)
+ file_url = (
+ "https://thredds.ucar.edu/thredds/dodsC/grib/NCEP/GFS/Global_0p25deg/Best"
+ )
attempt_count = 0
- dataset = None
-
- # TODO: the code below is trying to determine the hour of the latest available
- # forecast by trial and error. This is not the best way to do it. We should
- # actually check the NOAA website for the latest forecast time. Refactor needed.
while attempt_count < max_attempts:
- time_attempt -= timedelta(hours=6) # GFS updates every 6 hours
- file_url = (
- f"https://nomads.ncep.noaa.gov/dods/gfs_0p25/gfs"
- f"{time_attempt.year:04d}{time_attempt.month:02d}"
- f"{time_attempt.day:02d}/"
- f"gfs_0p25_{6 * (time_attempt.hour // 6):02d}z"
- )
try:
- # Attempts to create a dataset from the file using OpenDAP protocol.
- dataset = netCDF4.Dataset(file_url)
- return dataset
+ return netCDF4.Dataset(file_url)
except OSError:
attempt_count += 1
time.sleep(base_delay**attempt_count)
- if dataset is None:
- raise RuntimeError(
- "Unable to load latest weather data for GFS through " + file_url
- )
+ raise RuntimeError("Unable to load latest weather data for GFS through " + file_url)
def fetch_nam_file_return_dataset(max_attempts=10, base_delay=2):
@@ -163,28 +148,16 @@ def fetch_nam_file_return_dataset(max_attempts=10, base_delay=2):
RuntimeError
If unable to load the latest weather data for NAM.
"""
- # Attempt to get latest forecast
- time_attempt = datetime.now(tz=timezone.utc)
+ file_url = "https://thredds.ucar.edu/thredds/dodsC/grib/NCEP/NAM/CONUS_12km/Best"
attempt_count = 0
- dataset = None
-
while attempt_count < max_attempts:
- time_attempt -= timedelta(hours=6) # NAM updates every 6 hours
- file = (
- f"https://nomads.ncep.noaa.gov/dods/nam/nam{time_attempt.year:04d}"
- f"{time_attempt.month:02d}{time_attempt.day:02d}/"
- f"nam_conusnest_{6 * (time_attempt.hour // 6):02d}z"
- )
try:
- # Attempts to create a dataset from the file using OpenDAP protocol.
- dataset = netCDF4.Dataset(file)
- return dataset
+ return netCDF4.Dataset(file_url)
except OSError:
attempt_count += 1
time.sleep(base_delay**attempt_count)
- if dataset is None:
- raise RuntimeError("Unable to load latest weather data for NAM through " + file)
+ raise RuntimeError("Unable to load latest weather data for NAM through " + file_url)
def fetch_rap_file_return_dataset(max_attempts=10, base_delay=2):
@@ -208,28 +181,88 @@ def fetch_rap_file_return_dataset(max_attempts=10, base_delay=2):
RuntimeError
If unable to load the latest weather data for RAP.
"""
- # Attempt to get latest forecast
- time_attempt = datetime.now(tz=timezone.utc)
+ file_url = "https://thredds.ucar.edu/thredds/dodsC/grib/NCEP/RAP/CONUS_13km/Best"
attempt_count = 0
- dataset = None
+ while attempt_count < max_attempts:
+ try:
+ return netCDF4.Dataset(file_url)
+ except OSError:
+ attempt_count += 1
+ time.sleep(base_delay**attempt_count)
+
+ raise RuntimeError("Unable to load latest weather data for RAP through " + file_url)
+
+def fetch_hrrr_file_return_dataset(max_attempts=10, base_delay=2):
+    """Fetches the latest HRRR (High-Resolution Rapid Refresh) dataset from
+    the UCAR THREDDS data server using the OPeNDAP protocol.
+
+ Parameters
+ ----------
+ max_attempts : int, optional
+ The maximum number of attempts to fetch the dataset. Default is 10.
+ base_delay : int, optional
+ The base delay in seconds between attempts. Default is 2.
+
+ Returns
+ -------
+ netCDF4.Dataset
+ The HRRR dataset.
+
+ Raises
+ ------
+ RuntimeError
+ If unable to load the latest weather data for HRRR.
+ """
+ file_url = "https://thredds.ucar.edu/thredds/dodsC/grib/NCEP/HRRR/CONUS_2p5km/Best"
+ attempt_count = 0
while attempt_count < max_attempts:
- time_attempt -= timedelta(hours=1) # RAP updates every hour
- file = (
- f"https://nomads.ncep.noaa.gov/dods/rap/rap{time_attempt.year:04d}"
- f"{time_attempt.month:02d}{time_attempt.day:02d}/"
- f"rap_{time_attempt.hour:02d}z"
- )
try:
- # Attempts to create a dataset from the file using OpenDAP protocol.
- dataset = netCDF4.Dataset(file)
- return dataset
+ return netCDF4.Dataset(file_url)
except OSError:
attempt_count += 1
time.sleep(base_delay**attempt_count)
- if dataset is None:
- raise RuntimeError("Unable to load latest weather data for RAP through " + file)
+ raise RuntimeError(
+ "Unable to load latest weather data for HRRR through " + file_url
+ )
+
+
+def fetch_aigfs_file_return_dataset(max_attempts=10, base_delay=2):
+    """Fetches the latest AIGFS (Artificial Intelligence GFS) dataset from
+    the UCAR THREDDS data server using the OPeNDAP protocol.
+
+ Parameters
+ ----------
+ max_attempts : int, optional
+ The maximum number of attempts to fetch the dataset. Default is 10.
+ base_delay : int, optional
+ The base delay in seconds between attempts. Default is 2.
+
+ Returns
+ -------
+ netCDF4.Dataset
+ The AIGFS dataset.
+
+ Raises
+ ------
+ RuntimeError
+ If unable to load the latest weather data for AIGFS.
+ """
+ file_url = (
+ "https://thredds.ucar.edu/thredds/dodsC/grib/NCEP/AIGFS/Global_0p25deg/Best"
+ )
+ attempt_count = 0
+ while attempt_count < max_attempts:
+ try:
+ return netCDF4.Dataset(file_url)
+ except OSError:
+ attempt_count += 1
+ time.sleep(base_delay**attempt_count)
+
+ raise RuntimeError(
+ "Unable to load latest weather data for AIGFS through " + file_url
+ )
def fetch_hiresw_file_return_dataset(max_attempts=10, base_delay=2):
diff --git a/rocketpy/environment/tools.py b/rocketpy/environment/tools.py
index 1239ee6b9..fb0179c9e 100644
--- a/rocketpy/environment/tools.py
+++ b/rocketpy/environment/tools.py
@@ -5,7 +5,7 @@
future to improve their performance and usability.
"""
-import bisect
+import math
import warnings
import netCDF4
@@ -109,6 +109,63 @@ def calculate_wind_speed(u, v, w=0.0):
return np.sqrt(u**2 + v**2 + w**2)
+def geodesic_to_lambert_conformal(lat, lon, projection_variable, x_units="m"):
+ """Convert geodesic coordinates to Lambert conformal projected coordinates.
+
+ Parameters
+ ----------
+ lat : float
+ Latitude in degrees.
+ lon : float
+ Longitude in degrees, ranging from -180 to 180.
+ projection_variable : netCDF4.Variable
+ Projection variable containing Lambert conformal metadata.
+ x_units : str, optional
+ Units used by the dataset x coordinate. Supported values are meters
+ and kilometers. Default is "m".
+
+ Returns
+ -------
+ tuple[float, float]
+ Projected coordinates ``(x, y)`` in the same units as ``x_units``.
+ """
+ lat_radians = math.radians(lat)
+ lon_radians = math.radians(lon % 360)
+
+ lat_origin = math.radians(float(projection_variable.latitude_of_projection_origin))
+ lon_origin = math.radians(float(projection_variable.longitude_of_central_meridian))
+
+ standard_parallel = projection_variable.standard_parallel
+ if np.ndim(standard_parallel) == 0:
+ standard_parallels = [float(standard_parallel)]
+ else:
+ standard_parallels = np.asarray(standard_parallel, dtype=float).tolist()
+
+ if len(standard_parallels) >= 2:
+ phi_1 = math.radians(standard_parallels[0])
+ phi_2 = math.radians(standard_parallels[1])
+ n = math.log(math.cos(phi_1) / math.cos(phi_2)) / math.log(
+ math.tan(math.pi / 4 + phi_2 / 2) / math.tan(math.pi / 4 + phi_1 / 2)
+ )
+ else:
+ phi_1 = math.radians(standard_parallels[0])
+ n = math.sin(phi_1)
+
+ earth_radius = float(getattr(projection_variable, "earth_radius", 6371229.0))
+ f_const = (math.cos(phi_1) * math.tan(math.pi / 4 + phi_1 / 2) ** n) / n
+
+ rho = earth_radius * f_const / (math.tan(math.pi / 4 + lat_radians / 2) ** n)
+ rho_origin = earth_radius * f_const / (math.tan(math.pi / 4 + lat_origin / 2) ** n)
+    # Wrap the longitude difference to [-pi, pi) so that datasets storing the
+    # central meridian as either 265.0 or -95.0 degrees behave identically.
+    theta = n * ((lon_radians - lon_origin + math.pi) % (2 * math.pi) - math.pi)
+
+ x_meters = rho * math.sin(theta)
+ y_meters = rho_origin - rho * math.cos(theta)
+
+ if str(x_units).lower().startswith("km"):
+ return x_meters / 1000.0, y_meters / 1000.0
+ return x_meters, y_meters
+
+
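The projection math in `geodesic_to_lambert_conformal` can be exercised without a real netCDF file. Below is a minimal, self-contained sketch of the same formulas; the `SimpleNamespace` stands in for the netCDF projection variable, and its attribute values are illustrative (they mirror typical NCEP CONUS Lambert metadata, not values taken from this PR):

```python
import math
from types import SimpleNamespace

# Stand-in for a netCDF4 projection variable (illustrative CF-style values).
projection = SimpleNamespace(
    latitude_of_projection_origin=25.0,
    longitude_of_central_meridian=265.0,
    standard_parallel=25.0,
    earth_radius=6371229.0,
)

def lambert_xy(lat, lon, proj):
    """Project (lat, lon) with a single-standard-parallel Lambert cone."""
    phi_1 = math.radians(proj.standard_parallel)
    n = math.sin(phi_1)  # cone constant for a single standard parallel
    f_const = math.cos(phi_1) * math.tan(math.pi / 4 + phi_1 / 2) ** n / n
    lat_0 = math.radians(proj.latitude_of_projection_origin)
    radius = proj.earth_radius
    rho = radius * f_const / math.tan(math.pi / 4 + math.radians(lat) / 2) ** n
    rho_0 = radius * f_const / math.tan(math.pi / 4 + lat_0 / 2) ** n
    theta = n * math.radians(lon % 360 - proj.longitude_of_central_meridian)
    return rho * math.sin(theta), rho_0 - rho * math.cos(theta)

# At the projection origin both projected coordinates collapse to zero.
x, y = lambert_xy(25.0, -95.0, projection)
print(round(x, 6), round(y, 6))  # 0.0 0.0
```

Note that points north of the origin along the central meridian land at `x == 0` with positive `y`, which is a quick sanity check for the cone orientation.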
## These functions are meant to be used with netcdf4 datasets
@@ -168,7 +225,7 @@ def mask_and_clean_dataset(*args):
return data_array
-def find_longitude_index(longitude, lon_list):
+def find_longitude_index(longitude, lon_list): # pylint: disable=too-many-statements
"""Finds the index of the given longitude in a list of longitudes.
Parameters
@@ -188,30 +245,48 @@ def find_longitude_index(longitude, lon_list):
ValueError
If the longitude is not within the range covered by the list.
"""
- # Determine if file uses -180 to 180 or 0 to 360
- if lon_list[0] < 0 or lon_list[-1] < 0:
- # Convert input to -180 - 180
- lon = longitude if longitude < 180 else -180 + longitude % 180
- else:
- # Convert input to 0 - 360
- lon = longitude % 360
- # Check if reversed or sorted
- if lon_list[0] < lon_list[-1]:
- # Deal with sorted lon_list
- lon_index = bisect.bisect(lon_list, lon)
+
+ def _coord_value(source, index):
+ return float(source[index])
+
+ lon_len = len(lon_list)
+ lon_start = _coord_value(lon_list, 0)
+ lon_end = _coord_value(lon_list, lon_len - 1)
+
+ # Determine if file uses geographic longitudes in [-180, 180] or [0, 360].
+ # Do not remap projected x coordinates.
+ is_geographic_longitude = abs(lon_start) <= 360 and abs(lon_end) <= 360
+ if is_geographic_longitude:
+ if lon_start < 0 or lon_end < 0:
+ lon = longitude if longitude < 180 else -180 + longitude % 180
+ else:
+ lon = longitude % 360
else:
- # Deal with reversed lon_list
- lon_list.reverse()
- lon_index = len(lon_list) - bisect.bisect_left(lon_list, lon)
- lon_list.reverse()
+ lon = longitude
+
+ is_ascending = lon_start < lon_end
+
+ # Binary search to find the insertion index such that index-1 and index
+ # bracket the requested longitude.
+ low = 0
+ high = lon_len
+ while low < high:
+ mid = (low + high) // 2
+ mid_value = _coord_value(lon_list, mid)
+ if (mid_value < lon) if is_ascending else (mid_value > lon):
+ low = mid + 1
+ else:
+ high = mid
+ lon_index = low
+
# Take care of longitude value equal to maximum longitude in the grid
- if lon_index == len(lon_list) and lon_list[lon_index - 1] == lon:
- lon_index = lon_index - 1
+ if lon_index == lon_len and _coord_value(lon_list, lon_index - 1) == lon:
+ lon_index -= 1
# Check if longitude value is inside the grid
- if lon_index == 0 or lon_index == len(lon_list):
+ if lon_index in (0, lon_len):
raise ValueError(
f"Longitude {lon} not inside region covered by file, which is "
- f"from {lon_list[0]} to {lon_list[-1]}."
+ f"from {lon_start} to {lon_end}."
)
return lon, lon_index
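The bracketing loop above (and its latitude twin below) is the standard `bisect_left` search, generalized so it also works on descending coordinate arrays without reversing them. A minimal stand-alone sketch, checked against `bisect` for the ascending case:

```python
import bisect

def bracket_index(values, target):
    """Return i such that values[i-1] and values[i] bracket target."""
    ascending = values[0] < values[-1]
    low, high = 0, len(values)
    while low < high:
        mid = (low + high) // 2
        # Flip the comparison for descending arrays.
        if (values[mid] < target) if ascending else (values[mid] > target):
            low = mid + 1
        else:
            high = mid
    return low

asc = [10.0, 20.0, 30.0, 40.0]
desc = [40.0, 30.0, 20.0, 10.0]
print(bracket_index(asc, 25.0))   # 2 -> asc[1] and asc[2] bracket 25.0
print(bracket_index(desc, 25.0))  # 2 -> desc[1] and desc[2] bracket 25.0
```

Probing one scalar per iteration is the point of the rewrite: netCDF variables are indexed lazily over OPeNDAP, so this avoids materializing the whole coordinate array the way `[:].tolist()` did.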
@@ -237,28 +312,39 @@ def find_latitude_index(latitude, lat_list):
ValueError
If the latitude is not within the range covered by the list.
"""
- # Check if reversed or sorted
- if lat_list[0] < lat_list[-1]:
- # Deal with sorted lat_list
- lat_index = bisect.bisect(lat_list, latitude)
- else:
- # Deal with reversed lat_list
- lat_list.reverse()
- lat_index = len(lat_list) - bisect.bisect_left(lat_list, latitude)
- lat_list.reverse()
- # Take care of latitude value equal to maximum longitude in the grid
- if lat_index == len(lat_list) and lat_list[lat_index - 1] == latitude:
- lat_index = lat_index - 1
+
+ def _coord_value(source, index):
+ return float(source[index])
+
+ lat_len = len(lat_list)
+ lat_start = _coord_value(lat_list, 0)
+ lat_end = _coord_value(lat_list, lat_len - 1)
+ is_ascending = lat_start < lat_end
+
+ low = 0
+ high = lat_len
+ while low < high:
+ mid = (low + high) // 2
+ mid_value = _coord_value(lat_list, mid)
+ if (mid_value < latitude) if is_ascending else (mid_value > latitude):
+ low = mid + 1
+ else:
+ high = mid
+ lat_index = low
+
+ # Take care of latitude value equal to maximum latitude in the grid
+ if lat_index == lat_len and _coord_value(lat_list, lat_index - 1) == latitude:
+ lat_index -= 1
# Check if latitude value is inside the grid
- if lat_index == 0 or lat_index == len(lat_list):
+ if lat_index in (0, lat_len):
raise ValueError(
f"Latitude {latitude} not inside region covered by file, "
- f"which is from {lat_list[0]} to {lat_list[-1]}."
+ f"which is from {lat_start} to {lat_end}."
)
return latitude, lat_index
-def find_time_index(datetime_date, time_array):
+def find_time_index(datetime_date, time_array): # pylint: disable=too-many-statements
"""Finds the index of the given datetime in a netCDF4 time array.
Parameters
@@ -280,26 +366,58 @@ def find_time_index(datetime_date, time_array):
ValueError
If the exact datetime is not available and the nearest datetime is used instead.
"""
- time_index = netCDF4.date2index(
- datetime_date, time_array, calendar="gregorian", select="nearest"
- )
- # Convert times do dates and numbers
- input_time_num = netCDF4.date2num(
- datetime_date, time_array.units, calendar="gregorian"
- )
- file_time_num = time_array[time_index]
- file_time_date = netCDF4.num2date(
- time_array[time_index], time_array.units, calendar="gregorian"
- )
+ time_len = len(time_array)
+ time_units = time_array.units
+ input_time_num = netCDF4.date2num(datetime_date, time_units, calendar="gregorian")
+
+ first_time_num = float(time_array[0])
+ last_time_num = float(time_array[time_len - 1])
+ is_ascending = first_time_num <= last_time_num
+
+ # Binary search nearest index using scalar probing only.
+ low = 0
+ high = time_len
+ while low < high:
+ mid = (low + high) // 2
+ mid_time_num = float(time_array[mid])
+ if (
+ (mid_time_num < input_time_num)
+ if is_ascending
+ else (mid_time_num > input_time_num)
+ ):
+ low = mid + 1
+ else:
+ high = mid
+
+ right_index = min(max(low, 0), time_len - 1)
+ left_index = min(max(right_index - 1, 0), time_len - 1)
+
+ right_time_num = float(time_array[right_index])
+ left_time_num = float(time_array[left_index])
+ if abs(input_time_num - left_time_num) <= abs(right_time_num - input_time_num):
+ time_index = left_index
+ file_time_num = left_time_num
+ else:
+ time_index = right_index
+ file_time_num = right_time_num
+
+ file_time_date = netCDF4.num2date(file_time_num, time_units, calendar="gregorian")
+
# Check if time is inside range supplied by file
- if time_index == 0 and input_time_num < file_time_num:
+ if time_index == 0 and (
+ (is_ascending and input_time_num < file_time_num)
+ or (not is_ascending and input_time_num > file_time_num)
+ ):
raise ValueError(
f"The chosen launch time '{datetime_date.strftime('%Y-%m-%d-%H:')} UTC' is"
" not available in the provided file. Please choose a time within the range"
" of the file, which starts at "
f"'{file_time_date.strftime('%Y-%m-%d-%H')} UTC'."
)
- elif time_index == len(time_array) - 1 and input_time_num > file_time_num:
+ elif time_index == time_len - 1 and (
+ (is_ascending and input_time_num > file_time_num)
+ or (not is_ascending and input_time_num < file_time_num)
+ ):
raise ValueError(
"Chosen launch time is not available in the provided file, "
f"which ends at {file_time_date}."
diff --git a/rocketpy/environment/weather_model_mapping.py b/rocketpy/environment/weather_model_mapping.py
index 75089f577..c490fad9d 100644
--- a/rocketpy/environment/weather_model_mapping.py
+++ b/rocketpy/environment/weather_model_mapping.py
@@ -1,9 +1,42 @@
class WeatherModelMapping:
- """Class to map the weather model variables to the variables used in the
- Environment class.
+ """Map provider-specific variable names to RocketPy weather fields.
+
+ RocketPy reads forecast/reanalysis/ensemble datasets using canonical keys
+ such as ``time``, ``latitude``, ``longitude``, ``level``, ``temperature``,
+ ``geopotential_height``, ``geopotential``, ``u_wind`` and ``v_wind``.
+ Each dictionary in this class maps those canonical keys to the actual
+ variable names in a specific data provider format.
+
+ Mapping families
+ ----------------
+ - Base names (for example ``GFS``, ``NAM``, ``RAP``) represent the current
+ default mappings used by the latest-model shortcuts and THREDDS-style
+ datasets.
+ - ``*_LEGACY`` names represent older NOMADS-style variable naming
+ conventions (for example ``lev``, ``tmpprs``, ``ugrdprs`` and
+ ``vgrdprs``) and are intended for archived or previously downloaded files.
+
+ Notes
+ -----
+ - Mappings can also include optional keys such as ``projection`` for
+ projected grids and ``ensemble`` for member dimensions.
+ - The :meth:`get` method is case-insensitive, so ``"gfs_legacy"`` and
+ ``"GFS_LEGACY"`` are equivalent.
"""
GFS = {
+ "time": "time",
+ "latitude": "lat",
+ "longitude": "lon",
+ "level": "isobaric",
+ "temperature": "Temperature_isobaric",
+ "surface_geopotential_height": "Geopotential_height_surface",
+ "geopotential_height": "Geopotential_height_isobaric",
+ "geopotential": None,
+ "u_wind": "u-component_of_wind_isobaric",
+ "v_wind": "v-component_of_wind_isobaric",
+ }
+ GFS_LEGACY = {
"time": "time",
"latitude": "lat",
"longitude": "lon",
@@ -16,6 +49,19 @@ class WeatherModelMapping:
"v_wind": "vgrdprs",
}
NAM = {
+ "time": "time",
+ "latitude": "y",
+ "longitude": "x",
+ "projection": "LambertConformal_Projection",
+ "level": "isobaric",
+ "temperature": "Temperature_isobaric",
+ "surface_geopotential_height": None,
+ "geopotential_height": "Geopotential_height_isobaric",
+ "geopotential": None,
+ "u_wind": "u-component_of_wind_isobaric",
+ "v_wind": "v-component_of_wind_isobaric",
+ }
+ NAM_LEGACY = {
"time": "time",
"latitude": "lat",
"longitude": "lon",
@@ -54,6 +100,18 @@ class WeatherModelMapping:
"v_wind": "v",
}
NOAA = {
+ "time": "time",
+ "latitude": "lat",
+ "longitude": "lon",
+ "level": "isobaric",
+ "temperature": "Temperature_isobaric",
+ "surface_geopotential_height": "Geopotential_height_surface",
+ "geopotential_height": "Geopotential_height_isobaric",
+ "geopotential": None,
+ "u_wind": "u-component_of_wind_isobaric",
+ "v_wind": "v-component_of_wind_isobaric",
+ }
+ NOAA_LEGACY = {
"time": "time",
"latitude": "lat",
"longitude": "lon",
@@ -66,6 +124,19 @@ class WeatherModelMapping:
"v_wind": "vgrdprs",
}
RAP = {
+ "time": "time",
+ "latitude": "y",
+ "longitude": "x",
+ "projection": "LambertConformal_Projection",
+ "level": "isobaric",
+ "temperature": "Temperature_isobaric",
+ "surface_geopotential_height": None,
+ "geopotential_height": "Geopotential_height_isobaric",
+ "geopotential": None,
+ "u_wind": "u-component_of_wind_isobaric",
+ "v_wind": "v-component_of_wind_isobaric",
+ }
+ RAP_LEGACY = {
"time": "time",
"latitude": "lat",
"longitude": "lon",
@@ -90,6 +161,19 @@ class WeatherModelMapping:
"u_wind": "ugrdprs",
"v_wind": "vgrdprs",
}
+ CMC_LEGACY = {
+ "time": "time",
+ "latitude": "lat",
+ "longitude": "lon",
+ "level": "lev",
+ "ensemble": "ens",
+ "temperature": "tmpprs",
+ "surface_geopotential_height": None,
+ "geopotential_height": "hgtprs",
+ "geopotential": None,
+ "u_wind": "ugrdprs",
+ "v_wind": "vgrdprs",
+ }
GEFS = {
"time": "time",
"latitude": "lat",
@@ -103,6 +187,19 @@ class WeatherModelMapping:
"u_wind": "ugrdprs",
"v_wind": "vgrdprs",
}
+ GEFS_LEGACY = {
+ "time": "time",
+ "latitude": "lat",
+ "longitude": "lon",
+ "level": "lev",
+ "ensemble": "ens",
+ "temperature": "tmpprs",
+ "surface_geopotential_height": None,
+ "geopotential_height": "hgtprs",
+ "geopotential": None,
+ "u_wind": "ugrdprs",
+ "v_wind": "vgrdprs",
+ }
HIRESW = {
"time": "time",
"latitude": "lat",
@@ -114,6 +211,17 @@ class WeatherModelMapping:
"u_wind": "ugrdprs",
"v_wind": "vgrdprs",
}
+ HIRESW_LEGACY = {
+ "time": "time",
+ "latitude": "lat",
+ "longitude": "lon",
+ "level": "lev",
+ "temperature": "tmpprs",
+ "surface_geopotential_height": "hgtsfc",
+ "geopotential_height": "hgtprs",
+ "u_wind": "ugrdprs",
+ "v_wind": "vgrdprs",
+ }
MERRA2 = {
"time": "time",
"latitude": "lat",
@@ -127,29 +235,78 @@ class WeatherModelMapping:
"u_wind": "U",
"v_wind": "V",
}
+ MERRA2_LEGACY = {
+ "time": "time",
+ "latitude": "lat",
+ "longitude": "lon",
+ "level": "lev",
+ "temperature": "T",
+ "surface_geopotential_height": None,
+ "surface_geopotential": "PHIS",
+ "geopotential_height": "H",
+ "geopotential": None,
+ "u_wind": "U",
+ "v_wind": "V",
+ }
def __init__(self):
- """Initialize the class, creates a dictionary with all the weather models
- available and their respective dictionaries with the variables."""
+ """Build the lookup table with default and legacy mapping aliases."""
self.all_dictionaries = {
"GFS": self.GFS,
+ "GFS_LEGACY": self.GFS_LEGACY,
"NAM": self.NAM,
+ "NAM_LEGACY": self.NAM_LEGACY,
"ECMWF_v0": self.ECMWF_v0,
"ECMWF": self.ECMWF,
"NOAA": self.NOAA,
+ "NOAA_LEGACY": self.NOAA_LEGACY,
"RAP": self.RAP,
+ "RAP_LEGACY": self.RAP_LEGACY,
"CMC": self.CMC,
+ "CMC_LEGACY": self.CMC_LEGACY,
"GEFS": self.GEFS,
+ "GEFS_LEGACY": self.GEFS_LEGACY,
"HIRESW": self.HIRESW,
+ "HIRESW_LEGACY": self.HIRESW_LEGACY,
"MERRA2": self.MERRA2,
+ "MERRA2_LEGACY": self.MERRA2_LEGACY,
}
def get(self, model):
+ """Return a mapping dictionary by model alias (case-insensitive).
+
+ Parameters
+ ----------
+ model : str
+ Mapping alias name, such as ``"GFS"`` or ``"GFS_LEGACY"``.
+
+ Returns
+ -------
+ dict
+ Dictionary mapping RocketPy canonical weather keys to dataset
+ variable names.
+
+ Raises
+ ------
+ KeyError
+ If ``model`` is unknown or not a string.
+ """
+ if not isinstance(model, str):
+ raise KeyError(
+ f"Model {model} not found in the WeatherModelMapping. "
+ f"The available models are: {self.all_dictionaries.keys()}"
+ )
+
try:
return self.all_dictionaries[model]
- except KeyError as e:
+ except KeyError as exc:
+ model_casefold = model.casefold()
+ for key, value in self.all_dictionaries.items():
+ if key.casefold() == model_casefold:
+ return value
+
raise KeyError(
f"Model {model} not found in the WeatherModelMapping. "
f"The available models are: {self.all_dictionaries.keys()}"
- ) from e
+ ) from exc
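The fallback in `get` amounts to an exact lookup followed by a case-insensitive scan. A minimal stand-alone sketch of the same pattern (the `mappings` contents here are an illustrative subset, not the full RocketPy tables):

```python
# Illustrative subset of the mapping tables; only the "level" key is shown.
mappings = {"GFS": {"level": "isobaric"}, "GFS_LEGACY": {"level": "lev"}}

def get_mapping(name):
    """Exact lookup first, then a case-insensitive scan, else KeyError."""
    try:
        return mappings[name]
    except KeyError as exc:
        wanted = name.casefold()
        for key, value in mappings.items():
            if key.casefold() == wanted:
                return value
        raise KeyError(f"Model {name} not found in mappings.") from exc

print(get_mapping("gfs_legacy")["level"])  # lev
```

Using `casefold` rather than `lower` is slightly more robust for non-ASCII aliases, and keeping the exact lookup first means the common case stays O(1).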
diff --git a/tests/integration/environment/test_environment.py b/tests/integration/environment/test_environment.py
index c5e1103bd..5802650dc 100644
--- a/tests/integration/environment/test_environment.py
+++ b/tests/integration/environment/test_environment.py
@@ -232,8 +232,11 @@ def test_gefs_atmosphere(mock_show, example_spaceport_env): # pylint: disable=u
example_spaceport_env : rocketpy.Environment
Example environment object to be tested.
"""
- example_spaceport_env.set_atmospheric_model(type="Ensemble", file="GEFS")
- assert example_spaceport_env.all_info() is None
+ with pytest.raises(
+ ValueError,
+ match="GEFS latest-model shortcut is currently unavailable",
+ ):
+ example_spaceport_env.set_atmospheric_model(type="Ensemble", file="GEFS")
@pytest.mark.slow
@@ -288,13 +291,15 @@ def test_hiresw_ensemble_atmosphere(mock_show, example_spaceport_env): # pylint
example_spaceport_env.set_date(date_info)
- example_spaceport_env.set_atmospheric_model(
- type="Forecast",
- file="HIRESW",
- dictionary="HIRESW",
- )
-
- assert example_spaceport_env.all_info() is None
+ with pytest.raises(
+ ValueError,
+ match="HIRESW latest-model shortcut is currently unavailable",
+ ):
+ example_spaceport_env.set_atmospheric_model(
+ type="Forecast",
+ file="HIRESW",
+ dictionary="HIRESW",
+ )
@pytest.mark.skip(reason="CMC model is currently not working")
diff --git a/tests/unit/environment/test_environment.py b/tests/unit/environment/test_environment.py
index 6ad3e51db..6d04c089f 100644
--- a/tests/unit/environment/test_environment.py
+++ b/tests/unit/environment/test_environment.py
@@ -7,6 +7,7 @@
from rocketpy import Environment
from rocketpy.environment.tools import geodesic_to_utm, utm_to_geodesic
+from rocketpy.environment.weather_model_mapping import WeatherModelMapping
@pytest.mark.parametrize(
@@ -243,3 +244,70 @@ def test_environment_export_environment_exports_valid_environment_json(
)
os.remove("environment.json")
+
+
+class _DummyDataset:
+ """Small test double that mimics a netCDF dataset variables mapping."""
+
+ def __init__(self, variable_names):
+ self.variables = {name: object() for name in variable_names}
+
+
+def test_resolve_dictionary_keeps_compatible_mapping(example_plain_env):
+ """Keep the user-selected mapping when it already matches dataset keys."""
+ gfs_mapping = example_plain_env._Environment__weather_model_map.get("GFS")
+ dataset = _DummyDataset(
+ [
+ "time",
+ "lat",
+ "lon",
+ "isobaric",
+ "Temperature_isobaric",
+ "Geopotential_height_isobaric",
+ "u-component_of_wind_isobaric",
+ "v-component_of_wind_isobaric",
+ ]
+ )
+
+ resolved = example_plain_env._Environment__resolve_dictionary_for_dataset(
+ gfs_mapping, dataset
+ )
+
+ assert resolved is gfs_mapping
+
+
+def test_resolve_dictionary_falls_back_to_legacy_mapping(example_plain_env):
+ """Fallback to a compatible built-in mapping for legacy NOMADS-style files."""
+ thredds_gfs_mapping = example_plain_env._Environment__weather_model_map.get("GFS")
+ dataset = _DummyDataset(
+ [
+ "time",
+ "lat",
+ "lon",
+ "lev",
+ "tmpprs",
+ "hgtprs",
+ "ugrdprs",
+ "vgrdprs",
+ ]
+ )
+
+ resolved = example_plain_env._Environment__resolve_dictionary_for_dataset(
+ thredds_gfs_mapping, dataset
+ )
+
+ # Explicit legacy mappings should be preferred over unrelated model mappings.
+ assert resolved == example_plain_env._Environment__weather_model_map.get(
+ "GFS_LEGACY"
+ )
+ assert resolved["level"] == "lev"
+ assert resolved["temperature"] == "tmpprs"
+ assert resolved["geopotential_height"] == "hgtprs"
+
+
+def test_weather_model_mapping_exposes_legacy_aliases():
+ """Legacy mapping names should be available and case-insensitive."""
+ mapping = WeatherModelMapping()
+
+ assert mapping.get("GFS_LEGACY")["temperature"] == "tmpprs"
+ assert mapping.get("gfs_legacy")["temperature"] == "tmpprs"