feat: 402 time interpolation mass conserving interpolation for accumulated fields #544
base: main
Conversation
training/src/anemoi/training/config/diagnostics/evaluation.yaml
Thanks Rilwan for addressing the comments and the nice work. Before merging, it would be good if you could update the branch with main and check that the tests still pass: https://github.com/ecmwf/anemoi-core/actions/runs/18645927148 and https://github.com/ecmwf/anemoi-core/actions/runs/18645957033.
@Rilwan-Adewoyin currently there seem to be some errors with flash_attention - the integration tests are trying to pick that up and failing. Please check that before merging.
```yaml
- diagnostics: evaluation
- hardware: example
- graph: multi_scale
- model: graphtransformer
```
The integration test errors are likely coming from this. We don't yet test the transformer as part of our integration tests, as flash attention can't be installed and we had issues when compiling flex attention. I'd change this back to graph-transformer for now.
Noted, updated in b16e868 - I'll check whether the tests are passing soon.
Thanks for this contribution @Rilwan-Adewoyin, this is very valuable.
```diff
     batch: torch.Tensor,
     validation_mode: bool = False,
-) -> tuple[torch.Tensor, Mapping[str, torch.Tensor]]:
+) -> tuple[torch.Tensor, Mapping[str, torch.Tensor], list[torch.Tensor]]:
```
This now doesn't match the signature of the abstract method on the parent class.
Thanks for highlighting this. It turns out the abstract method had the wrong signature: even the base class expects three outputs from _step when it is called in training_step and validation_step, as shown below:
train_loss, _, _ = self._step(batch)
I updated the base class _step signature in 99923bc.
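To illustrate why the abstract signature must match its callers, here is a minimal sketch of the three-output contract. The class names, the dummy loss, and the toy subclass are all illustrative inventions, not the actual anemoi-training code:

```python
from abc import ABC, abstractmethod
from typing import Any, Mapping


class BaseForecaster(ABC):
    """Hypothetical stand-in for the real base class; names are illustrative."""

    @abstractmethod
    def _step(
        self,
        batch: Any,
        validation_mode: bool = False,
    ) -> tuple[Any, Mapping[str, Any], list[Any]]:
        """Return (loss, metrics, predictions) so callers can unpack three values."""

    def training_step(self, batch: Any) -> Any:
        # The caller unpacks three outputs, so the abstract _step signature
        # must also declare three return values.
        train_loss, _, _ = self._step(batch)
        return train_loss


class ToyForecaster(BaseForecaster):
    def _step(self, batch, validation_mode=False):
        loss = sum(batch)          # dummy scalar loss
        metrics = {"loss": loss}   # dummy metrics mapping
        preds = list(batch)        # dummy per-step predictions
        return loss, metrics, preds
```

If the abstract method declared only two return values, static checkers would flag the subclass override even though the runtime unpacking in training_step already relies on three.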
Hi @cosunae, I hope to have experiments detailing the specific thing you've mentioned done by next week. Will update when ready.
Closes #402

Description
Context and discussion on this PR: #402

What problem does this change solve?
- Non-breaking change; only affects the Time Interpolator
- Implements mass conservation for temporal interpolation of accumulated variables
- Implements a processor to ensure input accumulated data has the correct format
- Updates the interpolator config
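As a rough illustration of the mass-conservation idea for accumulated fields (a per-grid-point sketch under assumed semantics; the function name and the uniform fallback are hypothetical, not the actual anemoi implementation): the interpolated sub-step increments are rescaled so they sum exactly to the known accumulation over the window.

```python
def mass_conserving_rescale(increments: list[float], total: float) -> list[float]:
    """Rescale interpolated sub-step increments of an accumulated field
    (e.g. precipitation) so their sum equals the known window total.

    increments: raw interpolated increments at one grid point, one per sub-step
    total:      accumulated value over the full window at that grid point
    """
    s = sum(increments)
    if s > 0:
        # Scale every sub-step by the same factor so the window total
        # ("mass") is conserved exactly.
        return [x * total / s for x in increments]
    # Interpolator predicted no accumulation: spread the total uniformly.
    return [total / len(increments)] * len(increments)


# mass_conserving_rescale([1.0, 3.0], 8.0) -> [2.0, 6.0]; sums back to 8.0
```

The key property is that the rescaled increments always re-accumulate to the original total, whatever shape the raw interpolation produced.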
Summary of Changes
What issue or task does this change relate to?
#402
Additional notes
As a contributor to the Anemoi framework, please ensure that your changes include unit tests, updates to any affected dependencies and documentation, and have been tested in a parallel setting (i.e., with multiple GPUs). As a reviewer, you are also responsible for verifying these aspects and requesting changes if they are not adequately addressed. For guidelines, please refer to https://anemoi.readthedocs.io/en/latest/
By opening this pull request, I affirm that all authors agree to the Contributor License Agreement.
@Magnus-SI @radiradev @frazane @sahahner
📚 Documentation preview 📚: https://anemoi-training--544.org.readthedocs.build/en/544/
📚 Documentation preview 📚: https://anemoi-graphs--544.org.readthedocs.build/en/544/
📚 Documentation preview 📚: https://anemoi-models--544.org.readthedocs.build/en/544/