Models

Collection of (generative) models for ultrasound imaging.

zea contains a collection of models for various tasks, all located in the zea.models package.

Currently, the following models are available (all of which inherit from zea.models.BaseModel):

Presets for these models can be found in zea.models.presets.

To use these models, you can import them directly from the zea.models module and load the pretrained weights using the from_preset() method. For example:

from zea.models import UNet

model = UNet.from_preset("unet-echonet-inpainter")

You can list all available presets using the presets attribute:

presets = list(UNet.presets.keys())
print(f"Available built-in zea presets for UNet: {presets}")

zea generative models

In addition to the models above, zea provides both classical and deep generative models for tasks such as image generation, inpainting, and denoising. These models inherit from zea.models.generative.GenerativeModel or zea.models.deepgenerative.DeepGenerativeModel and typically provide some additional methods, such as:

  • fit() for training the model on data

  • sample() for generating new samples from the learned distribution

  • posterior_sample() for drawing samples from the posterior given measurements

  • log_density() for computing the log-probability of data under the model

The following generative models are currently available:

An example of how to use the zea.models.diffusion.DiffusionModel is shown below:

from zea.models import DiffusionModel

model = DiffusionModel.from_preset("diffusion-echonet-dynamic")
samples = model.sample(n_samples=4)
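
The remaining methods listed above are used in a similar way. The continuation below is a sketch only: the exact argument names passed to posterior_sample() and log_density() are assumptions here, so consult the documentation of the specific model class for the precise signatures.

# Continuing from the example above (sketch; argument names are assumptions)

# Draw samples from the posterior given measurements, e.g. for inpainting:
# posterior_samples = model.posterior_sample(measurements, n_samples=4)

# Compute the log-probability of data under the learned distribution:
# log_p = model.log_density(samples)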

Contributing and adding new models

Please follow the guidelines in the Contributing page if you would like to contribute a new model to zea.

The following steps are recommended when adding a new model:

  1. Create a new module in the zea.models package for your model: zea.models.mymodel.

  2. Add a model class that inherits from zea.models.base.Model. For generative models, inherit from zea.models.generative.GenerativeModel or zea.models.deepgenerative.DeepGenerativeModel as appropriate. Make sure you implement the call() method (a minimal sketch is given after these steps).

  3. Upload the pretrained model weights to our Hugging Face organization. The upload should consist of a config.json and a model.weights.h5 file; see the Keras documentation for how to save these from your model. You can simply drag and drop the files onto the Hugging Face website to upload them.

    Tip

    It is recommended to use the saving procedure described above. However, alternative saving methods are also possible; see the zea.models.echonet.EchoNet module for an example. In that case, you do have to implement a custom_load_weights() method in your model class.

  4. Add a preset for the model in zea.models.presets. This allows you to define multiple weights presets for a given model architecture.

  5. Make sure to register the presets in your model module by importing the presets module and calling register_presets with the model class as an argument.
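
Putting these steps together, a minimal sketch of a new model module is shown below. The layer choice, the preset dictionary layout, the Hugging Face handle, and the exact register_presets signature are assumptions for illustration only; use an existing module such as zea.models.echonet as the reference for the canonical pattern.

# zea/models/mymodel.py -- minimal sketch (step numbers refer to the list above).
# The preset format and register_presets signature are assumptions; follow an
# existing module in zea.models for the canonical pattern.
import keras

from zea.models import presets          # step 5: presets module for registration
from zea.models.base import Model       # step 2: base model class


class MyModel(Model):
    """Example model with a single convolutional layer (illustrative only)."""

    def __init__(self, filters=16, **kwargs):
        super().__init__(**kwargs)
        self.conv = keras.layers.Conv2D(filters, kernel_size=3, padding="same")

    def call(self, inputs):
        # Required forward pass (step 2).
        return self.conv(inputs)


# Step 3 (after training): save the files to upload to Hugging Face with the
# standard Keras calls, e.g.
#   model.save_weights("model.weights.h5")
#   json.dump(model.get_config(), open("config.json", "w"))

# Step 4: define a weights preset. The dictionary layout below and the
# Hugging Face handle are hypothetical; see zea.models.presets for the
# actual structure used by the existing models.
MYMODEL_PRESETS = {
    "mymodel-example": {
        "description": "Example preset for MyModel.",
        "hf_handle": "hf://zea/mymodel-example",
    },
}

# Step 5: register the presets for this model class
# (argument order is an assumption).
presets.register_presets(MYMODEL_PRESETS, MyModel)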