zea.models.lpips

LPIPS model for perceptual similarity.

See the original code at https://github.com/richzhang/PerceptualSimilarity and the paper at https://arxiv.org/abs/1801.03924

Functions

linear_model()

Get the linear head model for LPIPS.

perceptual_model()

Get the VGG16 model for perceptual loss.

Classes

LPIPS(*args, **kwargs)

Learned Perceptual Image Patch Similarity (LPIPS) metric.

class zea.models.lpips.LPIPS(*args, **kwargs)[source]

Bases: BaseModel

Learned Perceptual Image Patch Similarity (LPIPS) metric.

call(inputs)[source]

Compute the LPIPS metric.

Parameters:

inputs (list) – List of two input images of shape [B, H, W, C] or [H, W, C]. Images should be in the range [-1, 1].
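
A minimal usage sketch (illustrative, not taken from the zea documentation); it assumes the model can be constructed without arguments and called Keras-style on a list of two image batches:

    import numpy as np
    from zea.models.lpips import LPIPS

    model = LPIPS()
    # Pretrained weights would typically be loaded first;
    # see custom_load_weights below.

    # Two batches of images in [-1, 1] with shape [B, H, W, C].
    rng = np.random.default_rng(seed=0)
    image_a = rng.uniform(-1.0, 1.0, size=(4, 64, 64, 3)).astype("float32")
    image_b = rng.uniform(-1.0, 1.0, size=(4, 64, 64, 3)).astype("float32")

    # Lower scores indicate higher perceptual similarity.
    distance = model([image_a, image_b])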

custom_load_weights(preset, **kwargs)[source]

Load the weights for the VGG and linear models.
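
For example (a sketch; the preset name below is a placeholder, not a confirmed zea preset, and should be replaced by one actually shipped with the library):

    from zea.models.lpips import LPIPS

    model = LPIPS()
    # "lpips" is a hypothetical preset identifier used for illustration only.
    model.custom_load_weights("lpips")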

static preprocess_input(image)[source]

Preprocess the input images for the VGG model.

Parameters:

image (Tensor) – Input image tensor of shape [B, H, W, C] and values in the range [-1, 1].

Returns:

Preprocessed image tensor of shape [B, H, W, C] with values standardized for the VGG model.

Return type:

Tensor
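
As an illustration, the reference implementation (richzhang/PerceptualSimilarity) standardizes [-1, 1] inputs with fixed per-channel shift and scale constants. The sketch below mirrors that transform; zea's preprocess_input may differ in detail:

    import numpy as np

    # Per-channel constants from the reference LPIPS implementation;
    # zea's preprocess_input may use different values.
    SHIFT = np.array([-0.030, -0.088, -0.188], dtype="float32")
    SCALE = np.array([0.458, 0.448, 0.450], dtype="float32")

    def standardize_for_vgg(image):
        # image: [B, H, W, C] in [-1, 1] -> standardized for the VGG backbone.
        return (image - SHIFT) / SCALE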

zea.models.lpips.linear_model()[source]

Get the linear head model for LPIPS.

zea.models.lpips.perceptual_model()[source]

Get the VGG16 model for perceptual loss.
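
The two models compose into the LPIPS score roughly as follows: VGG16 features from several layers are unit-normalized along the channel axis, differenced and squared, weighted by the learned linear head, and averaged over spatial positions. A NumPy sketch of that combination (the function names and weight layout are illustrative, not zea's API):

    import numpy as np

    def unit_normalize(feat, eps=1e-10):
        # L2-normalize feature maps along the channel axis.
        norm = np.sqrt(np.sum(feat ** 2, axis=-1, keepdims=True))
        return feat / (norm + eps)

    def lpips_distance(feats_a, feats_b, linear_weights):
        # feats_a, feats_b: per-layer [B, H, W, C] feature maps from the VGG backbone.
        # linear_weights: per-layer [C] vectors, the learned linear head.
        total = 0.0
        for fa, fb, w in zip(feats_a, feats_b, linear_weights):
            diff = (unit_normalize(fa) - unit_normalize(fb)) ** 2
            # Weighted sum over channels, then average over spatial positions.
            total = total + np.mean(np.sum(diff * w, axis=-1), axis=(1, 2))
        return total  # one LPIPS score per batch element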