trw.transforms
¶
This module is dedicated to data augmentations. In particular, we strive to provide both a numpy and a pytorch implementation for each augmentation so that it can also be performed on GPU.
Transforms are designed to work for n-dimensional data.
Submodules¶
trw.transforms.copy
trw.transforms.crop
trw.transforms.cutout_function
trw.transforms.flip
trw.transforms.normalize
trw.transforms.pad
trw.transforms.renormalize
trw.transforms.resize
trw.transforms.stack
trw.transforms.transforms
trw.transforms.transforms_compose
trw.transforms.transforms_normalize
trw.transforms.transforms_random_crop
trw.transforms.transforms_random_cutout
trw.transforms.transforms_random_flip
trw.transforms.transforms_resize
Package Contents¶
Classes¶
- Transform – Abstraction of a batch transform
- TransformBatchWithCriteria – Helper transform to apply a given transform function on features that satisfy a criteria
- TransformRandomCrop – Add padding on a numpy array of samples and random crop to original size
- TransformRandomCropJoint – Add random padding & cropping on numpy or Torch arrays; the same padding/cropping is applied jointly to all the selected arrays
- TransformRandomFlip – Randomly flip the axis of selected features
- TransformRandomFlipJoint – Randomly flip the axis of selected features in a joint fashion (if a feature is selected and a sample is flipped, it is flipped for all selected features)
- TransformRandomCutout – Randomly remove a part of the selected features (cutout)
- TransformResize – Resize a tensor to a fixed size
- TransformNormalize – Normalize a tensor image with mean and standard deviation
- TransformCompose – Sequentially apply a list of transformations
Functions¶
- transform_batch_random_crop – Randomly crop a numpy array of samples given a target size; works for an arbitrary number of dimensions
- transform_batch_pad_numpy – Add padding on a numpy array of samples; works for an arbitrary number of dimensions
- transform_batch_pad_torch – Add padding on a Torch array of samples; works for an arbitrary number of dimensions
- transform_batch_pad – Add padding on a numpy or Torch array of samples; works for an arbitrary number of dimensions
- cutout – Remove a part of the image randomly
- renormalize – Transform the data so that it has the desired mean and standard deviation element-wise
- criteria_feature_name – Return True if the feature name belongs to a given set of names
- criteria_is_array_3_or_above – Return True if the feature is a numpy or torch array with dim >= 3
- trw.transforms.transform_batch_random_crop(array, crop_shape, offsets=None, return_offsets=False)¶
Randomly crop a numpy array of samples given a target size. This works for an arbitrary number of dimensions
- Parameters
array – a numpy or Torch array. Samples are stored in the first dimension
crop_shape – a sequence of size len(array.shape)-1 indicating the shape of the crop
offsets – if None, offsets will be randomly created to crop with crop_shape, else an array indicating the crop position for each sample
return_offsets – if True, returns a tuple (cropped array, offsets)
- Returns
a cropped array
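The cropping semantics above can be sketched in a few lines of numpy. This is an illustrative re-implementation, not the trw source; `batch_random_crop` is a hypothetical helper name, and offsets are drawn independently for each sample, as the docstring describes.

```python
import numpy as np

def batch_random_crop(array, crop_shape, offsets=None):
    # illustrative sketch, not the trw implementation:
    # crop each sample (stored on axis 0) to crop_shape at a random offset
    sample_shape = array.shape[1:]
    assert len(crop_shape) == len(sample_shape)
    if offsets is None:
        # one random offset per sample, chosen so the crop stays in bounds
        offsets = [
            [np.random.randint(0, s - c + 1) for s, c in zip(sample_shape, crop_shape)]
            for _ in range(array.shape[0])
        ]
    return np.stack([
        sample[tuple(slice(o, o + c) for o, c in zip(offset, crop_shape))]
        for sample, offset in zip(array, offsets)
    ])

batch = np.arange(2 * 6 * 6).reshape(2, 6, 6)
cropped = batch_random_crop(batch, crop_shape=(4, 4))
print(cropped.shape)  # (2, 4, 4)
```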
- trw.transforms.transform_batch_pad_numpy(array, padding, mode='edge', constant_value=0)¶
Add padding on a numpy array of samples. This works for an arbitrary number of dimensions
- Parameters
array – a numpy array. Samples are stored in the first dimension
padding – a sequence of size len(array.shape)-1 indicating the width of the padding to be added at the beginning and at the end of each dimension (except for dimension 0)
mode – numpy.pad mode
constant_value – the value used for padding when mode is ‘constant’
- Returns
a padded array
- trw.transforms.transform_batch_pad_torch(array, padding, mode='edge', constant_value=0)¶
Add padding on a Torch array of samples. This works for an arbitrary number of dimensions
This function mimics the API of transform_batch_pad_numpy so they can be easily interchanged.
- Parameters
array – a Torch array. Samples are stored in the first dimension
padding – a sequence of size len(array.shape)-1 indicating the width of the padding to be added at the beginning and at the end of each dimension (except for dimension 0)
mode – numpy.pad mode. Currently supported are (‘constant’, ‘edge’, ‘symmetric’)
constant_value – the value used for padding when mode is ‘constant’
- Returns
a padded array
- trw.transforms.transform_batch_pad(array, padding, mode='edge', constant_value=0)¶
Add padding on a numpy or Torch array of samples. This works for an arbitrary number of dimensions
- Parameters
array – a numpy or Torch array. Samples are stored in the first dimension
padding – a sequence of size len(array.shape)-1 indicating the width of the padding to be added at the beginning and at the end of each dimension (except for dimension 0)
mode – numpy.pad mode
constant_value – the value used for padding when mode is ‘constant’
- Returns
a padded array
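The padding semantics shared by the three functions above can be sketched with numpy. This is a hedged illustration, not the trw implementation; `batch_pad` is a hypothetical name, and it simply leaves axis 0 (the sample axis) unpadded, as the docstrings specify.

```python
import numpy as np

def batch_pad(array, padding, mode='edge', constant_value=0):
    # illustrative sketch: pad every dimension except axis 0 (the sample axis)
    full_padding = [(0, 0)] + [(p, p) for p in padding]
    if mode == 'constant':
        return np.pad(array, full_padding, mode='constant',
                      constant_values=constant_value)
    return np.pad(array, full_padding, mode=mode)

batch = np.ones((2, 4, 4))
padded = batch_pad(batch, padding=[2, 2], mode='constant', constant_value=0)
print(padded.shape)  # (2, 8, 8)
```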
- trw.transforms.flip(array, axis)¶
Flip an axis of an array
- Parameters
array – a numpy.ndarray or torch.Tensor n-dimensional array
axis – the axis to flip
- Returns
an array with specified axis flipped
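For the numpy side, the semantics amount to reversing one axis and returning a fresh array; a minimal sketch (not the trw source, which also dispatches on torch.Tensor):

```python
import numpy as np

def flip_axis(array, axis):
    # reverse the given axis; .copy() makes the result a contiguous new array
    return np.flip(array, axis=axis).copy()

a = np.array([[1, 2, 3],
              [4, 5, 6]])
print(flip_axis(a, axis=1))
# [[3 2 1]
#  [6 5 4]]
```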
- trw.transforms.copy(array)¶
Copy an array
- Parameters
array – a numpy.ndarray or torch.Tensor n-dimensional array
- Returns
a copy of the array
- trw.transforms.cutout(image, cutout_size, cutout_value_fn)¶
Remove a part of the image randomly
- Parameters
image – a numpy.ndarray or torch.Tensor n-dimensional array. Samples are stored on axis 0
cutout_size – the size of the region to be occluded
cutout_value_fn – the function used for occlusion. Must take the image as argument and modify it directly
- Returns
None
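The in-place occlusion described above can be sketched in numpy. This is an illustrative version, not the trw implementation; `cutout_constant` is a hypothetical helper that hard-codes a constant-value occlusion instead of taking a `cutout_value_fn`.

```python
import numpy as np

def cutout_constant(image, cutout_size, value=0):
    # illustrative sketch: occlude a random region of `cutout_size` in place
    offsets = [np.random.randint(0, s - c + 1)
               for s, c in zip(image.shape, cutout_size)]
    region = tuple(slice(o, o + c) for o, c in zip(offsets, cutout_size))
    image[region] = value  # modify the image directly; nothing is returned

image = np.ones((1, 8, 8))
cutout_constant(image, cutout_size=(1, 4, 4), value=0)
print(int(image.sum()))  # 48
```

16 of the 64 pixels are zeroed, wherever the random region lands.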
- trw.transforms.resize(array, size, mode='linear')¶
Resize the array
- Parameters
array – a N-dimensional tensor, representing 1D to 3D data (3 to 5 dimensional data with dim 0 for the samples and dim 1 for filters)
size – a (N-2) list to which the array will be upsampled or downsampled
mode – string among (‘nearest’, ‘linear’) specifying the resampling method
- Returns
a resized N-dimensional tensor
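The ‘nearest’ mode can be sketched with plain numpy indexing. This is an assumption-laden illustration (the trw implementation likely delegates to a proper resampling routine); `resize_nearest` is a hypothetical helper for the (samples, filters, d0, …) layout described above.

```python
import numpy as np

def resize_nearest(array, size):
    # nearest-neighbour resize of the spatial dims of a (N, C, ...) array
    spatial = array.shape[2:]
    assert len(size) == len(spatial)
    # for each output coordinate, pick the nearest source index
    indices = np.ix_(*[
        (np.arange(target) * source / target).astype(int)
        for source, target in zip(spatial, size)
    ])
    return array[(slice(None), slice(None)) + indices]

batch = np.arange(16, dtype=float).reshape(1, 1, 4, 4)
up = resize_nearest(batch, size=(8, 8))
print(up.shape)  # (1, 1, 8, 8)
```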
- trw.transforms.stack(sequence, axis=0)¶
Stack a sequence of arrays
- Parameters
sequence – a sequence of numpy.ndarray or torch.Tensor n-dimensional arrays
axis – the axis along which to stack
- Returns
the stacked array
- trw.transforms.normalize(array, mean, std)¶
Normalize a tensor image with mean and standard deviation.
Given mean: (M1,…,Mn) and std: (S1,..,Sn) for n channels, this transform will normalize each channel of the input torch.Tensor, input[channel] = (input[channel] - mean[channel]) / std[channel]
- Parameters
array – the torch array to normalize. Expected layout is (sample, filter, d0, … dN)
mean – a N-dimensional sequence
std – a N-dimensional sequence
- Returns
A normalized tensor such that the mean is 0 and std is 1
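The per-channel formula above is a broadcasted subtract-and-divide; a numpy sketch (`normalize_channels` is an illustrative name, not the library function):

```python
import numpy as np

def normalize_channels(array, mean, std):
    # per-channel normalization of a (sample, filter, d0, ..., dN) array
    shape = (1, len(mean)) + (1,) * (array.ndim - 2)  # broadcast over channels
    mean = np.asarray(mean, dtype=array.dtype).reshape(shape)
    std = np.asarray(std, dtype=array.dtype).reshape(shape)
    return (array - mean) / std

batch = np.random.randn(8, 3, 16, 16) * 2.0 + 5.0
normalized = normalize_channels(batch,
                                mean=batch.mean(axis=(0, 2, 3)),
                                std=batch.std(axis=(0, 2, 3)))
print(abs(normalized.mean()) < 1e-6)  # True
```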
- trw.transforms.renormalize(data, desired_mean, desired_std, current_mean=None, current_std=None)¶
Transform the data so that it has the desired mean and standard deviation element-wise
- Parameters
data – a torch or numpy array
desired_mean – the mean to transform data to
desired_std – the std to transform data to
current_mean – if the mean is known, do not recalculate it (e.g., training mean to be used in validation split)
current_std – if the std is known, do not recalculate it (e.g., training std to be used in validation split)
- Returns
a data with mean desired_mean and std desired_std
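The transformation reduces to standardizing with the current statistics, then rescaling to the desired ones. A hedged numpy sketch of these semantics (not the trw source, which also handles torch arrays):

```python
import numpy as np

def renormalize(data, desired_mean, desired_std,
                current_mean=None, current_std=None):
    # shift/scale `data` element-wise to the desired mean and std
    if current_mean is None:
        current_mean = data.mean()
    if current_std is None:
        current_std = data.std()
    return (data - current_mean) / current_std * desired_std + desired_mean

data = np.random.randn(1000) * 3.0 + 10.0
out = renormalize(data, desired_mean=0.0, desired_std=1.0)
```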
- class trw.transforms.TransformBatchWithCriteria(criteria_fn, transform_fn)¶
Bases:
Transform
Helper function to apply a given transform function on features that satisfy a criteria
- __call__(self, batch)¶
- trw.transforms.criteria_feature_name(feature_name, feature_value, feature_names)¶
Return True if the feature name belongs to a given set of names
- trw.transforms.criteria_is_array_3_or_above(feature_name, feature_value)¶
Return True if the feature is a numpy or torch array dim >= 3
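These two criteria are simple predicates over (name, value) pairs; a plausible numpy-only sketch of what they check (the library versions presumably also accept torch.Tensor):

```python
import numpy as np

def criteria_feature_name(feature_name, feature_value, feature_names):
    # select a feature purely by its name
    return feature_name in feature_names

def criteria_is_array_3_or_above(feature_name, feature_value):
    # select array-like features with at least 3 dimensions
    # (numpy only here; the trw version also handles torch tensors)
    return isinstance(feature_value, np.ndarray) and feature_value.ndim >= 3

print(criteria_feature_name('images', None, {'images', 'masks'}))      # True
print(criteria_is_array_3_or_above('images', np.zeros((4, 1, 8, 8))))  # True
print(criteria_is_array_3_or_above('labels', np.zeros(4)))             # False
```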
- class trw.transforms.TransformRandomCrop(padding, criteria_fn=None, mode='edge', constant_value=0, size=None)¶
Bases:
trw.transforms.transforms.TransformBatchWithCriteria
Add padding on a numpy array of samples and random crop to original size
- Parameters
padding – a sequence of size len(array.shape)-1 indicating the width of the padding to be added at the beginning and at the end of each dimension (except for dimension 0). If None, no padding added
criteria_fn – function applied on each feature. If satisfied, the feature will be transformed, if not the original feature is returned
mode – numpy.pad mode. Currently supported are (‘constant’, ‘edge’, ‘symmetric’)
size – the size of the cropped image. If None, same size as input image
- Returns
a randomly cropped batch
- class trw.transforms.TransformRandomCropJoint(feature_names, padding, mode='edge', constant_value=0, size=None)¶
Bases:
trw.transforms.transforms.TransformBatchJointWithCriteria
Add random padding & cropping on numpy or Torch arrays. The arrays are jointly transformed: the same padding/cropping is applied to all the selected arrays
- Parameters
feature_names – these are the features that will be jointly padded and cropped
padding – a sequence of size len(array.shape)-1 indicating the width of the padding to be added at the beginning and at the end of each dimension (except for dimension 0). If None, no padding added
criteria_fn – function applied on each feature. If satisfied, the feature will be transformed, if not the original feature is returned
mode – numpy.pad mode. Currently supported are (‘constant’, ‘edge’, ‘symmetric’)
size – the size of the cropped image. If None, same size as input image
- Returns
a randomly cropped batch
- class trw.transforms.TransformRandomFlip(axis, flip_probability=0.5, criteria_fn=None)¶
Bases:
trw.transforms.transforms.TransformBatchWithCriteria
Randomly flip the axis of selected features
- class trw.transforms.TransformRandomFlipJoint(feature_names, axis, flip_probability=0.5)¶
Bases:
trw.transforms.transforms.TransformBatchWithCriteria
- Randomly flip the axis of selected features in a joint fashion (if a feature is selected and a sample is flipped, it will be flipped for all selected features)
- class trw.transforms.TransformRandomCutout(cutout_size, criteria_fn=None, cutout_value_fn=functools.partial(cutout_function.cutout_value_fn_constant, value=0))¶
Bases:
trw.transforms.transforms.TransformBatchWithCriteria
Randomly remove a part of the selected features (cutout)
- class trw.transforms.TransformResize(size, criteria_fn=None, mode='linear')¶
Bases:
trw.transforms.transforms.TransformBatchWithCriteria
Resize a tensor to a fixed size
- class trw.transforms.TransformNormalize(mean, std, criteria_fn=None)¶
Bases:
trw.transforms.transforms.TransformBatchWithCriteria
Normalize a tensor image with mean and standard deviation.
Given mean: (M1,…,Mn) and std: (S1,..,Sn) for n channels, this transform will normalize each channel of the input torch.Tensor, input[channel] = (input[channel] - mean[channel]) / std[channel]
- Parameters
array – the torch array to normalize. Expected layout is (sample, filter, d0, … dN)
mean – a N-dimensional sequence
std – a N-dimensional sequence
criteria_fn – function applied on each feature. If satisfied, the feature will be transformed, if not the original feature is returned
- Returns
A normalized batch such that the mean is 0 and std is 1 for the selected features
- class trw.transforms.TransformCompose(transforms)¶
Bases:
trw.transforms.transforms.Transform
Sequentially apply a list of transformations
- __call__(self, batch)¶
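Since each transform is a callable taking a batch and returning a batch, sequential composition is just a fold over the list. A minimal sketch of this idea, using plain functions on dict batches (`compose` and `double` are illustrative names, not the trw API):

```python
def compose(transforms):
    # minimal sketch of sequential composition: apply callables in order
    def apply(batch):
        for transform in transforms:
            batch = transform(batch)
        return batch
    return apply

def double(batch):
    # an illustrative "transform": multiply every feature by 2
    return {name: value * 2 for name, value in batch.items()}

pipeline = compose([double, double])
print(pipeline({'x': 1}))  # {'x': 4}
```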