# zamba.pytorch.utils

## Functions

### build_multilayer_perceptron

```python
build_multilayer_perceptron(
    input_size: int,
    hidden_layer_sizes: Optional[Tuple[int]],
    output_size: int,
    activation: Optional[torch.nn.Module] = torch.nn.ReLU,
    dropout: Optional[float] = None,
    output_dropout: Optional[float] = None,
    output_activation: Optional[torch.nn.Module] = None,
) -> torch.nn.Sequential
```

Builds a multilayer perceptron.
Parameters:

Name | Type | Description | Default
---|---|---|---
`input_size` | `int` | Size of the first input layer. | required
`hidden_layer_sizes` | `tuple of int` | If provided, sizes of the hidden layers. | required
`output_size` | `int` | Size of the last output layer. | required
`activation` | `torch.nn.Module` | Activation layer inserted between each pair of layers. | `torch.nn.ReLU`
`dropout` | `float` | If provided, insert dropout layers with this dropout rate between each pair of layers. | `None`
`output_dropout` | `float` | If provided, insert a dropout layer with this dropout rate before the output layer. | `None`
`output_activation` | `torch.nn.Module` | Activation layer applied after the final layer. | `None`
Returns:

Type | Description
---|---
`torch.nn.Sequential` | The assembled multilayer perceptron.
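The function body is not reproduced here, but the documented signature and parameter descriptions suggest roughly the following construction. This is a minimal sketch, not zamba's actual implementation; in particular, the name `build_mlp_sketch` and the placement of the `output_dropout` layer immediately before the final linear layer are assumptions.

```python
from typing import Optional, Tuple

import torch


def build_mlp_sketch(
    input_size: int,
    hidden_layer_sizes: Optional[Tuple[int, ...]],
    output_size: int,
    activation=torch.nn.ReLU,
    dropout: Optional[float] = None,
    output_dropout: Optional[float] = None,
    output_activation=None,
) -> torch.nn.Sequential:
    """Hypothetical re-implementation based only on the documented signature."""
    layers = []
    in_size = input_size
    # One Linear (+ activation, + dropout) block per hidden layer.
    for hidden_size in hidden_layer_sizes or ():
        layers.append(torch.nn.Linear(in_size, hidden_size))
        if activation is not None:
            layers.append(activation())
        if dropout is not None:
            layers.append(torch.nn.Dropout(dropout))
        in_size = hidden_size
    # Assumption: output dropout sits just before the final linear layer.
    if output_dropout is not None:
        layers.append(torch.nn.Dropout(output_dropout))
    layers.append(torch.nn.Linear(in_size, output_size))
    if output_activation is not None:
        layers.append(output_activation())
    return torch.nn.Sequential(*layers)


# Example: an 8 -> 32 -> 16 -> 4 network with dropout between hidden layers.
mlp = build_mlp_sketch(8, (32, 16), 4, dropout=0.2, output_activation=torch.nn.Sigmoid)
out = mlp(torch.zeros(5, 8))
```

With no `hidden_layer_sizes`, the sketch degenerates to a single `Linear(input_size, output_size)` layer, optionally wrapped by the output dropout and activation.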
Source code in zamba/pytorch/utils.py