initializer

Constant

paddle.fluid.initializer.Constant

alias of ConstantInitializer

Uniform

paddle.fluid.initializer.Uniform

alias of UniformInitializer

Normal

paddle.fluid.initializer.Normal

alias of NormalInitializer

Xavier

paddle.fluid.initializer.Xavier

alias of XavierInitializer

Bilinear

paddle.fluid.initializer.Bilinear

alias of BilinearInitializer
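
Since the short names above are plain aliases, both spellings refer to the same class. A minimal sketch verifying this (assuming paddle.fluid is importable):

import paddle.fluid as fluid

# Each short name is the same class object as its *Initializer form.
assert fluid.initializer.Constant is fluid.initializer.ConstantInitializer
assert fluid.initializer.Uniform is fluid.initializer.UniformInitializer
assert fluid.initializer.Xavier is fluid.initializer.XavierInitializer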

force_init_on_cpu

paddle.fluid.initializer.force_init_on_cpu()

Returns whether variables are forced to be initialized on CPU.

Examples

import paddle.fluid as fluid
if fluid.initializer.force_init_on_cpu():
    step = fluid.layers.create_global_var(
        shape=[1], value=0.0, dtype='float32')

init_on_cpu

paddle.fluid.initializer.init_on_cpu(*args, **kwds)

Force variables created within this context manager to be initialized on CPU.

Examples

import paddle.fluid as fluid
with fluid.initializer.init_on_cpu():
    step = fluid.layers.create_global_var(
        shape=[1], value=0.0, dtype='float32')

ConstantInitializer

class paddle.fluid.initializer.ConstantInitializer(value=0.0, force_cpu=False)

Implements the constant initializer

Parameters:
  • value (float) – constant value to initialize the variable
  • force_cpu (bool) – whether to initialize the variable on CPU

Examples

import paddle.fluid as fluid
x = fluid.layers.data(name="data", shape=[32], dtype="float32")
fc = fluid.layers.fc(input=x, size=10,
    param_attr=fluid.initializer.Constant(value=2.0))

UniformInitializer

class paddle.fluid.initializer.UniformInitializer(low=-1.0, high=1.0, seed=0)

Implements the random uniform distribution initializer

Parameters:
  • low (float) – lower boundary of the uniform distribution
  • high (float) – upper boundary of the uniform distribution
  • seed (int) – random seed

Examples

import paddle.fluid as fluid
x = fluid.layers.data(name="data", shape=[32], dtype="float32")
fc = fluid.layers.fc(input=x, size=10,
    param_attr=fluid.initializer.Uniform(low=-0.5, high=0.5))

NormalInitializer

class paddle.fluid.initializer.NormalInitializer(loc=0.0, scale=1.0, seed=0)

Implements the random normal (Gaussian) distribution initializer

Parameters:
  • loc (float) – mean of the normal distribution
  • scale (float) – standard deviation of the normal distribution
  • seed (int) – random seed

Examples

import paddle.fluid as fluid
x = fluid.layers.data(name="data", shape=[32], dtype="float32")
fc = fluid.layers.fc(input=x, size=10,
    param_attr=fluid.initializer.Normal(loc=0.0, scale=2.0))

XavierInitializer

class paddle.fluid.initializer.XavierInitializer(uniform=True, fan_in=None, fan_out=None, seed=0)

This class implements the Xavier weight initializer from the paper Understanding the difficulty of training deep feedforward neural networks by Xavier Glorot and Yoshua Bengio.

This initializer is designed to keep the scale of the gradients approximately the same in all layers. For the uniform distribution, the range is [-x, x], where

\[x = \sqrt{\frac{6.0}{fan\_in + fan\_out}}\]

For the normal distribution, the mean is 0 and the standard deviation is

\[\sqrt{\frac{2.0}{fan\_in + fan\_out}}\]
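
As a quick numeric check of these formulas, a minimal sketch (fan_in=64 and fan_out=32 are illustrative values, not defaults):

import math

fan_in, fan_out = 64.0, 32.0

# Uniform case: weights are drawn from [-x, x].
x = math.sqrt(6.0 / (fan_in + fan_out))    # = 0.25

# Normal case: weights have mean 0 and this standard deviation.
std = math.sqrt(2.0 / (fan_in + fan_out))  # ~0.144
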
Parameters:
  • uniform (bool) – whether to use uniform or normal distribution
  • fan_in (float) – fan_in for Xavier initialization. If None, it is inferred from the variable.
  • fan_out (float) – fan_out for Xavier initialization. If None, it is inferred from the variable.
  • seed (int) – random seed

Note

It is recommended to set fan_in and fan_out to None for most cases.

Examples

import paddle.fluid as fluid
queries = fluid.layers.data(name="queries", shape=[32], dtype="float32")
fc = fluid.layers.fc(
    input=queries, size=10,
    param_attr=fluid.initializer.Xavier(uniform=False))

BilinearInitializer

class paddle.fluid.initializer.BilinearInitializer

This initializer can be used in a transposed convolution operator to act as upsampling. Users can upsample a feature map of shape (B, C, H, W) by any integer factor. The usage is:

Examples

import math
import paddle.fluid as fluid

factor = 2   # upsampling factor
C = 2        # number of input/output channels
input = fluid.layers.data(name="data", shape=[C, 32, 32], dtype="float32")
w_attr = fluid.ParamAttr(learning_rate=0.,
                         regularizer=fluid.regularizer.L2Decay(0.),
                         initializer=fluid.initializer.Bilinear())
conv_up = fluid.layers.conv2d_transpose(
    input,
    num_filters=C,
    output_size=None,
    filter_size=2 * factor - factor % 2,
    padding=int(math.ceil((factor - 1) / 2.)),
    stride=factor,
    groups=C,
    param_attr=w_attr,
    bias_attr=False)

Here, num_filters=C and groups=C mean this is a channel-wise transposed convolution. The filter shape will be (C, 1, K, K), where K is filter_size. This initializer sets the same (K, K) bilinear interpolation kernel for every channel of the filter. The resulting output feature map will have shape (B, C, factor * H, factor * W). Note that the learning rate and the weight decay are set to 0 in order to keep the coefficients of the bilinear interpolation unchanged during training.
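
For intuition, the (K, K) kernel described above is the standard bilinear upsampling filter. A minimal NumPy sketch of that construction (the helper name bilinear_kernel is ours; matching Paddle's internal implementation exactly is an assumption):

import numpy as np

def bilinear_kernel(K):
    # Standard (K, K) bilinear interpolation kernel, as commonly used
    # to initialize FCN-style upsampling filters.
    factor = (K + 1) // 2
    center = factor - 1 if K % 2 == 1 else factor - 0.5
    og = np.ogrid[:K, :K]
    return ((1 - abs(og[0] - center) / factor) *
            (1 - abs(og[1] - center) / factor))

# factor = 2 gives filter_size K = 2 * 2 - 2 % 2 = 4
print(bilinear_kernel(4))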