# initializer

## Constant

paddle.fluid.initializer.Constant

alias of ConstantInitializer

## Uniform

paddle.fluid.initializer.Uniform

alias of UniformInitializer

## Normal

paddle.fluid.initializer.Normal

alias of NormalInitializer

## Xavier

paddle.fluid.initializer.Xavier

alias of XavierInitializer

## MSRA

paddle.fluid.initializer.MSRA

alias of MSRAInitializer

## ConstantInitializer

class paddle.fluid.initializer.ConstantInitializer(value=0.0, force_cpu=False)

Implements the constant initializer

## UniformInitializer

class paddle.fluid.initializer.UniformInitializer(low=-1.0, high=1.0, seed=0)

Implements the random uniform distribution initializer
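What this initializer computes can be illustrated with a minimal pure-Python sketch; `uniform_init` and its arguments are illustrative helpers, not part of the paddle API:

```python
import random

def uniform_init(num_params, low=-1.0, high=1.0, seed=42):
    # Draw every parameter independently from U(low, high).
    # (Hypothetical helper: in paddle the initializer fills a tensor instead.)
    rng = random.Random(seed)
    return [rng.uniform(low, high) for _ in range(num_params)]

# Example: initialize 100 weights in [-0.5, 0.5]
w = uniform_init(100, low=-0.5, high=0.5)
```

Passing an explicit `seed` makes the draw reproducible across runs, which is why the class exposes a `seed` parameter.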

## NormalInitializer

class paddle.fluid.initializer.NormalInitializer(loc=0.0, scale=1.0, seed=0)

Implements the random Normal(Gaussian) distribution initializer

## XavierInitializer

class paddle.fluid.initializer.XavierInitializer(uniform=True, fan_in=None, fan_out=None, seed=0)

Implements the Xavier initializer

This class implements the Xavier weight initializer from the paper Understanding the difficulty of training deep feedforward neural networks[1] by Xavier Glorot and Yoshua Bengio.

This initializer is designed to keep the scale of the gradients approximately the same in all layers. For the uniform distribution, the range is [-x, x], where x = sqrt(6 / (fan_in + fan_out)). For the normal distribution, the mean is 0 and the standard deviation is sqrt(2 / (fan_in + fan_out)).

References

[1] Understanding the difficulty of training deep feedforward neural networks. International Conference on Artificial Intelligence and Statistics. (http://proceedings.mlr.press/v9/glorot10a.html)
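The two formulas above can be sketched in plain Python; `xavier_limit` and `xavier_std` are illustrative names, not paddle APIs:

```python
import math
import random

def xavier_limit(fan_in, fan_out):
    # Bound of the uniform range [-x, x]: x = sqrt(6 / (fan_in + fan_out))
    return math.sqrt(6.0 / (fan_in + fan_out))

def xavier_std(fan_in, fan_out):
    # Standard deviation of the zero-mean normal: sqrt(2 / (fan_in + fan_out))
    return math.sqrt(2.0 / (fan_in + fan_out))

# Example: weights of a small fully connected layer with fan_in=8, fan_out=4
limit = xavier_limit(8, 4)
weights = [random.uniform(-limit, limit) for _ in range(8 * 4)]
```

Both variants give the weights the same variance, sqrt(2 / (fan_in + fan_out))^2, which is what keeps the gradient scale roughly constant across layers.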

## MSRAInitializer

class paddle.fluid.initializer.MSRAInitializer(uniform=True, fan_in=None, seed=0)

Implements the MSRA initializer, a.k.a. the Kaiming initializer

This class implements the weight initialization from the paper Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification[1] by Kaiming He, Xiangyu Zhang, Shaoqing Ren and Jian Sun. It is a robust initialization method that explicitly accounts for rectifier nonlinearities. For the uniform distribution, the range is [-x, x], where x = sqrt(6 / fan_in). For the normal distribution, the mean is 0 and the standard deviation is sqrt(2 / fan_in).

References

[1] Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification. (https://arxiv.org/abs/1502.01852)
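The MSRA formulas differ from Xavier only in using fan_in alone; a minimal sketch, with `msra_limit` and `msra_std` as illustrative names rather than paddle APIs:

```python
import math
import random

def msra_limit(fan_in):
    # Bound of the uniform range [-x, x]: x = sqrt(6 / fan_in)
    return math.sqrt(6.0 / fan_in)

def msra_std(fan_in):
    # Standard deviation of the zero-mean normal: sqrt(2 / fan_in)
    return math.sqrt(2.0 / fan_in)

# Example: a 3x3 conv filter over 16 input channels, so fan_in = 3 * 3 * 16
fan_in = 3 * 3 * 16
std = msra_std(fan_in)
weights = [random.gauss(0.0, std) for _ in range(fan_in)]
```

Dropping fan_out from the denominator compensates for ReLU zeroing out half of the activations, which is why this initializer is preferred for rectifier networks.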