# Scheaven
# 2021-09-18 291deeb1fcf45dbf39a24aa72a213ff3fd6b3405
# encoding: utf-8
# Reconstructed source for layers/batch_norm.py (the file above was a compiled
# bytecode dump); class names, defaults, and docstrings recovered from the dump.
"""
@author:  liaoxingyu
@contact: sherlockliao01@gmail.com
"""

import logging

import torch
import torch.nn.functional as F
from torch import nn

__all__ = [
    "BatchNorm",
    "IBN",
    "GhostBatchNorm",
    "FrozenBatchNorm",
    "SyncBatchNorm",
    "get_norm",
]


class BatchNorm(nn.BatchNorm2d):
    def __init__(self, num_features, eps=1e-05, momentum=0.1, weight_freeze=False,
                 bias_freeze=False, weight_init=1.0, bias_init=0.0, **kwargs):
        super().__init__(num_features, eps=eps, momentum=momentum)
        if weight_init is not None: nn.init.constant_(self.weight, weight_init)
        if bias_init is not None: nn.init.constant_(self.bias, bias_init)
        self.weight.requires_grad_(not weight_freeze)
        self.bias.requires_grad_(not bias_freeze)


class SyncBatchNorm(nn.SyncBatchNorm):
    def __init__(self, num_features, eps=1e-05, momentum=0.1, weight_freeze=False,
                 bias_freeze=False, weight_init=1.0, bias_init=0.0, **kwargs):
        super().__init__(num_features, eps=eps, momentum=momentum)
        if weight_init is not None: nn.init.constant_(self.weight, weight_init)
        if bias_init is not None: nn.init.constant_(self.bias, bias_init)
        self.weight.requires_grad_(not weight_freeze)
        self.bias.requires_grad_(not bias_freeze)


class IBN(nn.Module):
    def __init__(self, planes, bn_norm, **kwargs):
        super(IBN, self).__init__()
        # InstanceNorm on the first half of the channels, batch norm on the rest.
        half1 = int(planes / 2)
        self.half = half1
        half2 = planes - half1
        self.IN = nn.InstanceNorm2d(half1, affine=True)
        self.BN = get_norm(bn_norm, half2, **kwargs)

    def forward(self, x):
        split = torch.split(x, self.half, 1)
        out1 = self.IN(split[0].contiguous())
        out2 = self.BN(split[1].contiguous())
        out = torch.cat((out1, out2), 1)
        return out


class GhostBatchNorm(BatchNorm):
    def __init__(self, num_features, num_splits=1, **kwargs):
        super().__init__(num_features, **kwargs)
        self.num_splits = num_splits
        self.register_buffer('running_mean', torch.zeros(num_features))
        self.register_buffer('running_var', torch.ones(num_features))

    def forward(self, input):
        N, C, H, W = input.shape
        if self.training or not self.track_running_stats:
            # Normalize each split with its own statistics, then average the
            # running statistics back down to per-channel values.
            self.running_mean = self.running_mean.repeat(self.num_splits)
            self.running_var = self.running_var.repeat(self.num_splits)
            outputs = F.batch_norm(
                input.view(-1, C * self.num_splits, H, W),
                self.running_mean, self.running_var,
                self.weight.repeat(self.num_splits), self.bias.repeat(self.num_splits),
                True, self.momentum, self.eps).view(N, C, H, W)
            self.running_mean = torch.mean(
                self.running_mean.view(self.num_splits, self.num_features), dim=0)
            self.running_var = torch.mean(
                self.running_var.view(self.num_splits, self.num_features), dim=0)
            return outputs
        else:
            return F.batch_norm(
                input, self.running_mean, self.running_var,
                self.weight, self.bias, False, self.momentum, self.eps)


class FrozenBatchNorm(BatchNorm):
    """
    BatchNorm2d where the batch statistics and the affine parameters are fixed.
    It contains non-trainable buffers called
    "weight" and "bias", "running_mean", "running_var",
    initialized to perform identity transformation.
    The pre-trained backbone models from Caffe2 only contain "weight" and "bias",
    which are computed from the original four parameters of BN.
    The affine transform `x * weight + bias` will perform the equivalent
    computation of `(x - running_mean) / sqrt(running_var) * weight + bias`.
    When loading a backbone model from Caffe2, "running_mean" and "running_var"
    will be left unchanged as identity transformation.
    Other pre-trained backbone models may contain all 4 parameters.
    The forward is implemented by `F.batch_norm(..., training=False)`.
    """

    _version = 3

    def __init__(self, num_features, eps=1e-05, **kwargs):
        super().__init__(num_features, weight_freeze=True, bias_freeze=True, **kwargs)
        self.num_features = num_features
        self.eps = eps

    def forward(self, x):
        if x.requires_grad:
            # Fold the frozen statistics into a per-channel scale and bias so the
            # whole normalization becomes a single multiply-add.
            scale = self.weight * (self.running_var + self.eps).rsqrt()
            bias = self.bias - self.running_mean * scale
            scale = scale.reshape(1, -1, 1, 1)
            bias = bias.reshape(1, -1, 1, 1)
            return x * scale + bias
        else:
            # When no gradient is needed, F.batch_norm is a fused op.
            return F.batch_norm(
                x, self.running_mean, self.running_var,
                self.weight, self.bias, training=False, eps=self.eps)

    def _load_from_state_dict(self, state_dict, prefix, local_metadata, strict,
                              missing_keys, unexpected_keys, error_msgs):
        version = local_metadata.get("version", None)
        if version is None or version < 2:
            # No running_mean/var in early versions; supply identity statistics.
            if prefix + "running_mean" not in state_dict:
                state_dict[prefix + "running_mean"] = torch.zeros_like(self.running_mean)
            if prefix + "running_var" not in state_dict:
                state_dict[prefix + "running_var"] = torch.ones_like(self.running_var)
        if version is not None and version < 3:
            logger = logging.getLogger(__name__)
            logger.info("FrozenBatchNorm {} is upgraded to version 3.".format(prefix.rstrip(".")))
            # In version < 3, running_var was stored with eps already added.
            state_dict[prefix + "running_var"] -= self.eps
        super()._load_from_state_dict(state_dict, prefix, local_metadata, strict,
                                      missing_keys, unexpected_keys, error_msgs)

    def __repr__(self):
        return "FrozenBatchNorm2d(num_features={}, eps={})".format(self.num_features, self.eps)

    @classmethod
    def convert_frozen_batchnorm(cls, module):
        """
        Convert BatchNorm/SyncBatchNorm in module into FrozenBatchNorm.
        Args:
            module (torch.nn.Module):
        Returns:
            If module is BatchNorm/SyncBatchNorm, returns a new module.
            Otherwise, in-place convert module and return it.
        Similar to convert_sync_batchnorm in
        https://github.com/pytorch/pytorch/blob/master/torch/nn/modules/batchnorm.py
        """
        bn_module = nn.modules.batchnorm
        bn_module = (bn_module.BatchNorm2d, bn_module.SyncBatchNorm)
        res = module
        if isinstance(module, bn_module):
            res = cls(module.num_features)
            if module.affine:
                res.weight.data = module.weight.data.clone().detach()
                res.bias.data = module.bias.data.clone().detach()
            res.running_mean.data = module.running_mean.data
            res.running_var.data = module.running_var.data
            res.eps = module.eps
        else:
            for name, child in module.named_children():
                new_child = cls.convert_frozen_batchnorm(child)
                if new_child is not child:
                    res.add_module(name, new_child)
        return res


def get_norm(norm, out_channels, **kwargs):
    """
    Args:
        norm (str or callable): either one of BN, GhostBN, FrozenBN, GN or SyncBN;
            or a callable that takes a channel number and returns
            the normalization layer as a nn.Module
        out_channels: number of channels for normalization layer

    Returns:
        nn.Module or None: the normalization layer
    """
    if isinstance(norm, str):
        if len(norm) == 0:
            return None
        norm = {
            "BN": BatchNorm,
            "GhostBN": GhostBatchNorm,
            "FrozenBN": FrozenBatchNorm,
            "GN": lambda channels, **args: nn.GroupNorm(32, channels),
            "syncBN": SyncBatchNorm,
        }[norm]
    return norm(out_channels, **kwargs)
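The equivalence that the FrozenBatchNorm docstring states — `x * weight + bias` computing the same result as `(x - running_mean) / sqrt(running_var) * weight + bias` — can be checked with plain floats, no torch required. This is a minimal sketch; the per-channel statistics below are arbitrary assumed values, not taken from any model:

```python
import math

# Hypothetical frozen BN statistics and affine parameters (assumed values).
mean, var, eps = 0.5, 4.0, 1e-5
weight, bias = 2.0, 0.25
x = 3.0

# Direct batch-norm computation.
direct = (x - mean) / math.sqrt(var + eps) * weight + bias

# Folded form used by FrozenBatchNorm.forward: precompute scale and shift once,
# then each element needs only one multiply-add.
scale = weight / math.sqrt(var + eps)
shift = bias - mean * scale
folded = x * scale + shift

assert abs(direct - folded) < 1e-9
```

Folding is why the frozen path is cheap: `scale` and `shift` depend only on the fixed statistics, so they can be reshaped to `(1, C, 1, 1)` and broadcast over the whole feature map.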