Commit ec6c48b

norm not needed when reusing attention in lookvit
1 parent 547bf94 commit ec6c48b

File tree

2 files changed (+2 / -2 lines)

setup.py

Lines changed: 1 addition & 1 deletion
@@ -6,7 +6,7 @@
 setup(
   name = 'vit-pytorch',
   packages = find_packages(exclude=['examples']),
-  version = '1.7.1',
+  version = '1.7.2',
   license='MIT',
   description = 'Vision Transformer (ViT) - Pytorch',
   long_description=long_description,

vit_pytorch/look_vit.py

Lines changed: 1 addition & 1 deletion
@@ -77,7 +77,7 @@ def __init__(

         self.split_heads = Rearrange('b n (h d) -> b h n d', h = heads)

-        self.norm = LayerNorm(dim)
+        self.norm = LayerNorm(dim) if not reuse_attention else nn.Identity()
         self.attend = nn.Softmax(dim = -1)
         self.dropout = nn.Dropout(dropout)

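The rationale behind the change above: when an Attention block is configured to reuse an attention map computed by another block, its own input pre-normalization is not needed, so the LayerNorm is swapped for nn.Identity(). The GatedPreNorm module below is a minimal, hypothetical sketch of that gating pattern (it is not part of vit-pytorch); it only illustrates that nn.Identity() makes the norm a pass-through while leaving the module's structure unchanged.

import torch
from torch import nn

class GatedPreNorm(nn.Module):
    # hypothetical helper, not in the library: gates a pre-norm on a reuse flag
    def __init__(self, dim, reuse_attention = False):
        super().__init__()
        # same construct as the commit: LayerNorm only when attention is computed here
        self.norm = nn.LayerNorm(dim) if not reuse_attention else nn.Identity()

    def forward(self, x):
        return self.norm(x)

x = torch.randn(1, 4, 32)

normed = GatedPreNorm(32)(x)                               # LayerNorm applied
passthrough = GatedPreNorm(32, reuse_attention = True)(x)  # no-op

assert torch.equal(passthrough, x)  # nn.Identity() returns its input unchanged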