Description
Thank you for sharing your repo. I ran into some problems while reading the code and the paper.
Q1. In fd_waterbird.py, I can't find the code corresponding to the sampling described in the paper, other than torch.distributions.binomial.Binomial; this looks like random dropout rather than actually modeling the distribution.
Q2. The code following the dropout mentioned in Q1 is quite confusing:

```python
for jj in range(args.samples - 1):
    if add_n:
        binomial = torch.distributions.binomial.Binomial(probs=1 - p)
        fea = feature * binomial.sample(feature.size()).cuda() * (1.0 / (1 - p))
    else:
        fea = feature
    logit_compose = logit_compose + classifier(fea, Xp[j * bs_m:(j + 1) * bs_m, :, :, :])
```

It seems that `logit_compose = logit_compose + classifier(fea, Xp[j * bs_m:(j + 1) * bs_m, :, :, :])` is run multiple times without any modification.
Q3. I can't find the variables representing Ni and Nj described in the paper (5.3 Experimental Settings: "We set Nj = 256 and Ni = 10 for all experiments and denote it as Ours"). It seems that the code mentioned in Q1 and Q2 is the key to the implementation, but I don't know how it relates to the front-door formula derived in the paper.
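For reference, this is the standard front-door adjustment from causal inference (not something I can confirm about your code): with a mediator $z$ between $x$ and $y$,

$$P(y \mid do(x)) = \sum_z P(z \mid x) \sum_{x'} P(x') \, P(y \mid x', z),$$

which can be approximated by Monte Carlo sampling with $N_i$ draws of $z$ and $N_j$ draws of $x'$:

$$P(y \mid do(x)) \approx \frac{1}{N_i}\sum_{i=1}^{N_i} \frac{1}{N_j}\sum_{j=1}^{N_j} P(y \mid x'_j, z_i).$$

My guess is that the dropout-sampled features play the role of the $z_i$ and the `Xp` batch plays the role of the $x'_j$, but please correct me if I'm wrong.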
Looking forward to your reply. Thanks in advance.