questions about implementation details #2

@yuanhangtangle

Description

Thank you for your repo. I ran into some problems while reading the code and the paper.

Q1. In fd_waterbird.py, I can't find the code corresponding to $P(R|X)$. It seems that you simply drop some elements with torch.distributions.binomial.Binomial rather than actually modeling $P(R|X)$. What is $P(R|X)$ for, then?
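For context on what the quoted drop actually computes, here is a minimal numpy emulation of that Binomial masking (standard inverted dropout); all names and shapes are made up for illustration, and the repo uses torch tensors instead:

```python
import numpy as np

# Numpy emulation of the Binomial drop quoted below (inverted dropout).
# Names and shapes are illustrative only; the repo uses
# torch.distributions.binomial.Binomial on torch tensors.
rng = np.random.default_rng(0)
p = 0.5                                    # drop probability
feature = rng.normal(size=(4, 8))          # stand-in feature map

mask = rng.binomial(n=1, p=1 - p, size=feature.shape)  # keep each unit w.p. 1-p
fea = feature * mask * (1.0 / (1 - p))     # rescale so E[fea] == feature
```

Each draw of `mask` gives a different masked view of `feature`, so one reading is that the mask itself plays the role of a sample from $P(R|X)$ — but that is exactly what this question asks the authors to confirm.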

Q2. The code following the drop mentioned in Q1 is quite confusing:

for jj in range(args.samples - 1):
    if add_n:
        binomial = torch.distributions.binomial.Binomial(probs=1 - p)
        fea = feature * binomial.sample(feature.size()).cuda() * (1.0 / (1 - p))
    else:
        fea = feature
    logit_compose = logit_compose + classifier(
        fea, Xp[j * bs_m:(j + 1) * bs_m, :, :, :])  # TODO:

It seems that logit_compose = logit_compose + classifier(fea, Xp[j * bs_m:(j + 1) * bs_m, :, :, :]) is run multiple times without any modification: the loop variable is jj, but the slice is indexed by j, so the Xp slice never changes across iterations.
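For comparison, if the intent were Monte Carlo averaging over independently re-sampled dropout masks, each iteration would draw a fresh mask, roughly like this numpy sketch (a plain linear map `W` stands in for the real classifier, and the Xp slicing is omitted; both are assumptions for illustration only):

```python
import numpy as np

# Sketch: accumulate classifier logits over several independently
# re-sampled dropout masks (Monte Carlo averaging). W is a hypothetical
# linear "classifier"; the repo's Xp argument is omitted.
rng = np.random.default_rng(1)
p, n_samples = 0.5, 10
feature = rng.normal(size=(4, 8))
W = rng.normal(size=(8, 3))

logit_compose = np.zeros((4, 3))
for _ in range(n_samples):
    mask = rng.binomial(n=1, p=1 - p, size=feature.shape)
    fea = feature * mask / (1 - p)         # fresh mask every iteration
    logit_compose = logit_compose + fea @ W
logit_compose /= n_samples                 # average over sampled masks
```

In the quoted snippet, only `fea` varies across iterations (and only when add_n is true), while the Xp slice stays fixed — which is the behavior this question is asking about.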

Q3. I can't find the variables representing Ni and Nj described in the paper (5.3 Experimental Settings: "We set Nj = 256 and Ni = 10 for all experiments and denote it as Ours"). It seems that the code mentioned in Q1 and Q2 is the key to the implementation, but I don't see how it relates to the front-door formula derived in the paper.
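For reference, the front-door adjustment has the generic form $P(Y \mid do(X)) = \sum_r P(r \mid X) \sum_{x'} P(x')\, P(Y \mid r, x')$, and a natural Monte Carlo estimate uses Ni samples of $r$ and Nj samples of $x'$. The sketch below is only my guess at that structure; `sample_r` and `p_y_given` are hypothetical placeholders, not the repo's functions:

```python
import numpy as np

# Hypothetical Monte Carlo reading of the front-door formula:
# average P(Y | r, x') over Ni draws of r ~ P(r|x) and Nj draws of
# x' from the empirical data distribution. All functions are placeholders.
rng = np.random.default_rng(2)
Ni, Nj = 10, 256                           # values quoted from the paper

def sample_r(x):                           # placeholder for r ~ P(r|x)
    return x + rng.normal(scale=0.1, size=x.shape)

def p_y_given(r, x_prime):                 # placeholder for P(Y=1 | r, x')
    return 1.0 / (1.0 + np.exp(-(r + x_prime).mean()))

x = np.ones(4)
data = rng.normal(size=(1000, 4))          # empirical stand-in for P(x')

est = 0.0
for _ in range(Ni):                        # outer average over r ~ P(r|x)
    r = sample_r(x)
    idx = rng.integers(0, len(data), size=Nj)
    for x_prime in data[idx]:              # inner average over x' ~ P(x')
        est += p_y_given(r, x_prime)
est /= Ni * Nj                             # Monte Carlo front-door estimate
```

If the dropout masks in Q1/Q2 are the samples of $r$ and the Xp slices are the samples of $x'$, then args.samples and the Xp batch size might correspond to Ni and Nj, but that mapping is exactly what I'm asking you to clarify.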

Looking forward to your reply. Thanks in advance.
