
alpha value is not consistent with paper? #10

@eedavidwu

In the paper, the loss when y = 1 is loss = -alpha * (1 - y')^{gamma} * log(y').
But in the code, when y = 1 it works out to:
loss = -(1 - alpha) * (1 - y')^{gamma} * log(y')
So should alpha be set the opposite way?
In the code:

    if isinstance(alpha, (float, int)):
        self.alpha = torch.Tensor([alpha, 1 - alpha])
    ...
    at = self.alpha.gather(0, target.data.view(-1))
    logpt = logpt * Variable(at)
Is this right?
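For concreteness, here is a minimal, torch-free sketch of the weighting being questioned. The helper name `focal_loss_weight` and its signature are mine for illustration, not the repo's API; it mirrors the `[alpha, 1 - alpha]` table indexed by the target label:

```python
import math

def focal_loss_weight(alpha, gamma, p, target):
    """Hypothetical helper illustrating the alpha weighting (not the repo's code).

    p is the predicted probability of class 1; target is 0 or 1.
    """
    # Mirrors: self.alpha = torch.Tensor([alpha, 1 - alpha])
    #          at = self.alpha.gather(0, target.data.view(-1))
    alpha_table = [alpha, 1.0 - alpha]
    at = alpha_table[target]
    # pt is the model's probability for the true class
    pt = p if target == 1 else 1.0 - p
    # Per-example focal loss term: -at * (1 - pt)^gamma * log(pt)
    return -at * (1.0 - pt) ** gamma * math.log(pt)
```

Under this table, a positive example (target = 1) is weighted by 1 - alpha, whereas the paper's formula for y = 1 weights it by alpha. So if this reading is right, reproducing the paper's alpha = 0.25 would require passing alpha = 0.75 to the code.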
