
Comments (5)

lsongx commented on August 17, 2024

Glad that I have addressed your concerns.

The equation in the paper seems wrong.
It should be

Thanks for your question!

from weighted-soft-label-distillation.

HolmesShuan commented on August 17, 2024

Hi @LcDog , thanks for your quick reply!

I have double-checked the derivation in Heskes's paper and gradually understood the decomposition of the expected error defined on the KL divergence. Unfortunately, I still do not see how the fact that E(y) and E(\bar{y}) both equal 1 leads to

I am really sorry for bothering you again :(

As far as I can tell,

I am not sure how to use the fact that E(y) and E(\bar{y}) both equal 1 in the above equation. It seems that

Many thanks!


HolmesShuan commented on August 17, 2024

Cool! The derivation is very clear!

It seems that

according to

The notation is OK, but I found E[y]=1 and E[\overline{y}]=1 quite confusing, which makes me wonder whether there are any tricks in Equation (2). 😅

By the way, could you further explain how to get the equation in your first comment?

As for

I just quoted the conclusion in the ICLR paper "The derivation of the variance term is based on the facts that $\frac{\log \overline{y}_{ce}}{\mathbb{E}_\mathcal{D}[\log \hat{y}_{ce}]}$ is a constant and ...".

Thanks for your responses!


lsongx commented on August 17, 2024

Hi @HolmesShuan , thanks for your question!
The reason is that E(y) and E(\bar{y}) both equal 1. Please refer to the equation above Eqn 4 in Heskes's paper: note the extra \bar{p}(y) in the second row.
This follows the definition used in Eqn 2 of Heskes's paper, which requires \int dy a(y) = 1 (the constraints in Eqn 2).
In our notation, y becomes x and a() becomes y=t(). We originally kept all notation the same as in Heskes's paper, but a reviewer thought it was not appropriate, so we changed it.
Sorry for the confusion! Please let me know if there are any further questions.
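Written out (a sketch reconstructed from Heskes's decomposition; here $t$ is the target distribution, $\hat{y}$ the prediction over dataset draws $\mathcal{D}$, and $\bar{y} \propto \exp(\mathbb{E}_\mathcal{D}[\log \hat{y}])$ the normalized geometric mean), the normalization constraint enters like this:

```latex
% Sketch of where \sum_x t(x) = \sum_x \bar{y}(x) = 1 enters. With
%   Z = \sum_x \exp(\mathbb{E}_\mathcal{D}[\log \hat{y}(x)]),
%   \bar{y}(x) = \exp(\mathbb{E}_\mathcal{D}[\log \hat{y}(x)]) / Z :
\begin{align}
\mathbb{E}_\mathcal{D}[\mathrm{KL}(t\,\|\,\hat{y})]
  &= \textstyle\sum_x t(x)\log t(x) - \sum_x t(x)\,\mathbb{E}_\mathcal{D}[\log\hat{y}(x)] \\
  &= \textstyle\sum_x t(x)\log t(x) - \sum_x t(x)\log\bar{y}(x) - \log Z
     && \text{(uses } \textstyle\sum_x t(x)=1 \text{)} \\
  &= \mathrm{KL}(t\,\|\,\bar{y}) - \log Z, \\
\mathbb{E}_\mathcal{D}[\mathrm{KL}(\bar{y}\,\|\,\hat{y})]
  &= \textstyle\sum_x \bar{y}(x)\log\bar{y}(x)
     - \sum_x \bar{y}(x)\log\bar{y}(x) - \log Z
     && \text{(uses } \textstyle\sum_x \bar{y}(x)=1 \text{)} \\
  &= -\log Z,
\end{align}
% so expected error = bias + variance, and the variance term -\log Z >= 0.
```

Without the two normalization constraints, the constant $\log Z$ would not pull out of the sums and the bias and variance terms would not add up to the expected error.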


lsongx commented on August 17, 2024

No problem! You are very welcome to ask anything~

No need to decompose the two terms. We first have

then

so

similarly, we have

> By the way, could you further explain how to get the equation in your first comment?

It seems that our notation is somewhat misleading.
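A quick numerical sanity check of the decomposition being discussed (a sketch, not the paper's code: random targets and predictions, with $\bar{y}$ the normalized geometric mean of the predictions, as in Heskes's result):

```python
import numpy as np

rng = np.random.default_rng(0)

# Target distribution t over 5 classes, and 10 "dataset draws" of predictions.
t = rng.dirichlet(np.ones(5))
y_hat = rng.dirichlet(np.ones(5), size=10)  # each row sums to 1

def kl(p, q):
    """KL divergence KL(p || q) between two discrete distributions."""
    return float(np.sum(p * (np.log(p) - np.log(q))))

# Normalized geometric mean of the predictions (the "average" model).
log_mean = np.log(y_hat).mean(axis=0)
y_bar = np.exp(log_mean) / np.exp(log_mean).sum()

expected_error = np.mean([kl(t, y) for y in y_hat])   # E_D[KL(t || y_hat)]
bias = kl(t, y_bar)                                   # KL(t || y_bar)
variance = np.mean([kl(y_bar, y) for y in y_hat])     # E_D[KL(y_bar || y_hat)]

# The decomposition holds exactly, and only because t and y_bar each sum to 1.
assert np.isclose(expected_error, bias + variance)
assert variance >= 0.0
print("expected error:", expected_error, "= bias + variance:", bias + variance)
```

The assertion only goes through because both t and y_bar are normalized; with an unnormalized geometric mean the constant term would not cancel.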

