How to install? · diffcse · OPEN · 5 comments

voidism commented on May 25, 2024
How to install?

Comments (5)

voidism commented on May 25, 2024

Hi! I just changed the code in diffcse/tool.py in f724b2c so that it always uses the [CLS] vector before the pooler; in our experiments that works better. By doing so, we no longer use ['roberta.pooler.dense.weight', 'roberta.pooler.dense.bias'], so it is fine to ignore that warning.
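
For reference, a minimal sketch (not the repository's exact code) of taking the [CLS] vector before the pooler with plain transformers; the model name is the one from this thread, everything else is an assumed usage pattern:

import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("voidism/diffcse-roberta-base-sts")
model = AutoModel.from_pretrained("voidism/diffcse-roberta-base-sts")

inputs = tokenizer(["A sentence to embed."], padding=True, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# last_hidden_state[:, 0] is the [CLS] representation before the pooler;
# pooler_output would pass through roberta.pooler, which DiffCSE does not use.
cls_embedding = outputs.last_hidden_state[:, 0]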

voidism commented on May 25, 2024

Hi @JhonDan1999

The warning is caused by loading our DiffCSE model (encoder-decoder architecture) into the RoBERTa model (encoder-only architecture).
So the decoder weights (the ones whose names start with aux_bert.) are not used when initializing RobertaModel.

If you only want to extract sentence embeddings without training, it should be fine, as you don't need the decoder for extraction.

For the second warning: if you're using RoBERTa, just make sure that your sentences are not longer than 514 tokens (including BOS/EOS), and then it should be fine.
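
As a hedged aside (not from the thread): since the aux_bert.* and generator.* weights belong only to the training-time decoder/generator, the expected "weights were not used" messages can be hidden by lowering the transformers logging verbosity before loading the model:

from transformers import logging

logging.set_verbosity_error()  # suppresses the expected initialization warnings

This only hides the messages; the encoder weights used for embeddings still load normally.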

voidism commented on May 25, 2024

You need to be in the DiffCSE folder to run from diffcse import DiffCSE.
If you want to import diffcse from anywhere, you can run pip install . in the DiffCSE folder to install DiffCSE globally.
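
A hedged sketch of that flow, assuming the wrapper mirrors SimCSE's tool interface with an encode() method (the method name is an assumption, not confirmed in this thread):

# run `pip install .` in the DiffCSE folder first, then from anywhere:
from diffcse import DiffCSE

model = DiffCSE("voidism/diffcse-roberta-base-sts")
embeddings = model.encode(["A sentence to embed."])  # encode() assumed, SimCSE-style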

stephenleo commented on May 25, 2024

Thanks! I was able to import it and run it on the STS-B dataset. However, loading the model throws this warning. Is it expected?

from DiffCSE.diffcse import DiffCSE
model = DiffCSE('voidism/diffcse-roberta-base-sts')
Some weights of RobertaModel were not initialized from the model checkpoint at voidism/diffcse-roberta-base-sts and are newly initialized: ['roberta.pooler.dense.weight', 'roberta.pooler.dense.bias']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.

JhonDan1999 commented on May 25, 2024

I am having an issue when I try to use the model through Hugging Face like this:

[screenshot: code loading voidism/diffcse-roberta-base-sts through Hugging Face transformers]

it gave me this long warning:

Some weights of the model checkpoint at voidism/diffcse-roberta-base-sts were not used when initializing RobertaModel:
then it shows all the layers like this:

aux_bert.encoder.layer.7.output.dense.bias', 'aux_bert.encoder.layer.8.intermediate.dense.bias', 'aux_bert.encoder.layer.10.intermediate.dense.bias', 'aux_bert.encoder.layer.4.intermediate.dense.bias', 'aux_bert.encoder.layer.9.intermediate.dense.weight', 'aux_bert.encoder.layer.11.attention.self.query.weight', 'generator.lm_head.dense.bias', 'generator.roberta.encoder.layer.5.attention.output.dense.weight', 'aux_bert.encoder.layer.7.output.LayerNorm.bias', 'generator.roberta.encoder.layer.0.attention.output.dense.bias', 'aux_bert.encoder.layer.6.output.LayerNorm.weight', 'aux_bert.encoder.layer.4.output.dense.weight', 'aux_bert.encoder.layer.2.attention.output.LayerNorm.bias', 'generator.roberta.encoder.layer.5.intermediate.dense.bias', 'generator.roberta.encoder.layer.2.attention.self.query.weight', 'aux_bert.encoder.layer.9.intermediate.dense.bias', 'generator.roberta.encoder.layer.4.attention.self.key.weight', 'generator.roberta.encoder.layer.4.attention.output.dense.weight', 'generator.roberta.encoder.layer.1.output.LayerNorm.bias', 'generator.roberta.embeddings.position_ids', 'generator.roberta.encoder.layer.3.intermediate.dense.bias', 'aux_bert.encoder.layer.1.attention.self.key.bias', 'aux_bert.encoder.layer.2.output.dense.weight', 'generator.roberta.encoder.layer.2.intermediate.dense.bias', 'aux_bert.encoder.layer.8.attention.self.value.weight', 'aux_bert.encoder.layer.10.attention.self.key.bias', 'aux_bert.encoder.layer.9.attention.self.value.weight', 'generator.lm_head.decoder.bias', 'generator.roberta.encoder.layer.3.attention.output.dense.weight', 'aux_bert.encoder.layer.3.output.dense.bias', 'aux_bert.encoder.layer.0.output.LayerNorm.weight', 'aux_bert.encoder.layer.1.attention.output.LayerNorm.bias', 'generator.roberta.encoder.layer.2.attention.self.key.weight', 'generator.roberta.encoder.layer.0.output.dense.weight', 'aux_bert.encoder.layer.5.attention.self.key.weight', 'aux_bert.encoder.layer.11.intermediate.dense.weight', 'aux_bert.encoder.layer.6.attention.output.dense.weight']
- This IS expected if you are initializing RobertaModel from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).
- This IS NOT expected if you are initializing RobertaModel from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).
Some weights of RobertaModel were not initialized from the model checkpoint at voidism/diffcse-roberta-base-sts and are newly initialized: ['roberta.pooler.dense.bias', 'roberta.pooler.dense.weight']

at the end it gives this:
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference. Asking to truncate to max_length but no maximum length is provided and the model has no predefined maximum length. Default to no truncation.

I am not sure if I can ignore this and use the model. Please confirm.
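
For the trailing truncation warning, a hedged workaround (the 512-token cap is an assumption consistent with RoBERTa-base, not stated in this thread) is to pass max_length explicitly when tokenizing:

from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("voidism/diffcse-roberta-base-sts")
inputs = tokenizer(
    ["A sentence to embed."],
    padding=True,
    truncation=True,
    max_length=512,  # explicit cap avoids "no maximum length is provided"
    return_tensors="pt",
)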
