core's People

Contributors: dwymi02, jojoin

core's Issues

Summary of Data Races associated with [*Transaction_2_Simple]

  • Text file of instrumented Diamond Miner [DiamondMiner_log_Sep_21_2021.txt], available via Github Gist [https://gist.github.com/dwymi02/95140687b566efa22939dec3cd14dd6d]

  • The instrumented binary was built from refreshed code, dated Sep-21-2021.
    The source code repository was wiped clean and "git clone ..." was used to create the updated code base.

  • The instrumented code was built with "-race", e.g., "go build -race".

  • The directory [hacash_mainnet_data/] was completely removed, to allow the Hacash blockchain to be built from scratch, also allowing for additional Data Races to be uncovered, if any.

  • In searching for Data Races I tried to group them based on specific Hacash components.
    Grep was used to search for known Hacash components; the following is a list of potential Data Races, by Hacash component:

    • A search for [*P2P] returned 109 hits
    • A search for [*Block_v1] returned 4 hits
    • A search for [*ChainState] returned 3 hits
    • A search for [*DiamondMiner] returned 45 hits
    • A search for [*DeprecatedApiService] returned 33 hits
    • A search for [*Transaction_2_Simple] returned 14 hits
      While the number of potential hits is large, these Hacash components may or may not be proximate to the actual Data Race.

    For each Hacash component I will list the specific Data Race.
    When the current code does not match what was previously built, a [*** NOTE ***] will be included.
    When a stack trace is too lengthy, the results will be truncated; the full log can be provided on request.

    For the specific Read and Write involved in each Data Race, a [<===] marker will be used to identify where the Read/Write occurred, based on the currently available source ...

    Once the Data Race has been documented, an issue will be opened on Github.
    I will leave it to your team to decide how the Data Race is to be handled.
    You can opt to fix the Data Race or simply close the issue, as you see fit ...
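Since the report leans on the race detector throughout, here is a minimal, self-contained illustration (not Hacash code; all names are mine) of the kind of access pattern "go build -race" flags, together with the mutex that silences it:

```go
package main

import (
	"fmt"
	"sync"
)

// safeCount increments a shared counter from several goroutines,
// guarded by a mutex. Remove the Lock/Unlock pair and running this
// under "go run -race" prints a WARNING: DATA RACE report shaped
// exactly like the traces quoted in this issue.
func safeCount(goroutines, perGoroutine int) int {
	var mu sync.Mutex
	var wg sync.WaitGroup
	count := 0
	for i := 0; i < goroutines; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for j := 0; j < perGoroutine; j++ {
				mu.Lock()
				count++ // without the mutex, this line is the reported Write/Read
				mu.Unlock()
			}
		}()
	}
	wg.Wait()
	return count
}

func main() {
	fmt.Println(safeCount(8, 1000)) // prints 8000
}
```

Note the detector only reports races it actually observes at runtime, which is why rebuilding the chain from scratch (as above) surfaces more of them.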

=-=-=-=-=-=-=-=-=-=-=-=-=-

Data Races associated with [*Transaction_2_Simple]

  • Transaction_2_Simple Data Race # 1:
    [Diamond Miner] Diamond 38588 add to txpool, hx a378eecaafc42f7761a96fdc1d16a9c0ed92b01a3b618cf851480d49674e2dbb.

    WARNING: DATA RACE
    Write at 0x00c42110c020 by goroutine 71:
    core/transactions.(*Transaction_2_Simple).SetFee()
    core/transactions/type2.go:549 +0x49
    miner/diamondminer.(*DiamondMiner).doAutoBidForMyDiamond()
    miner/diamondminer/autobid.go:72 +0x871

    Previous read at 0x00c42110c020 by goroutine 60:
    core/transactions.(*Transaction_2_Simple).SerializeNoSignEx()
    core/transactions/type2.go:119 +0x44c
    core/transactions.(*Transaction_2_Simple).SerializeNoSign()
    core/transactions/type2.go:107 +0x41
    core/transactions.(*Transaction_2_Simple).Serialize()
    core/transactions/type2.go:70 +0x50
    node/backend.(*Backend).broadcastNewTxSubmit()
    node/backend/event.go:41 +0x167

    Write:
    [core/transactions/type2.go:549]
    func (trs *Transaction_2_Simple) SetFee(fee *fields.Amount) {
        trs.Fee = *fee <=== Write
        trs.ClearHash() // reset the hash cache
    }

    Read:
    [core/transactions/type2.go:119]
    func (trs *Transaction_2_Simple) SerializeNoSignEx(hasfee bool) ([]byte, error) {
        var buffer = new(bytes.Buffer)
        buffer.Write([]byte{trs.Type()}) // type
        b1, _ := trs.Timestamp.Serialize()
        buffer.Write(b1)
        b2, _ := trs.MainAddress.Serialize()
        buffer.Write(b2)
        if hasfee { // is the fee field needed?
            b3, _ := trs.Fee.Serialize() <=== Read
            buffer.Write(b3) // the fee payer's signature needs the fee field; otherwise it is omitted
        }
        b4, _ := trs.ActionCount.Serialize()
        buffer.Write(b4)
        for i := 0; i < len(trs.Actions); i++ {
            var bi, e = trs.Actions[i].Serialize()
            if e != nil {
                return nil, e
            }
            buffer.Write(bi)
        }
        //if nofee {
        //    fmt.Println("SerializeNoSignEx: " + hex.EncodeToString(buffer.Bytes()))
        //}
        return buffer.Bytes(), nil
    }
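One possible way to remove this race (a hypothetical sketch, not Hacash's actual code: txSketch and its single uint64 fee stand in for *Transaction_2_Simple and fields.Amount) is to take the same mutex in both the fee setter and the serializer, so the Write in SetFee and the Read in SerializeNoSignEx can never overlap:

```go
package main

import (
	"fmt"
	"sync"
)

// txSketch is a stripped-down stand-in for *Transaction_2_Simple:
// just the fee and a mutex guarding it.
type txSketch struct {
	mu  sync.Mutex
	fee uint64 // stands in for trs.Fee
}

// SetFee writes the fee under the lock, mirroring the racy Write site.
func (t *txSketch) SetFee(fee uint64) {
	t.mu.Lock()
	defer t.mu.Unlock()
	t.fee = fee
}

// SerializeNoSignEx reads the fee under the same lock, mirroring the
// racy Read site; the detector no longer reports the pair.
func (t *txSketch) SerializeNoSignEx(hasfee bool) []byte {
	t.mu.Lock()
	defer t.mu.Unlock()
	if hasfee {
		return []byte(fmt.Sprintf("fee:%d", t.fee))
	}
	return []byte("nofee")
}

func main() {
	tx := &txSketch{}
	var wg sync.WaitGroup
	wg.Add(2)
	go func() { defer wg.Done(); tx.SetFee(7) }()
	go func() { defer wg.Done(); _ = tx.SerializeNoSignEx(true) }()
	wg.Wait()
	fmt.Printf("%s\n", tx.SerializeNoSignEx(true)) // prints fee:7
}
```

Whether a mutex, a copy-before-broadcast, or some other scheme fits best is a design decision for the Hacash team.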

=-=-=-=-=-

  • Transaction_2_Simple Data Race # 2:
    WARNING: DATA RACE
    Write at 0x00c42110c0a0 by goroutine 71:
    core/transactions.(*Transaction_2_Simple).SetFee()
    core/transactions/type2.go:54 +0xb0
    miner/diamondminer.(*DiamondMiner).doAutoBidForMyDiamond()
    miner/diamondminer/autobid.go:72 +0x871

    Previous read at 0x00c42110c0a0 by goroutine 60:
    core/transactions.(*Transaction_2_Simple).HashWithFee()
    core/transactions/type2.go:212 +0x48
    node/backend.(*Backend).broadcastNewTxSubmit()
    node/backend/event.go:38 +0xd1

    Write:
    [core/transactions/type2.go:54]
    func (trs *Transaction_2_Simple) ClearHash() {
        trs.hashwithfee = nil <=== Write
        trs.hashnofee = nil
    }

    Read:
    [core/transactions/type2.go:212]
    func (trs *Transaction_2_Simple) HashWithFee() fields.Hash {
        if trs.hashwithfee == nil { <=== Read
            return trs.HashWithFeeFresh()
        }
        return trs.hashwithfee
    }
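The underlying pattern here is an unguarded lazy hash cache: one goroutine nils the cache while another checks it. A hypothetical race-free version of the same pattern (cachedHash and its fields are stand-ins, and SHA-256 stands in for whatever hash Hacash uses here) keeps every touch of the cache slot under one mutex:

```go
package main

import (
	"crypto/sha256"
	"fmt"
	"sync"
)

// cachedHash sketches a race-free lazy hash cache, the pattern behind
// HashWithFee/ClearHash in Race #2.
type cachedHash struct {
	mu    sync.Mutex
	body  []byte // the bytes being hashed
	cache []byte // nil means "not computed yet"
}

// Hash computes the digest on first use and caches it; both the nil
// check and the cache fill happen under the lock.
func (c *cachedHash) Hash() []byte {
	c.mu.Lock()
	defer c.mu.Unlock()
	if c.cache == nil {
		sum := sha256.Sum256(c.body)
		c.cache = sum[:]
	}
	return c.cache
}

// ClearHash invalidates the cache under the same lock, so it can no
// longer race with a concurrent reader.
func (c *cachedHash) ClearHash() {
	c.mu.Lock()
	c.cache = nil
	c.mu.Unlock()
}

func main() {
	c := &cachedHash{body: []byte("tx bytes")}
	fmt.Printf("%x\n", c.Hash())
}
```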

=-=-=-=-=-

  • Transaction_2_Simple Data Race # 3:
    WARNING: DATA RACE
    Write at 0x00c42110c0b8 by goroutine 71:
    core/transactions.(*Transaction_2_Simple).SetFee()
    core/transactions/type2.go:55 +0xe9
    miner/diamondminer.(*DiamondMiner).doAutoBidForMyDiamond()
    miner/diamondminer/autobid.go:72 +0x871

    Previous read at 0x00c42110c0b8 by goroutine 69:
    core/transactions.(*Transaction_2_Simple).Hash()
    core/transactions/type2.go:226 +0x48
    miner/diamondminer.(*DiamondMiner).successFindDiamondAddTxPool()
    miner/diamondminer/success.go:38 +0x6c2

    Write:
    [core/transactions/type2.go:55]
    func (trs *Transaction_2_Simple) ClearHash() {
        trs.hashwithfee = nil
        trs.hashnofee = nil <=== Write
    }

    Read:
    [core/transactions/type2.go:226 +0x48]
    func (trs *Transaction_2_Simple) Hash() fields.Hash {
        if trs.hashnofee == nil { <=== Read
            return trs.HashFresh()
        }
        return trs.hashnofee
    }

=-=-=-=-=-

  • Transaction_2_Simple Data Race # 4:
    WARNING: DATA RACE
    Write at 0x00c42110c0a0 by goroutine 71:
    core/transactions.(*Transaction_2_Simple).SetFee()
    core/transactions/type2.go:54 +0xb0
    miner/diamondminer.(*DiamondMiner).doAutoBidForMyDiamond()
    miner/diamondminer/autobid.go:72 +0x871

    Previous read at 0x00c42110c0a0 by goroutine 60:
    core/transactions.(*Transaction_2_Simple).HashWithFee()
    core/transactions/type2.go:212 +0x48
    node/backend.(*Backend).broadcastNewTxSubmit()
    node/backend/event.go:38 +0xd1

    Write:
    [core/transactions/type2.go:54]
    func (trs *Transaction_2_Simple) ClearHash() {
        trs.hashwithfee = nil <=== Write
        trs.hashnofee = nil
    }

    Read:
    [core/transactions/type2.go:212]
    func (trs *Transaction_2_Simple) HashWithFee() fields.Hash {
        if trs.hashwithfee == nil { <=== Read
            return trs.HashWithFeeFresh()
        }
        return trs.hashwithfee
    }

=-=-=-=-=-

  • Transaction_2_Simple Data Race # 5:
    WARNING: DATA RACE
    Write at 0x00c4211100a8 by goroutine 71:
    core/transactions.(*Transaction_2_Simple).addOneSign()
    core/transactions/type2.go:379 +0x3c0
    core/transactions.(*Transaction_2_Simple).FillNeedSigns()
    core/transactions/type2.go:335 +0x179
    miner/diamondminer.(*DiamondMiner).doAutoBidForMyDiamond()
    miner/diamondminer/autobid.go:78 +0xa28

    Previous read at 0x00c4211100a8 by goroutine 60:
    core/fields.(*Sign).Serialize()
    core/fields/sign.go:23 +0xdf
    core/transactions.(*Transaction_2_Simple).Serialize()
    core/transactions/type2.go:83 +0x1ad
    node/backend.(*Backend).broadcastNewTxSubmit()
    node/backend/event.go:41 +0x167

    Write:
    [core/transactions/type2.go:379]
    func (trs *Transaction_2_Simple) addOneSign(hash []byte, addrPrivates map[string][]byte, address []byte) error {
        // check whether the private key exists
        privitebytes, has := addrPrivates[string(address)]
        if !has {
            return fmt.Errorf("Private Key '" + account.Base58CheckEncode(address) + "' necessary")
        }
        privite, e1 := account.GetAccountByPriviteKey(privitebytes)
        if e1 != nil {
            return fmt.Errorf("Private Key '" + account.Base58CheckEncode(address) + "' error")
        }
        // check whether the signature already exists; if so, remove it and re-add
        var alreadly = -1
        for i, sig := range trs.Signs {
            if bytes.Compare(sig.PublicKey, privite.PublicKey) == 0 {
                alreadly = i
                break
            }
        }
        // compute the signature
        signature, e2 := privite.Private.Sign(hash)
        if e2 != nil {
            return fmt.Errorf("Private Key '" + account.Base58CheckEncode(address) + "' do sign error")
        }
        sigObjSave := fields.Sign{
            PublicKey: privite.PublicKey,
            Signature: signature.Serialize64(),
        }
        if alreadly > -1 {
            // replace
            trs.Signs[alreadly] = sigObjSave <=== Write
        } else {
            // append
            trs.SignCount += 1
            trs.Signs = append(trs.Signs, sigObjSave)
        }

        //// test ////
        //verok := signature.Verify(hash, privite.Private.PubKey())
        //if !verok {
        //    panic("false")
        //}

        return nil
    }

    Read:
    [core/transactions/type2.go:83]
    func (trs *Transaction_2_Simple) Serialize() ([]byte, error) {
        body, e0 := trs.SerializeNoSign()
        if e0 != nil {
            return nil, e0
        }
        var buffer = new(bytes.Buffer)
        buffer.Write(body)
        // sign
        b1, e1 := trs.SignCount.Serialize()
        if e1 != nil {
            return nil, e1
        }
        buffer.Write(b1)
        for i := 0; i < int(trs.SignCount); i++ {
            var bi, e = trs.Signs[i].Serialize() <=== Read
            if e != nil {
                return nil, e
            }
            buffer.Write(bi)
        }
        // multi sign
        b2, e2 := trs.MultisignCount.Serialize()
        if e2 != nil {
            return nil, e2
        }
        buffer.Write(b2)
        for i := 0; i < int(trs.MultisignCount); i++ {
            var bi, e = trs.Multisigns[i].Serialize()
            if e != nil {
                return nil, e
            }
            buffer.Write(bi)
        }
        // ok
        return buffer.Bytes(), nil
    }
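Here the shared state is the Signs slice itself: addOneSign replaces or appends elements while Serialize ranges over them. A hypothetical sketch (signerSketch and its []string stand in for the transaction and its []fields.Sign) of one possible remedy uses an RWMutex, so many serializers can read concurrently while a signer holds the write lock exclusively:

```go
package main

import (
	"fmt"
	"sync"
)

// signerSketch is a minimal stand-in for the Signs slice shared
// between addOneSign and Serialize in Race #5.
type signerSketch struct {
	mu    sync.RWMutex
	signs []string // stands in for []fields.Sign
}

// addOneSign replaces an existing signature or appends a new one,
// under the exclusive write lock.
func (s *signerSketch) addOneSign(sig string) {
	s.mu.Lock()
	defer s.mu.Unlock()
	for i, old := range s.signs {
		if old == sig {
			s.signs[i] = sig // replace, mirroring the racy Write site
			return
		}
	}
	s.signs = append(s.signs, sig)
}

// serialize walks the slice under the shared read lock, mirroring the
// racy Read site in Serialize().
func (s *signerSketch) serialize() string {
	s.mu.RLock()
	defer s.mu.RUnlock()
	out := ""
	for _, sig := range s.signs {
		out += sig + ";"
	}
	return out
}

func main() {
	var s signerSketch
	s.addOneSign("sigA")
	s.addOneSign("sigB")
	fmt.Println(s.serialize()) // prints sigA;sigB;
}
```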

=-=-=-=-=-

Request: Reading data from blk.dat files

Is there a program which reads the blk.dat files from hacash_mainnet_data?

I found a method to read those binary files and convert them into a readable format [https://github.com/mrqc/bitcoin-blk-file-reader]. However, when I looked up transactions extracted with this method and searched for them in the explorer, the data were not found.

Is there a way to read the blk.dat files directly? If I understand correctly, I would need to modify some part of analyze.py to make it compatible with Hacash?

Summary of Data Races associated with [*Block_v1]

  • Text file of instrumented Diamond Miner [DiamondMiner_log_Sep_21_2021.txt], available via Github Gist [https://gist.github.com/dwymi02/95140687b566efa22939dec3cd14dd6d]

  • The instrumented binary was built from refreshed code, dated Sep-21-2021.
    The source code repository was wiped clean and "git clone ..." was used to create the updated code base.

  • The instrumented code was built with "-race", e.g., "go build -race".

  • The directory [hacash_mainnet_data/] was completely removed, to allow the Hacash blockchain to be built from scratch, also allowing for additional Data Races to be uncovered, if any.

  • In searching for Data Races I tried to group them based on specific Hacash components.
    Grep was used to search for known Hacash components; the following is a list of potential Data Races, by Hacash component:

    • A search for [*P2P] returned 109 hits
    • A search for [*Block_v1] returned 4 hits
    • A search for [*ChainState] returned 3 hits
    • A search for [*DiamondMiner] returned 45 hits
    • A search for [*DeprecatedApiService] returned 33 hits
    • A search for [*Transaction_2_Simple] returned 14 hits
      While the number of potential hits is large, these Hacash components may or may not be proximate to the actual Data Race.

    For each Hacash component I will list the specific Data Race.
    When the current code does not match what was previously built, a [*** NOTE ***] will be included.
    When a stack trace is too lengthy, the results will be truncated; the full log can be provided on request.

    For the specific Read and Write involved in each Data Race, a [<===] marker will be used to identify where the Read/Write occurred, based on the currently available source ...

    Once the Data Race has been documented, an issue will be opened on Github.
    I will leave it to your team to decide how the Data Race is to be handled.
    You can opt to fix the Data Race or simply close the issue, as you see fit ...

=-=-=-=-=-=-=-=-=-=-=-=-=-

Data Races associated with [*Block_v1]

  • Block_v1 Data Race # 1:
    sync blocks from peer hacash_beijing(182.92.163.225:3337): 3001... [P2P] find 8 nearest public nodes to update topology table, connected peers: 5 public, 0 private, total: 5 nodes, 5 conns.
    got blocks: 3001 ~ 4000, inserting... OK
    sync blocks from peer hacash_beijing(182.92.163.225:3337): 4001... got blocks: 4001 ~ 5000, inserting... ==================
    WARNING: DATA RACE
    Write at 0x00c420d4e748 by goroutine 122:
    core/blocks.(*Block_v1).HashFresh()
    core/blocks/version1.go:341 +0x7d
    mint/blockchain.(*BlockChain).tryValidateAppendNewBlockToChainStateAndStore()
    mint/blockchain/append.go:50 +0x3c5
    mint/blockchain.(*BlockChain).InsertBlock()
    mint/blockchain/append.go:27 +0x5a
    node/handler.GetBlocksData()
    node/handler/blockdata.go:43 +0x5ad
    node/backend.(*Backend).OnMsgData()
    node/backend/message.go:62 +0x342
    node/p2pv2.(*P2P).handleConnMsg()
    node/p2pv2/tcp_msg_handler.go:39 +0x8f5

    Previous read at 0x00c420d4e748 by goroutine 123:
    core/blocks.(*Block_v1).Hash()
    core/blocks/version1.go:335 +0x9c
    node/handler.SendStatusToPeer()
    node/handler/status.go:82 +0x105
    node/backend.(*Backend).OnMsgData()
    node/backend/message.go:37 +0x58c
    node/p2pv2.(*P2P).handleConnMsg()
    node/p2pv2/tcp_msg_handler.go:39 +0x8f5

    Write:
    [core/blocks/version1.go:341]
    [*** NOTE ***] The reported Data Race does not match the reported line ...
    func (block *Block_v1) hashFreshUnsafe() fields.Hash {
        block.hash = CalculateBlockHash(block) <=== Actual Write
        return block.hash <=== Reported Write
    }

    Read:
    [core/blocks/version1.go:335]
    [*** NOTE ***] The reported Data Race does not match the reported line ...
    func (block *Block_v1) HashFresh() fields.Hash {
        block.insertLock.Lock()
        defer block.insertLock.Unlock()
        <=== Reported Read
        return block.hashFreshUnsafe()
    }
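The shape of this race is a hash cache written under insertLock but (per the trace) read on another path without it. A hypothetical sketch (blockSketch and its fields are stand-ins for Block_v1, and SHA-256 stands in for CalculateBlockHash) of one possible fix takes the same mutex on both the fresh-compute and the cached-read path:

```go
package main

import (
	"crypto/sha256"
	"fmt"
	"sync"
)

// blockSketch is a minimal stand-in for Block_v1's hash caching.
type blockSketch struct {
	mu   sync.Mutex
	head []byte // stands in for the serialized block head
	hash []byte // cached hash; nil means "not computed yet"
}

// HashFresh recomputes the hash under the lock.
func (b *blockSketch) HashFresh() []byte {
	b.mu.Lock()
	defer b.mu.Unlock()
	return b.hashFreshUnsafe()
}

// hashFreshUnsafe must only be called with b.mu held.
func (b *blockSketch) hashFreshUnsafe() []byte {
	sum := sha256.Sum256(b.head)
	b.hash = sum[:]
	return b.hash
}

// Hash performs the nil check under the same lock, removing the race
// between the cached read and the write in hashFreshUnsafe.
func (b *blockSketch) Hash() []byte {
	b.mu.Lock()
	defer b.mu.Unlock()
	if b.hash == nil {
		return b.hashFreshUnsafe()
	}
	return b.hash
}

func main() {
	b := &blockSketch{head: []byte("block head bytes")}
	fmt.Printf("%x\n", b.Hash())
}
```

As with the transaction races, whether to lock, to precompute the hash before sharing the block, or something else entirely is your team's call.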

=-=-=-=-=-=-=-=-=-=-=-=-=-
