
dmdcoin / diamond-node

bit.diamonds node software for network version 4

License: GNU General Public License v3.0

Rust 99.54% Solidity 0.22% Shell 0.13% PowerShell 0.01% Dockerfile 0.10%

diamond-node's People

Contributors

0x7cfe, 5chdn, andresilva, arkpar, ascjones, craiggleso, debris, derhuerst, dforsten, dvdplm, gavofyork, general-beck, grbizl, jacogr, jesuscript, kaikun213, maciejhirsz, ngotchac, niklasad1, nikvolf, ordian, rakita, rphmeier, sorpaas, sunce86, surfingnerd, svyatonik, tomaka, tomusdrw, varasev


diamond-node's Issues

reported fault: UnknownSender

... happens on epoch switch, when a node receives a message after switching to the new epoch.

We might keep the old hbbft instance to either check such messages,
or stop any message tracking for the old epoch and finalize the epoch in the hbbft message memorial.
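The idea above can be sketched as an epoch-gated message router. This is a minimal sketch under stated assumptions: the type and method names (`EpochTracker`, `route`, `Routing`) are illustrative, not the actual diamond-node API. Messages from the immediately previous epoch are validated against the retained old instance instead of producing an UnknownSender fault; anything older is silently dropped.

```rust
// Hypothetical sketch: retain the previous epoch's hbbft instance so that
// late messages can still be checked instead of being reported as faults.
struct EpochTracker {
    current_epoch: u64,
    // true while the previous epoch's instance is still retained
    previous_epoch_open: bool,
}

enum Routing {
    Current,  // handle with the current hbbft instance
    Previous, // validate against the retained old instance
    Dropped,  // too old: ignore instead of reporting UnknownSender
}

impl EpochTracker {
    fn route(&self, msg_epoch: u64) -> Routing {
        if msg_epoch == self.current_epoch {
            Routing::Current
        } else if self.previous_epoch_open && msg_epoch + 1 == self.current_epoch {
            Routing::Previous
        } else {
            Routing::Dropped
        }
    }
}

fn main() {
    // mirrors the log below: the node is already in epoch 76089 when
    // messages for epoch 76088 arrive
    let t = EpochTracker { current_epoch: 76089, previous_epoch_open: true };
    assert!(matches!(t.route(76088), Routing::Previous));
    assert!(matches!(t.route(76089), Routing::Current));
    assert!(matches!(t.route(76000), Routing::Dropped));
    println!("routing ok");
}
```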

Error on handling HoneyBadger message from NodeId(0x7ba3…0aa2) in epoch 76088 error: UnknownSender

On Alpha2 with node software 0.9.3, nodes frequently report faults from UnknownSender.
This is an example:

2023-11-04 06:54:03  Error on handling HoneyBadger message from NodeId(0x7ba3…0aa2) in epoch 76088 error: UnknownSender
2023-11-04 06:54:03  Block 76088 Node NodeId(0x7ba3…0aa2) reported fault: UnknownSender
2023-11-04 06:54:03  Error on handling HoneyBadger message from NodeId(0x7ba3…0aa2) in epoch 76088 error: UnknownSender
2023-11-04 06:54:03  Block 76088 Node NodeId(0x7ba3…0aa2) reported fault: UnknownSender
2023-11-04 06:54:03  Error on handling HoneyBadger message from NodeId(0x7ba3…0aa2) in epoch 76088 error: UnknownSender
2023-11-04 06:54:03  Block 76088 Node NodeId(0x7ba3…0aa2) reported fault: UnknownSender
2023-11-04 06:54:03  Error on handling HoneyBadger message from NodeId(0x7ba3…0aa2) in epoch 76088 error: UnknownSender
2023-11-04 06:54:03  Block 76088 Node NodeId(0x7ba3…0aa2) reported fault: UnknownSender
2023-11-04 06:54:03  Error on handling HoneyBadger message from NodeId(0x7ba3…0aa2) in epoch 76088 error: UnknownSender
2023-11-04 06:54:03  Block 76088 Node NodeId(0x7ba3…0aa2) reported fault: UnknownSender
2023-11-04 06:54:03  Error on handling HoneyBadger message from NodeId(0x7ba3…0aa2) in epoch 76088 error: UnknownSender
2023-11-04 06:54:03  Block 76088 Node NodeId(0x7ba3…0aa2) reported fault: UnknownSender
2023-11-04 06:54:03  Error on handling HoneyBadger message from NodeId(0x7ba3…0aa2) in epoch 76088 error: UnknownSender
2023-11-04 06:54:03  Block 76088 Node NodeId(0x7ba3…0aa2) reported fault: UnknownSender
2023-11-04 06:54:04  Imported #76088 0x4939…e7db (0 txs, 0.00 Mgas, 8303 ms, 0.59 KiB)
2023-11-04 06:54:21   42/50 peers   6 MiB chain 0 bytes queue  RPC:  0 conn,    0 req/s,    0 µs
2023-11-04 06:54:51   42/50 peers   6 MiB chain 0 bytes queue  RPC:  0 conn,    0 req/s,    0 µs
2023-11-04 06:54:51  Block creation: Honeybadger epoch 76089, Transactions subset target size: 4, actual size: 0, from available 0.

devp2p monitoring

For the upcoming bonus point system we have to track the reliability of validator partner nodes.

3 Seconds block time

Expectation:

  • Reduces Echo Blocks.

TODO:

  • Quantify echo blocks.

Performance Impact:

  • Measure performance before the change.

  • Measure performance after the change.

  • Measure the number of echo blocks before and after.

Definition of an echo block:
A block with 0 transactions outside the key gen phase.
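The definition above can be stated as a one-line predicate. A minimal sketch, with illustrative names (`is_echo_block` is not a function from the codebase):

```rust
// Echo block per the definition above: no transactions, and not produced
// during the key generation phase.
fn is_echo_block(tx_count: usize, in_keygen_phase: bool) -> bool {
    tx_count == 0 && !in_keygen_phase
}

fn main() {
    // empty block outside keygen: counts as an echo block
    assert!(is_echo_block(0, false));
    // empty block during keygen: expected, not an echo block
    assert!(!is_echo_block(0, true));
    // block with transactions: never an echo block
    assert!(!is_echo_block(3, false));
    println!("echo block checks passed");
}
```

Counting blocks against this predicate before and after the block-time change would give the numbers the TODO asks for.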

is_major_syncing information is wrong.

A lot of logic is activated depending on whether the node is syncing or not.
For example: writing of keys should not be done while the node is syncing.

There was already speculation about whether this value behaves correctly.
Or in other words: does the mechanism for figuring out whether a node is syncing always return the correct value?

I have picked up the value in the Prometheus interface, and now we clearly know that the used method does not work reliably.

This is a root cause for a lot of follow-up issues.
The Prometheus metric is known as "is_major_syncing".

long running tests are causing issues

Our tests that are executed on every check-in are very large,
especially those that test with historic data.

Questions about best practices:
It seems to be the standard in automated testing to build the whole system from scratch,
even things like setting up the Rust development environment.
Pulling Docker images that have already solved some of the basic setup steps, on the other hand, does not seem to add a huge number of points of failure.

Another big part are the regression tests that use huge datasets and test quite a lot.
Those run for hours as well.
We could reduce the frequency of those large tests and, for example, run them only on pull requests or on another trigger.
Drafting a release could be such a trigger, but I don't know if this can be integrated well into the GitHub pipeline.

Another problem that showed up is that we run out of free minutes for CI pipeline jobs for open source repos.
It happened quite often that a job could not get picked up within 24 hours.

hbbft message memorial - unprocessable messages

Messages that could not be stored in any epoch keep getting processed again and again.
I think this happens when a node receives a message from the staking epoch before the current one.

Autoshutdown after a period of 30 minutes without block import

The feature is tagged as shutdown-on-missing-block-import.

Configuration:
(These examples define the current default values.)

CLI Args:
--shutdown-on-missing-block-import=1800

node.toml:

[Misc]
shutdown_on_missing_block_import = 1800

details

Shuts down if no block has been imported for N seconds. Defaults to None. Set to None or 0 to disable this feature.

Note that the timer is not accurate:
defining 1800 seconds means that the check happens every 1800 seconds, and if not a single block was imported within this timeframe, the node shuts down.
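The coarse timer semantics can be sketched as follows. This is an illustrative model, not the actual implementation: the check fires once per interval, so the effective shutdown delay lies anywhere between one and two intervals.

```rust
use std::time::{Duration, Instant};

// Coarse shutdown watch: the timer only wakes up every `interval`,
// so the node shuts down between `interval` and 2 * `interval` after
// the last block import.
struct ShutdownWatch {
    interval: Duration,
    last_import: Instant,
}

impl ShutdownWatch {
    // called on every block import
    fn on_block_imported(&mut self) {
        self.last_import = Instant::now();
    }

    // called once per `interval` by a timer; true means "shut the node down"
    fn should_shutdown(&self, now: Instant) -> bool {
        now.duration_since(self.last_import) >= self.interval
    }
}

fn main() {
    let start = Instant::now();
    let mut w = ShutdownWatch { interval: Duration::from_secs(1800), last_import: start };
    // a block arrives: the countdown restarts
    w.on_block_imported();
    // 10 minutes later: still within the interval, keep running
    assert!(!w.should_shutdown(start + Duration::from_secs(600)));
    // two full intervals later with no import: shut down
    assert!(w.should_shutdown(start + Duration::from_secs(3600)));
    println!("shutdown watch ok");
}
```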

occasionally wrong version info.

The binary version sometimes does not match the version of diamond-node.

I think the reason is the util/version crate in incremental builds: cargo does not rebuild the crate if nothing changed, yet the version info gets fetched during the build.
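One standard fix for stale build-time info is to emit `cargo:rerun-if-changed` directives from a build script, so cargo reruns it (and rebuilds the crate) whenever the checked-out commit moves. A hedged sketch of such a `build.rs` follows; the `.git/HEAD` path is an assumption about the workspace layout, and diamond-node's actual util/version build script may differ.

```rust
// build.rs sketch (illustrative): keep version info fresh in incremental builds.
fn rerun_directives() -> Vec<String> {
    vec![
        // rebuild when the current commit changes (path is an assumption)
        "cargo:rerun-if-changed=../../.git/HEAD".to_string(),
        // rebuild when the environment overrides the version string
        "cargo:rerun-if-env-changed=CARGO_PKG_VERSION".to_string(),
    ]
}

fn main() {
    // cargo parses these lines from the build script's stdout
    for directive in rerun_directives() {
        println!("{}", directive);
    }
}
```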

deadlock on locking HBBFT State after receiving Agreements

The reason for not producing blocks is an infinite wait on the hbbft worker thread, which tries to write the hbbft state:

  1. State write: ethcore::engines::hbbft::hbbft_engine::HoneyBadgerBFT::start_hbbft_epoch
  2. State write: ethcore::engines::hbbft::hbbft_engine::HoneyBadgerBFT::process_hb_message
  3. State write: ethcore::engines::hbbft::hbbft_engine::HoneyBadgerBFT::replay_cached_messages

This bug is new and might be connected to the integration of either peer management or hbbft message tracking.

Worker HBBFT 0 (This is the Engine Main Loop)
syscall (@syscall:12)
parking_lot_core::thread_parker::imp::ThreadParker::futex_wait (/home/sn/.cargo/registry/src/github.com-1ecc6299db9ec823/parking_lot_core-0.8.3/src/thread_parker/linux.rs:112)
<parking_lot_core::thread_parker::imp::ThreadParker as parking_lot_core::thread_parker::ThreadParkerT>::park (/home/sn/.cargo/registry/src/github.com-1ecc6299db9ec823/parking_lot_core-0.8.3/src/thread_parker/linux.rs:66)
parking_lot_core::parking_lot::park::{{closure}} (/home/sn/.cargo/registry/src/github.com-1ecc6299db9ec823/parking_lot_core-0.8.3/src/parking_lot.rs:611)
parking_lot_core::parking_lot::with_thread_data (/home/sn/.cargo/registry/src/github.com-1ecc6299db9ec823/parking_lot_core-0.8.3/src/parking_lot.rs:183)
parking_lot_core::parking_lot::park (/home/sn/.cargo/registry/src/github.com-1ecc6299db9ec823/parking_lot_core-0.8.3/src/parking_lot.rs:576)
parking_lot::raw_rwlock::RawRwLock::lock_common (/home/sn/.cargo/registry/src/github.com-1ecc6299db9ec823/parking_lot-0.11.1/src/raw_rwlock.rs:1110)
parking_lot::raw_rwlock::RawRwLock::lock_exclusive_slow (/home/sn/.cargo/registry/src/github.com-1ecc6299db9ec823/parking_lot-0.11.1/src/raw_rwlock.rs:628)
<parking_lot::raw_rwlock::RawRwLock as lock_api::rwlock::RawRwLock>::lock_exclusive (/home/sn/.cargo/registry/src/github.com-1ecc6299db9ec823/parking_lot-0.11.1/src/raw_rwlock.rs:74)
lock_api::rwlock::RwLock<R,T>::write (/home/sn/.cargo/registry/src/github.com-1ecc6299db9ec823/lock_api-0.4.3/src/rwlock.rs:461)
ethcore::engines::hbbft::hbbft_engine::HoneyBadgerBFT::start_hbbft_epoch (/home/sn/dmd/diamond-node/crates/ethcore/src/engines/hbbft/hbbft_engine.rs:724)
<ethcore::engines::hbbft::hbbft_engine::TransitionHandler as ethcore_io::IoHandler<()>>::timeout (/home/sn/dmd/diamond-node/crates/ethcore/src/engines/hbbft/hbbft_engine.rs:210)
ethcore_io::worker::Worker::do_work (/home/sn/dmd/diamond-node/crates/runtime/io/src/worker.rs:135)
ethcore_io::worker::Worker::new::{{closure}}::{{closure}} (/home/sn/dmd/diamond-node/crates/runtime/io/src/worker.rs:98)
futures::future::loop_fn::loop_fn (/home/sn/.cargo/registry/src/github.com-1ecc6299db9ec823/futures-0.1.29/src/future/loop_fn.rs:79)
ethcore_io::worker::Worker::new::{{closure}} (/home/sn/dmd/diamond-node/crates/runtime/io/src/worker.rs:86)
std::sys_common::backtrace::__rust_begin_short_backtrace (@std::sys_common::backtrace::__rust_begin_short_backtrace:17)
std::thread::Builder::spawn_unchecked_::{{closure}}::{{closure}} (@core::ops::function::FnOnce::call_once{{vtable.shim}}:70)
<core::panic::unwind_safe::AssertUnwindSafe<F> as core::ops::function::FnOnce<()>>::call_once (@core::ops::function::FnOnce::call_once{{vtable.shim}}:58)
std::panicking::try::do_call (@core::ops::function::FnOnce::call_once{{vtable.shim}}:58)

A second thread that tries to write the state is:

Worker Client 1
syscall (@syscall:12)
parking_lot_core::thread_parker::imp::ThreadParker::futex_wait (/home/sn/.cargo/registry/src/github.com-1ecc6299db9ec823/parking_lot_core-0.8.3/src/thread_parker/linux.rs:112)
<parking_lot_core::thread_parker::imp::ThreadParker as parking_lot_core::thread_parker::ThreadParkerT>::park (/home/sn/.cargo/registry/src/github.com-1ecc6299db9ec823/parking_lot_core-0.8.3/src/thread_parker/linux.rs:66)
parking_lot_core::parking_lot::park::{{closure}} (/home/sn/.cargo/registry/src/github.com-1ecc6299db9ec823/parking_lot_core-0.8.3/src/parking_lot.rs:611)
parking_lot_core::parking_lot::with_thread_data (/home/sn/.cargo/registry/src/github.com-1ecc6299db9ec823/parking_lot_core-0.8.3/src/parking_lot.rs:183)
parking_lot_core::parking_lot::park (/home/sn/.cargo/registry/src/github.com-1ecc6299db9ec823/parking_lot_core-0.8.3/src/parking_lot.rs:576)
parking_lot::raw_rwlock::RawRwLock::lock_common (/home/sn/.cargo/registry/src/github.com-1ecc6299db9ec823/parking_lot-0.11.1/src/raw_rwlock.rs:1110)
parking_lot::raw_rwlock::RawRwLock::lock_exclusive_slow (/home/sn/.cargo/registry/src/github.com-1ecc6299db9ec823/parking_lot-0.11.1/src/raw_rwlock.rs:628)
<parking_lot::raw_rwlock::RawRwLock as lock_api::rwlock::RawRwLock>::lock_exclusive (/home/sn/.cargo/registry/src/github.com-1ecc6299db9ec823/parking_lot-0.11.1/src/raw_rwlock.rs:74)
lock_api::rwlock::RwLock<R,T>::write (/home/sn/.cargo/registry/src/github.com-1ecc6299db9ec823/lock_api-0.4.3/src/rwlock.rs:461)
ethcore::engines::hbbft::hbbft_engine::HoneyBadgerBFT::process_hb_message (/home/sn/dmd/diamond-node/crates/ethcore/src/engines/hbbft/hbbft_engine.rs:524)
<ethcore::engines::hbbft::hbbft_engine::HoneyBadgerBFT as ethcore::engines::Engine<ethcore::machine::impls::EthereumMachine>>::handle_message (/home/sn/dmd/diamond-node/crates/ethcore/src/engines/hbbft/hbbft_engine.rs:1305)
<ethcore::client::client::Client as ethcore::client::traits::IoClient>::queue_consensus_message::{{closure}} (/home/sn/dmd/diamond-node/crates/ethcore/src/client/client.rs:2976)
ethcore::client::client::IoChannelQueue::queue::{{closure}} (/home/sn/dmd/diamond-node/crates/ethcore/src/client/client.rs:3520)
<ethcore_service::service::ClientIoHandler as ethcore_io::IoHandler<ethcore::client::io_message::ClientIoMessage>>::message (/home/sn/dmd/diamond-node/crates/ethcore/service/src/service.rs:229)
ethcore_io::worker::Worker::do_work (/home/sn/dmd/diamond-node/crates/runtime/io/src/worker.rs:139)
ethcore_io::worker::Worker::new::{{closure}}::{{closure}} (/home/sn/dmd/diamond-node/crates/runtime/io/src/worker.rs:98)
<futures::future::loop_fn::LoopFn<A,F> as futures::future::Future>::poll (/home/sn/.cargo/registry/src/github.com-1ecc6299db9ec823/futures-0.1.29/src/future/loop_fn.rs:95)
futures::task_impl::Spawn<T>::poll_future_notify::{{closure}} (/home/sn/.cargo/registry/src/github.com-1ecc6299db9ec823/futures-0.1.29/src/task_impl/mod.rs:329)
futures::task_impl::Spawn<T>::enter::{{closure}} (/home/sn/.cargo/registry/src/github.com-1ecc6299db9ec823/futures-0.1.29/src/task_impl/mod.rs:399)
Worker Client 2
syscall (@syscall:12)
parking_lot_core::thread_parker::imp::ThreadParker::futex_wait (/home/sn/.cargo/registry/src/github.com-1ecc6299db9ec823/parking_lot_core-0.8.3/src/thread_parker/linux.rs:112)
<parking_lot_core::thread_parker::imp::ThreadParker as parking_lot_core::thread_parker::ThreadParkerT>::park (/home/sn/.cargo/registry/src/github.com-1ecc6299db9ec823/parking_lot_core-0.8.3/src/thread_parker/linux.rs:66)
parking_lot_core::parking_lot::park::{{closure}} (/home/sn/.cargo/registry/src/github.com-1ecc6299db9ec823/parking_lot_core-0.8.3/src/parking_lot.rs:611)
parking_lot_core::parking_lot::with_thread_data (/home/sn/.cargo/registry/src/github.com-1ecc6299db9ec823/parking_lot_core-0.8.3/src/parking_lot.rs:183)
parking_lot_core::parking_lot::park (/home/sn/.cargo/registry/src/github.com-1ecc6299db9ec823/parking_lot_core-0.8.3/src/parking_lot.rs:576)
parking_lot::raw_rwlock::RawRwLock::lock_common (/home/sn/.cargo/registry/src/github.com-1ecc6299db9ec823/parking_lot-0.11.1/src/raw_rwlock.rs:1110)
parking_lot::raw_rwlock::RawRwLock::lock_exclusive_slow (/home/sn/.cargo/registry/src/github.com-1ecc6299db9ec823/parking_lot-0.11.1/src/raw_rwlock.rs:628)
<parking_lot::raw_rwlock::RawRwLock as lock_api::rwlock::RawRwLock>::lock_exclusive (/home/sn/.cargo/registry/src/github.com-1ecc6299db9ec823/parking_lot-0.11.1/src/raw_rwlock.rs:74)
lock_api::rwlock::RwLock<R,T>::write (/home/sn/.cargo/registry/src/github.com-1ecc6299db9ec823/lock_api-0.4.3/src/rwlock.rs:461)
ethcore::engines::hbbft::hbbft_engine::HoneyBadgerBFT::join_hbbft_epoch (/home/sn/dmd/diamond-node/crates/ethcore/src/engines/hbbft/hbbft_engine.rs:708)
ethcore::engines::hbbft::hbbft_engine::HoneyBadgerBFT::process_hb_message (/home/sn/dmd/diamond-node/crates/ethcore/src/engines/hbbft/hbbft_engine.rs:546)
<ethcore::engines::hbbft::hbbft_engine::HoneyBadgerBFT as ethcore::engines::Engine<ethcore::machine::impls::EthereumMachine>>::handle_message (/home/sn/dmd/diamond-node/crates/ethcore/src/engines/hbbft/hbbft_engine.rs:1305)
<ethcore::client::client::Client as ethcore::client::traits::IoClient>::queue_consensus_message::{{closure}} (/home/sn/dmd/diamond-node/crates/ethcore/src/client/client.rs:2976)
ethcore::client::client::IoChannelQueue::queue::{{closure}} (/home/sn/dmd/diamond-node/crates/ethcore/src/client/client.rs:3520)
<ethcore_service::service::ClientIoHandler as ethcore_io::IoHandler<ethcore::client::io_message::ClientIoMessage>>::message (/home/sn/dmd/diamond-node/crates/ethcore/service/src/service.rs:229)
ethcore_io::worker::Worker::do_work (/home/sn/dmd/diamond-node/crates/runtime/io/src/worker.rs:139)
ethcore_io::worker::Worker::new::{{closure}}::{{closure}} (/home/sn/dmd/diamond-node/crates/runtime/io/src/worker.rs:98)
<futures::future::loop_fn::LoopFn<A,F> as futures::future::Future>::poll (/home/sn/.cargo/registry/src/github.com-1ecc6299db9ec823/futures-0.1.29/src/future/loop_fn.rs:95)
futures::task_impl::Spawn<T>::poll_future_notify::{{closure}} (/home/sn/.cargo/registry/src/github.com-1ecc6299db9ec823/futures-0.1.29/src/task_impl/mod.rs:329)
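The stacks above show `process_hb_message` taking a write lock and `join_hbbft_epoch` taking another write lock further down the same call path. If the same thread still holds the first guard at that point, that alone is fatal: RwLocks (parking_lot's as well as std's) are not reentrant, so a second exclusive lock on the same thread blocks forever. The sketch below (illustrative names, std `RwLock` standing in for parking_lot) shows the safe pattern of dropping the first guard before re-locking; this is an assumed model of the bug, not the actual engine code.

```rust
use std::sync::RwLock;

// Minimal model of the hbbft state guarded by a non-reentrant RwLock.
struct HbbftState {
    epoch: u64,
    messages_handled: u64,
}

fn join_epoch(state: &RwLock<HbbftState>) {
    // takes its own write lock; must NOT be called while another guard is alive
    let mut s = state.write().unwrap();
    s.epoch += 1;
}

fn process_message(state: &RwLock<HbbftState>) {
    {
        let mut s = state.write().unwrap();
        s.messages_handled += 1; // handle the message under the lock
    } // first write guard dropped here, at the end of the scope
    join_epoch(state); // safe: no guard is held any more
}

fn main() {
    let state = RwLock::new(HbbftState { epoch: 76088, messages_handled: 0 });
    process_message(&state);
    let s = state.read().unwrap();
    assert_eq!(s.epoch, 76089);
    assert_eq!(s.messages_handled, 1);
    println!("no deadlock");
}
```

Even without re-entry on one thread, two threads each holding one lock while waiting for the other produces the same futex_wait picture, so auditing the lock ordering across all three write sites is worthwhile.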

pruning as root cause for stage 3 errors

Pruning could be the root cause why nodes are sometimes unable to verify seals:
the information needed to build the shared key is gone in this case.
See also: #24

Since we know which stage of block we need, the engine could tell the DB system a limit for pruning.
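The proposed engine-to-DB contract can be sketched as a single function: the engine reports the earliest block it still needs (for example the start of the current staking epoch, where the key-gen data lives), and the DB never prunes past it. All names here are assumptions for illustration.

```rust
// Hedged sketch: the pruning limit is the earliest block the engine still
// needs, optionally padded by a safety margin of blocks.
fn pruning_limit(epoch_start_block: u64, safety_margin: u64) -> u64 {
    // saturating_sub keeps the limit at 0 near genesis instead of underflowing
    epoch_start_block.saturating_sub(safety_margin)
}

fn main() {
    // keep everything from 100 blocks before the epoch start onward
    assert_eq!(pruning_limit(354_000, 100), 353_900);
    // near genesis the limit saturates at 0
    assert_eq!(pruning_limit(50, 100), 0);
    println!("pruning limit ok");
}
```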

explain block production reasons

Extend the "Block creation: Honeybadger" log line to find out why a block was created:
add "heartbeat" for heartbeat blocks,
and "keygen" for keygen blocks.
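A small enum would carry the reason through to the log line. This is an illustrative sketch of the proposal, not the actual log format or engine types:

```rust
use std::fmt;

// Reason a block was produced, to be appended to the creation log line.
enum BlockReason {
    Regular,
    Heartbeat,
    KeyGen,
}

impl fmt::Display for BlockReason {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        match self {
            BlockReason::Regular => write!(f, "regular"),
            BlockReason::Heartbeat => write!(f, "heartbeat"),
            BlockReason::KeyGen => write!(f, "keygen"),
        }
    }
}

fn log_block_creation(epoch: u64, reason: &BlockReason) -> String {
    format!("Block creation: Honeybadger epoch {}, reason: {}", epoch, reason)
}

fn main() {
    let line = log_block_creation(76089, &BlockReason::Heartbeat);
    assert!(line.contains("reason: heartbeat"));
    println!("{}", line);
}
```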

automatic peer management: could not add validators

... the source could be that we never remove pending validators.

disconnect_pending_validators

example log output:

2023-07-12 12:49:11  Block creation: Honeybadger epoch 382619, Transactions subset target size: 5, actual size: 5, from available 13.
2023-07-12 12:49:21  could not add pending validator to reserved peers: 0xf001…9b63
2023-07-12 12:49:21  could not add pending validator to reserved peers: 0xd79c…b568
2023-07-12 12:49:21  could not add pending validator to reserved peers: 0x0289…963f
2023-07-12 12:49:21  could not add pending validator to reserved peers: 0xb21f…22e3
2023-07-12 12:49:21  could not add pending validator to reserved peers: 0x4070…c6bd
2023-07-12 12:49:21  could not add pending validator to reserved peers: 0xa886…a93f
2023-07-12 12:49:21  could not add pending validator to reserved peers: 0xd2d0…5411
2023-07-12 12:49:21  could not add pending validator to reserved peers: 0xc0b0…9edd
2023-07-12 12:49:21  could not add pending validator to reserved peers: 0x04f4…e43e
2023-07-12 12:49:21  could not add pending validator to reserved peers: 0x443c…7b54
2023-07-12 12:49:21  could not add pending validator to reserved peers: 0x4f44…d038
2023-07-12 12:49:21  could not add pending validator to reserved peers: 0x967c…ae97
2023-07-12 12:49:21  could not add pending validator to reserved peers: 0x2e23…c096
2023-07-12 12:49:21  could not add pending validator to reserved peers: 0xa988…a2c3
2023-07-12 12:49:21  could not add pending validator to reserved peers: 0x9382…b0a0
2023-07-12 12:49:21  could not add pending validator to reserved peers: 0xd7c3…7c9f
2023-07-12 12:49:21  could not add pending validator to reserved peers: 0x63d0…b420
2023-07-12 12:49:21  could not add pending validator to reserved peers: 0xc49e…f6ef
2023-07-12 12:49:21  could not add pending validator to reserved peers: 0xaf63…01da
2023-07-12 12:49:21  could not add pending validator to reserved peers: 0xc633…1ebc
2023-07-12 12:49:21  could not add pending validator to reserved peers: 0xb618…a505
2023-07-12 12:49:21  could not add pending validator to reserved peers: 0xaaeb…0a07
2023-07-12 12:49:21  could not add pending validator to reserved peers: 0x0ea7…fba6
2023-07-12 12:49:21  could not add pending validator to reserved peers: 0x7c7c…345a

failed to resolve: could not find `EthabfunctionsContract` in `ethabi_derive` on some systems

The error does not appear on every system.
It's known that the problem appeared with the hardhat upgrade, but in this case the contracts that cause the problem have not been updated.
Tested with Rust 1.59 and 1.64.

Another hint is how Cargo.lock works.

error[E0433]: failed to resolve: could not find `EthabfunctionsContract` in `ethabi_derive`
  --> crates/concensus/miner/src/service_transaction_checker.rs:26:1
   |
26 | / use_contract!(
27 | |     service_transaction,
28 | |     "res/contracts/service_transaction.json"
29 | | );
   | |_^ could not find `EthabfunctionsContract` in `ethabi_derive`
   |
   = note: this error originates in the macro `use_contract` (in Nightly builds, run with -Z macro-backtrace for more info)

error: cannot find attribute `ethabi_contract_options` in this scope
  --> crates/concensus/miner/src/service_transaction_checker.rs:26:1
   |
26 | / use_contract!(
27 | |     service_transaction,
28 | |     "res/contracts/service_transaction.json"
29 | | );
   | |_^
   |
   = note: this error originates in the macro `use_contract` (in Nightly builds, run with -Z macro-backtrace for more info)

error[E0433]: failed to resolve: could not find `functions` in `service_transaction`
   --> crates/concensus/miner/src/service_transaction_checker.rs:121:52
    |
121 |         let (data, decoder) = service_transaction::functions::certified::call(sender);
    |                                                    ^^^^^^^^^ could not find `functions` in `service_transaction`

Block creation: Pending transaction with nonce too low

What is the origin of this error?

  • Are the transactions with "Pending transaction with nonce too low" only reported because they were just imported in the block before?
  • Are they in the transaction queue with a nonce gap?
  • Is it a transaction without a nonce gap?

TODO:
Improve logging with more info about the transaction.
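The three cases in the questions above boil down to comparing the transaction nonce to the sender's current account nonce. A sketch with illustrative names (not the actual pool API), useful as a basis for the improved log message:

```rust
// Triage for a pending transaction, per the questions above.
enum NonceStatus {
    AlreadyUsed, // nonce < account nonce: likely just imported in a prior block
    Ready,       // nonce == account nonce: no gap, directly executable
    Gapped,      // nonce > account nonce: waiting in the queue behind a gap
}

fn classify(tx_nonce: u64, account_nonce: u64) -> NonceStatus {
    if tx_nonce < account_nonce {
        NonceStatus::AlreadyUsed
    } else if tx_nonce == account_nonce {
        NonceStatus::Ready
    } else {
        NonceStatus::Gapped
    }
}

fn main() {
    // matches the log below: got 905, expected at least 906
    assert!(matches!(classify(905, 906), NonceStatus::AlreadyUsed));
    assert!(matches!(classify(906, 906), NonceStatus::Ready));
    assert!(matches!(classify(910, 906), NonceStatus::Gapped));
    println!("nonce classification ok");
}
```

Logging the classification together with sender, hash, and queue position would answer the questions directly from the logs.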

2023-10-31 15:22:16 UTC Imported #38308 0xf9b0…f62e (10 txs, 82.49 Mgas, 372 ms, 126.35 KiB)
2023-10-31 15:22:17 UTC Block creation: Pending transaction with nonce too low, got 905, expected at least 906
2023-10-31 15:22:17 UTC Block creation: Pending transaction with nonce too low, got 1442, expected at least 1443
2023-10-31 15:22:17 UTC Block creation: Pending transaction with nonce too low, got 1164, expected at least 1165
2023-10-31 15:22:17 UTC Block creation: Pending transaction with nonce too low, got 1095, expected at least 1096
2023-10-31 15:22:17 UTC Block creation: Pending transaction with nonce too low, got 1587, expected at least 1588
2023-10-31 15:22:17 UTC Block creation: Pending transaction with nonce too low, got 1555, expected at least 1556
2023-10-31 15:22:17 UTC Block creation: Pending transaction with nonce too low, got 40, expected at least 41
2023-10-31 15:22:17 UTC Block creation: Pending transaction with nonce too low, got 1573, expected at least 1574
2023-10-31 15:22:17 UTC Block creation: Pending transaction with nonce too low, got 1362, expected at least 1363
2023-10-31 15:22:17 UTC Block creation: Pending transaction with nonce too low, got 1177, expected at least 1178
2023-10-31 15:22:17 UTC Block creation: Pending transaction with nonce too low, got 1351, expected at least 1352
2023-10-31 15:22:17 UTC Block creation: Pending transaction with nonce too low, got 1630, expected at least 1631
2023-10-31 15:22:17 UTC Block creation: Pending transaction with nonce too low, got 1483, expected at least 1484
2023-10-31 15:22:17 UTC Block creation: Pending transaction with nonce too low, got 1342, expected at least 1343
2023-10-31 15:22:17 UTC Block creation: Pending transaction with nonce too low, got 1376, expected at least 1377
2023-10-31 15:22:17 UTC Block creation: Pending transaction with nonce too low, got 1167, expected at least 1168
2023-10-31 15:22:17 UTC Block creation: Pending transaction with nonce too low, got 1103, expected at least 1104
2023-10-31 15:22:17 UTC Block creation: Pending transaction with nonce too low, got 1333, expected at least 1334
2023-10-31 15:22:17 UTC Block creation: Pending transaction with nonce too low, got 1409, expected at least 1410
2023-10-31 15:22:17 UTC Block creation: Pending transaction with nonce too low, got 1372, expected at least 1373
2023-10-31 15:22:17 UTC Block creation: Pending transaction with nonce too low, got 1550, expected at least 1551
2023-10-31 15:22:17 UTC Block creation: Pending transaction with nonce too low, got 1484, expected at least 1485
2023-10-31 15:22:17 UTC Block creation: Pending transaction with nonce too low, got 1633, expected at least 1634
2023-10-31 15:22:17 UTC Block creation: Pending transaction with nonce too low, got 1347, expected at least 1348
2023-10-31 15:22:17 UTC Block creation: Honeybadger epoch 38309, Transactions subset target size: 6, actual size: 2, from available 26.
2023-10-31 15:22:17 UTC No session exists for peerId 68
2023-10-31 15:22:17 UTC No session exists for peerId 0
2023-10-31 15:22:18 UTC No session exists for peerId 46
2023-10-31 15:22:19 UTC Block creation: Batch received for epoch 38309, total 50 contributions, with 2 unique transactions.
2023-10-31 15:22:19 UTC Error sending keygen transactions ReturnValueInvalid
2023-10-31 15:22:19 UTC Error sending keygen transactions ReturnValueInvalid
2023-10-31 15:22:19 UTC Imported #38309 0x396d…6a43 (2 txs, 16.51 Mgas, 178 ms, 25.74 KiB)

time drift penalization

Every node runs its own clock.
When we calculate the median time of all reported block timestamps,
we can figure out whether a node's clock drifts into the future or into the past.

We can use this information to create a malus score.
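The measurement above can be sketched in a few lines: a node's drift is its reported timestamp minus the median of all reports. Function names and the timestamp values are illustrative.

```rust
// Median of reported block timestamps (seconds since epoch).
fn median(mut ts: Vec<i64>) -> i64 {
    ts.sort_unstable();
    let n = ts.len();
    if n % 2 == 1 { ts[n / 2] } else { (ts[n / 2 - 1] + ts[n / 2]) / 2 }
}

// Positive drift: the clock runs into the future; negative: into the past.
fn drift(reported: i64, all_reports: &[i64]) -> i64 {
    reported - median(all_reports.to_vec())
}

fn main() {
    let reports = vec![1_699_080_840, 1_699_080_841, 1_699_080_843, 1_699_080_900];
    // the outlier node sits 58 seconds in the future of the median (..842)
    assert_eq!(drift(1_699_080_900, &reports), 58);
    // a node near the median has negligible drift
    assert_eq!(drift(1_699_080_841, &reports), -1);
    println!("drift ok");
}
```

A malus score could then be any monotone function of the absolute drift, applied only beyond some tolerance.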

Nonce problem with announce availability and setting IP address

The nonce management does not work when multiple operations send transactions at about the same time.
In this example, we are announcing availability and we are announcing our own IP address.
One of those transactions gets replaced,
but the node software does not try to send it again, because it already did send a transaction.

We had nodes that were marked as unavailable although they were running,
and we had nodes that should have written their IP address but did not.

The nonce problem was probably the root of this.
A node restart solved it, because after the restart the node only had to either announce availability or announce the IP address, whichever transaction was dropped and replaced at the last boot.
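In the log below, both the availability transaction and the set-IP transaction are sent with nonce 0, so the pool drops one as replaced. A shared per-sender counter, initialized from the on-chain account nonce, avoids the collision. This is a hedged sketch with assumed names, not the actual transaction-sending code:

```rust
// Local nonce allocator: hand out consecutive nonces so that concurrent
// engine transactions (announce availability, set IP, ...) never collide.
struct NonceAllocator {
    next: u64, // initialized from the on-chain account nonce
}

impl NonceAllocator {
    fn next_nonce(&mut self) -> u64 {
        let n = self.next;
        self.next += 1;
        n
    }
}

fn main() {
    let mut alloc = NonceAllocator { next: 0 };
    let announce_availability_nonce = alloc.next_nonce();
    let set_internet_address_nonce = alloc.next_nonce();
    // the two transactions no longer share a nonce, so neither replaces the other
    assert_eq!(announce_availability_nonce, 0);
    assert_eq!(set_internet_address_nonce, 1);
    println!("nonces: {} {}", announce_availability_nonce, set_internet_address_nonce);
}
```

In the real engine the allocator would need to live behind the same lock as the transaction submission, and be re-synced from the chain if a transaction is dropped for other reasons.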

Log Entry:

2023-04-11 16:18:25 UTC Worker Hbbft2 WARN consensus  do_validator_engine_actions
2023-04-11 16:18:25 UTC Worker Hbbft2 INFO engine  sending announce availability transaction
2023-04-11 16:18:25 UTC Worker Hbbft2 INFO consensus  sending announce availability with nonce: 0
2023-04-11 16:18:25 UTC Worker Hbbft2 TRACE txqueue  Checking service transaction checker contract from 0x9640…84ce
2023-04-11 16:18:25 UTC Worker Hbbft2 TRACE txqueue  Checking service transaction checker contract from 0x9640…84ce
2023-04-11 16:18:25 UTC Worker Hbbft2 DEBUG txqueue  Service tx 0xeefaf483ec4fa37d5b63de03fb5956d068128db4b7f813387a7ffff678b3a83f below minimal gas price accepted
2023-04-11 16:18:25 UTC Worker Hbbft2 DEBUG txqueue  [0xeefaf483ec4fa37d5b63de03fb5956d068128db4b7f813387a7ffff678b3a83f] priority: Local
2023-04-11 16:18:25 UTC Worker Hbbft2 DEBUG txqueue  importing pool status: LightStatus { mem_usage: 0, transaction_count: 0, senders: 0 }
2023-04-11 16:18:25 UTC Worker Hbbft2 DEBUG txqueue  [0xeefaf483ec4fa37d5b63de03fb5956d068128db4b7f813387a7ffff678b3a83f] Added to the pool.
2023-04-11 16:18:25 UTC Worker Hbbft2 DEBUG txqueue  [0xeefaf483ec4fa37d5b63de03fb5956d068128db4b7f813387a7ffff678b3a83f] Sender: 0x9640…84ce, nonce: 0, gasPrice: 0, gas: 1000000, value: 0, dataLen: 68))
2023-04-11 16:18:25 UTC Worker Hbbft2 WARN engine  checking if internet address needs to be updated.
2023-04-11 16:18:25 UTC Worker Hbbft2 WARN engine  current Endpoint: 192.168.0.130:30303
2023-04-11 16:18:25 UTC Worker Hbbft2 WARN engine  stored validator address0.0.0.1:0
2023-04-11 16:18:25 UTC Worker Hbbft2 INFO consensus  set_validator_internet_address: ip: 192.168.0.130:30303 nonce: 0
2023-04-11 16:18:25 UTC Worker Hbbft2 TRACE txqueue  Checking service transaction checker contract from 0x9640…84ce
2023-04-11 16:18:25 UTC Worker Hbbft2 TRACE txqueue  Checking service transaction checker contract from 0x9640…84ce
2023-04-11 16:18:25 UTC Worker Hbbft2 DEBUG txqueue  Service tx 0xf7896b843fe6b66a2f6c0a403bd1e45009d6f96a53799880c8ae6b7987d696c7 below minimal gas price accepted
2023-04-11 16:18:25 UTC Worker Hbbft2 DEBUG txqueue  [0xf7896b843fe6b66a2f6c0a403bd1e45009d6f96a53799880c8ae6b7987d696c7] priority: Local
2023-04-11 16:18:25 UTC Worker Hbbft2 DEBUG txqueue  importing pool status: LightStatus { mem_usage: 72, transaction_count: 1, senders: 1 }
2023-04-11 16:18:25 UTC Worker Hbbft2 DEBUG txqueue  [0xf7896b843fe6b66a2f6c0a403bd1e45009d6f96a53799880c8ae6b7987d696c7] Added to the pool.
2023-04-11 16:18:25 UTC Worker Hbbft2 DEBUG txqueue  [0xf7896b843fe6b66a2f6c0a403bd1e45009d6f96a53799880c8ae6b7987d696c7] Sender: 0x9640…84ce, nonce: 0, gasPrice: 0, gas: 100000, value: 0, dataLen: 68))
2023-04-11 16:18:25 UTC Worker Hbbft2 DEBUG txqueue  [0xeefaf483ec4fa37d5b63de03fb5956d068128db4b7f813387a7ffff678b3a83f] Dropped. Replaced by [0xf7896b843fe6b66a2f6c0a403bd1e45009d6f96a53799880c8ae6b7987d696c7]

Softer Node Restart

Currently nodes schedule themselves for a reboot in a hard way.
This was just an experimental implementation and should be replaced with a clean implementation that allows the engine to request a node shutdown.
This is possibly the reason for a lot of databases that cannot continue the sync.

Header not found, Parent not found, Stage 5 verification failed

example:

2023-09-08 08:28:18  Header not found? : retract_step 2 n: 2793 h: 0x50cb…bd0a last_imported_block: 2794 last_imported_hash: 0xd41b…55b7 oldest_reorg 2719
2023-09-08 08:28:19  Stage 5 verification failed for #2794 (0xc72a…187c)
Block is ancient (current best block: #2794).
2023-09-08 08:28:19  Block import failed for #2795 (0xeb42…c312): Parent not found (0xc72a…187c) 
2023-09-08 08:28:19  
Bad block detected: Error(Msg("Parent not found"), State { next_error: None, backtrace: InternalBacktrace { backtrace: None } })
RLP: f90254f9024fa0c72a9de932fe45fd6841d1b1694da058fdd8f7b096d12191bc171748b0aa187ca01dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347940000000000000000000000000000000000000000a0e0ebb9f839743d6066ba5e54f69185bac0870938ce9945d91a97ff33145886dea056e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421a056e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421b901000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000001820aeb8411e1a300808464fb192ca09930f25f236a98804bad8cea85ef68979759f64109928f666ee379fd27f0a089b860b9e39c3e44128b1b1329efa6af0d2feb307d88727b17e4a4ec96827a474b45ef6a5dcfe0fb19f90079d996222443be0c0c2c73712ed5e3bfff35eadab0877c6275a81b40ed154a99cc31151a408b642cc7f0819b5b7d7328bab4ed8110f4474cc0c0
Header: Header { parent_hash: 0xc72a9de932fe45fd6841d1b1694da058fdd8f7b096d12191bc171748b0aa187c, timestamp: 1694177580, number: 2795, author: 0x0000000000000000000000000000000000000000, transactions_root: 0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421, uncles_hash: 0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347, extra_data: [153, 48, 242, 95, 35, 106, 152, 128, 75, 173, 140, 234, 133, 239, 104, 151, 151, 89, 246, 65, 9, 146, 143, 102, 110, 227, 121, 253, 39, 240, 160, 137], state_root: 0xe0ebb9f839743d6066ba5e54f69185bac0870938ce9945d91a97ff33145886de, receipts_root: 0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421, log_bloom: 0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000, gas_used: 0, gas_limit: 300000000, difficulty: 1, seal: [[184, 96, 185, 227, 156, 62, 68, 18, 139, 27, 19, 41, 239, 166, 175, 13, 47, 235, 48, 125, 136, 114, 123, 23, 228, 164, 236, 150, 130, 122, 71, 75, 69, 239, 106, 93, 207, 224, 251, 25, 249, 0, 121, 217, 150, 34, 36, 67, 190, 12, 12, 44, 115, 113, 46, 213, 227, 191, 255, 53, 234, 218, 176, 135, 124, 98, 117, 168, 27, 64, 237, 21, 74, 153, 204, 49, 21, 26, 64, 139, 100, 44, 199, 240, 129, 155, 91, 125, 115, 40, 186, 180, 237, 129, 16, 244, 71, 76]], base_fee_per_gas: None, hash: Some(0xeb42fe464afb0f8b342f0879ac964ca2ffa765b0f6d18659041450987d17c312) }
Uncles: 
Transactions:

2023-09-08 08:28:21  Syncing    #2794 0xd41b…55b7     0.00 blk/s    0.0 tx/s    0.0 Mgas/s      0+    0 Qed LI:#2920    9/27 peers   132 KiB chain 0 bytes queue  RPC:  0 conn,    0 req/s,    0 µs
2023-09-08 08:28:26  Syncing

Stage 3 verification failed.

Synckeygen failed with error: CallFailed("Transaction execution error (Couldn't find the transaction block's state in the chain).")
2023-06-15 09:43:42 Invalid seal for block #354286
2023-06-15 09:43:42 Stage 3 block verification failed for #354286
extra data: [208, 97, 175, 68, 68, 50, 17, 117, 129, 193, 122, 144, 51, 35, 43, 185, 152, 80, 51, 159, 19, 233, 250, 40, 88, 176, 226, 230, 215, 95, 159, 244] (0x4527…ea55)
Error: Error(Block(InvalidSeal), State { next_error: None, backtrace: InternalBacktrace { backtrace: None } })
2023-06-15 09:43:42  Bad block detected: Error(Block(InvalidSeal), State { next_error: None, backtrace: InternalBacktrace { backtrace: None } })
RLP: f90255f90250a0e6868d33e1de2f3add4ac996fc6d3c9e0bbc4694d2072a86a9a440b45ddb89eda01dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347940000000000000000000000000000000000000000a03c6df055eee1d9405c326dc3dfd46f4de67f1e82e141471ae1f61a83be6096b4a056e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421a056e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421b901000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000001830567ee843b9aca00808464830dfea0d061af444432117581c17a9033232bb99850339f13e9fa2858b0e2e6d75f9ff4b860a36e328e40b14fae3ec09f3084c5d5df9a9bcb3527b579282da8bbbcdd180491fded8559b16170d4743ead0e7566bd6a068e8e133638248ed6506710415d0c543cef2834fa6e0873a9a01ae4bf791a755cf02bbb67157d30efc75e5933321693c0c0
Header: Header { parent_hash: 0xe6868d33e1de2f3add4ac996fc6d3c9e0bbc4694d2072a86a9a440b45ddb89ed, timestamp: 1686310398, number: 354286, author: 0x0000000000000000000000000000000000000000, transactions_root: 0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421, uncles_hash: 0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347, extra_data: [208, 97, 175, 68, 68, 50, 17, 117, 129, 193, 122, 144, 51, 35, 43, 185, 152, 80, 51, 159, 19, 233, 250, 40, 88, 176, 226, 230, 215, 95, 159, 244], state_root: 0x3c6df055eee1d9405c326dc3dfd46f4de67f1e82e141471ae1f61a83be6096b4, receipts_root: 0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421, log_bloom: 0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000, gas_used: 0, gas_limit: 1000000000, difficulty: 1, seal: [[184, 96, 163, 110, 50, 142, 64, 177, 79, 174, 62, 192, 159, 48, 132, 197, 213, 223, 154, 155, 203, 53, 39, 181, 121, 40, 45, 168, 187, 188, 221, 24, 4, 145, 253, 237, 133, 89, 177, 97, 112, 212, 116, 62, 173, 14, 117, 102, 189, 106, 6, 142, 142, 19, 54, 56, 36, 142, 214, 80, 103, 16, 65, 93, 12, 84, 60, 239, 40, 52, 250, 110, 8, 115, 169, 160, 26, 228, 191, 121, 26, 117, 92, 240, 43, 187, 103, 21, 125, 48, 239, 199, 94, 89, 51, 50, 22, 147]], base_fee_per_gas: None, hash: Some(0x45276fdaabcf20e29a4e4687dbe235c13bb6597fab2248d5032f86934a3fea55) }
Uncles:
Transactions:
2023-06-15 09:43:50 Synckeygen failed with error: CallFailed("Transaction execution error (Couldn't find the transaction block's state in the chain).")
2023-06-15 09:43:50 Invalid seal for block #354297
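The error "Couldn't find the transaction block's state in the chain" suggests the call was made against a block whose state had already been pruned. A minimal sketch of a guard that could reject such calls up front, assuming the earliest retained state block is known (function and parameter names are hypothetical):

```rust
// Hypothetical guard: a contract call against a historical block can only
// succeed if that block's state has not been pruned yet.
fn state_available(earliest_state_block: u64, requested_block: u64) -> bool {
    requested_block >= earliest_state_block
}

fn main() {
    // With state pruned below block 354000, a call at 353999 would be
    // rejected up front instead of surfacing as CallFailed in Synckeygen.
    assert!(state_available(354000, 354286));
    assert!(!state_available(354000, 353999));
}
```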

Prometheus logging can result in a deadlock

happened on Hbbft2:

2023-05-14 13:17:43  removing 0 reserved peers, because they are neither a pending validator nor a current validator.
2023-05-14 13:17:43  POSDAO epoch changed from 5751 to 5752.
2023-05-14 13:17:43  report new epoch: 5752
2023-05-14 13:17:56  1 deadlock(s) detected
2023-05-14 13:17:56  Deadlock #0
2023-05-14 13:17:56  Thread Id 140085339076352
2023-05-14 13:17:56     0:     0x55be3c7a3656 - backtrace::backtrace::libunwind::trace::he8f90d654b8dc5ea
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/backtrace-0.3.56/src/backtrace/libunwind.rs:90:5
                           backtrace::backtrace::trace_unsynchronized::he1343eb1eec60890
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/backtrace-0.3.56/src/backtrace/mod.rs:66:5
                           backtrace::backtrace::trace::ha23987a553685e3a
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/backtrace-0.3.56/src/backtrace/mod.rs:53:14
                           backtrace::capture::Backtrace::create::h415567cce1e4ac38
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/backtrace-0.3.56/src/capture.rs:176:9
   1:     0x55be3c7a3583 - backtrace::capture::Backtrace::new::h16b94d5845ee5022
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/backtrace-0.3.56/src/capture.rs:140:22
   2:     0x55be3c78b0b9 - parking_lot_core::parking_lot::deadlock_impl::on_unpark::h2768f2c8fd347845
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/parking_lot_core-0.8.3/src/parking_lot.rs:1179:32
   3:     0x55be3b86e6b8 - parking_lot_core::parking_lot::deadlock::on_unpark::he364b6c75340f6a3
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/parking_lot_core-0.8.3/src/parking_lot.rs:1112:9
                           parking_lot_core::parking_lot::park::{{closure}}::hef55225fa08bc95a
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/parking_lot_core-0.8.3/src/parking_lot.rs:613:17
                           parking_lot_core::parking_lot::with_thread_data::h01cef5c602b00f0e
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/parking_lot_core-0.8.3/src/parking_lot.rs:183:5
                           parking_lot_core::parking_lot::park::hbff0db6976c66144
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/parking_lot_core-0.8.3/src/parking_lot.rs:576:5
                           parking_lot::raw_rwlock::RawRwLock::lock_common::hbe012f6602ff15f9
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/parking_lot-0.11.1/src/raw_rwlock.rs:1110:17
                           parking_lot::raw_rwlock::RawRwLock::lock_shared_slow::hc30010f2017ce8cb
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/parking_lot-0.11.1/src/raw_rwlock.rs:714:9
   4:     0x55be3c176ad7 - <parking_lot::raw_rwlock::RawRwLock as lock_api::rwlock::RawRwLock>::lock_shared::hcc166ebfb197ce55
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/parking_lot-0.11.1/src/raw_rwlock.rs:110:26
                           lock_api::rwlock::RwLock<R,T>::read::hfbb023e42636a8a5
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/lock_api-0.4.6/src/rwlock.rs:448:9
                           <ethcore::client::client::Client as ethcore::client::traits::BlockChainClient>::pruning_info::h5b36257a895e08fe
                               at /home/dmdnode/v0.8.6-rc1/diamond-node-git/crates/ethcore/src/client/client.rs:2798:29
   5:     0x55be3c188f3c - <ethcore::client::client::Client as stats::PrometheusMetrics>::prometheus_metrics::h9eb2723d82cc59c1
                               at /home/dmdnode/v0.8.6-rc1/diamond-node-git/crates/ethcore/src/client/client.rs:3624:24
   6:     0x55be3b9a6042 - diamond_node::metrics::handle_request::hae3a7c30a9af123c
                               at /home/dmdnode/v0.8.6-rc1/diamond-node-git/bin/oe/metrics.rs:53:13
                           diamond_node::metrics::start_prometheus_metrics::{{closure}}::{{closure}}::h6c9dfa62f925fdb3
                               at /home/dmdnode/v0.8.6-rc1/diamond-node-git/bin/oe/metrics.rs:108:17
                           <hyper::service::service::ServiceFnOk<F,ReqBody> as hyper::service::service::Service>::call::he59da67aac3a97da
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/hyper-0.12.35/src/service/service.rs:155:20
   7:     0x55be3b99b2ab - <hyper::proto::h1::dispatch::Server<S> as hyper::proto::h1::dispatch::Dispatch>::recv_msg::h87817851a5449a69
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/hyper-0.12.35/src/proto/h1/dispatch.rs:431:31
   8:     0x55be3bb454bd - hyper::proto::h1::dispatch::Dispatcher<D,Bs,I,T>::poll_read_head::hd647fefd711084ff
                           hyper::proto::h1::dispatch::Dispatcher<D,Bs,I,T>::poll_read::h55f90ac45b8901b0
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/hyper-0.12.35/src/proto/h1/dispatch.rs:163:28
   9:     0x55be3b9e75b9 - hyper::proto::h1::dispatch::Dispatcher<D,Bs,I,T>::poll_loop::h23a2ec8dd8c18868
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/hyper-0.12.35/src/proto/h1/dispatch.rs:129:13
                           hyper::proto::h1::dispatch::Dispatcher<D,Bs,I,T>::poll_inner::h22f06291f1e3355b
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/hyper-0.12.35/src/proto/h1/dispatch.rs:106:20
                           hyper::proto::h1::dispatch::Dispatcher<D,Bs,I,T>::poll_catch::h33e0e63de1d9dbab
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/hyper-0.12.35/src/proto/h1/dispatch.rs:93:9
                           <hyper::proto::h1::dispatch::Dispatcher<D,Bs,I,T> as futures::future::Future>::poll::h310c433feb1b5afa
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/hyper-0.12.35/src/proto/h1/dispatch.rs:374:9
                           <futures::future::either::Either<A,B> as futures::future::Future>::poll::h634d73ffa93abc69
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/futures-0.1.29/src/future/either.rs:35:37
                           futures::future::option::<impl futures::future::Future for core::option::Option<F>>::poll::h31b358d9eadb1d3c
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/futures-0.1.29/src/future/option.rs:12:32
                           <hyper::server::conn::upgrades::UpgradeableConnection<I,S,E> as futures::future::Future>::poll::hd345d9f3420d51eb
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/hyper-0.12.35/src/server/conn.rs:948:23
  10:     0x55be3b9e6248 - <hyper::server::conn::spawn_all::NewSvcTask<I,N,S,E,W> as futures::future::Future>::poll::h2ab98e29eea72558
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/hyper-0.12.35/src/server/conn.rs:888:32
  11:     0x55be3c613de4 - <alloc::boxed::Box<F> as futures::future::Future>::poll::h894625b41fa8f2bd
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/futures-0.1.29/src/future/mod.rs:113:13
                           futures::task_impl::Spawn<T>::poll_future_notify::{{closure}}::h93d92e96b2881a7a
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/futures-0.1.29/src/task_impl/mod.rs:329:45
                           futures::task_impl::Spawn<T>::enter::{{closure}}::h5bebde7ebfe368df
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/futures-0.1.29/src/task_impl/mod.rs:399:27
                           futures::task_impl::std::set::hea93e4d2353a0145
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/futures-0.1.29/src/task_impl/std/mod.rs:83:13
                           futures::task_impl::Spawn<T>::enter::h0143ad0126ebbd90
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/futures-0.1.29/src/task_impl/mod.rs:399:9
                           futures::task_impl::Spawn<T>::poll_fn_notify::h185eec319dc08456
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/futures-0.1.29/src/task_impl/mod.rs:291:9
                           futures::task_impl::Spawn<T>::poll_future_notify::h2a95cfc7c5c5074c
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/futures-0.1.29/src/task_impl/mod.rs:329:9
                           tokio_threadpool::task::Task::run::{{closure}}::h6e9446654b8f2809
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.18/src/task/mod.rs:145:17
                           core::ops::function::FnOnce::call_once::h94329b367c79520a
                               at /rustc/9eb3afe9ebe9c7d2b84b71002d44f4a0edac95e0/library/core/src/ops/function.rs:250:5
                           <core::panic::unwind_safe::AssertUnwindSafe<F> as core::ops::function::FnOnce<()>>::call_once::h48a5ce0f3b771954
                               at /rustc/9eb3afe9ebe9c7d2b84b71002d44f4a0edac95e0/library/core/src/panic/unwind_safe.rs:271:9
                           std::panicking::try::do_call::h31fdaa88d5ddd6bc
                               at /rustc/9eb3afe9ebe9c7d2b84b71002d44f4a0edac95e0/library/std/src/panicking.rs:483:40
                           std::panicking::try::ha5afbd13d59f9472
                               at /rustc/9eb3afe9ebe9c7d2b84b71002d44f4a0edac95e0/library/std/src/panicking.rs:447:19
                           std::panic::catch_unwind::h1dc64362ca43cec6
                               at /rustc/9eb3afe9ebe9c7d2b84b71002d44f4a0edac95e0/library/std/src/panic.rs:140:14
                           tokio_threadpool::task::Task::run::hf2592af154934a1b
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.18/src/task/mod.rs:130:19
                           tokio_threadpool::worker::Worker::run_task2::h37f47a2c6e34ea87
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.18/src/worker/mod.rs:567:9
                           tokio_threadpool::worker::Worker::run_task::h1395086bcd164435
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.18/src/worker/mod.rs:459:19
  12:     0x55be3c61299d - tokio_threadpool::worker::Worker::try_run_owned_task::h88e88a8241a88ce6
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.18/src/worker/mod.rs:390:17
                           tokio_threadpool::worker::Worker::try_run_task::h4d3c033cc0c18eca
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.18/src/worker/mod.rs:297:12
                           tokio_threadpool::worker::Worker::run::hb58558875c254828
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.18/src/worker/mod.rs:241:21
  13:     0x55be3c5f954d - tokio::runtime::threadpool::builder::Builder::build::{{closure}}::{{closure}}::{{closure}}::{{closure}}::haf65d9a91a3265c7
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-0.1.22/src/runtime/threadpool/builder.rs:390:29
                           tokio_timer::timer::handle::with_default::h8ff29d78e66c5a2f
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-timer-0.2.13/src/timer/handle.rs:74:5
                           tokio::runtime::threadpool::builder::Builder::build::{{closure}}::{{closure}}::{{closure}}::hc642dd644d817fae
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-0.1.22/src/runtime/threadpool/builder.rs:382:25
                           tokio_timer::clock::clock::with_default::h15778afb3910f45b
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-timer-0.2.13/src/clock/clock.rs:125:5
                           tokio::runtime::threadpool::builder::Builder::build::{{closure}}::{{closure}}::hcca6c07048a98f22
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-0.1.22/src/runtime/threadpool/builder.rs:381:21
                           tokio_reactor::with_default::h12973e649dffe57b
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-reactor-0.1.12/src/lib.rs:220:5
                           tokio::runtime::threadpool::builder::Builder::build::{{closure}}::ha76f4488573222df
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-0.1.22/src/runtime/threadpool/builder.rs:380:17
  14:     0x55be3c60a2dc - tokio_threadpool::callback::Callback::call::h5c4044e51c4aa315
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.18/src/callback.rs:22:9
                           tokio_threadpool::worker::Worker::do_run::{{closure}}::{{closure}}::hdf2eeda86de91be9
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.18/src/worker/mod.rs:127:21
                           tokio_executor::global::with_default::{{closure}}::h16490220ce3391a5
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-executor-0.1.10/src/global.rs:221:9
                           std::thread::local::LocalKey<T>::try_with::h362cf44613c9e3b1
                               at /rustc/9eb3afe9ebe9c7d2b84b71002d44f4a0edac95e0/library/std/src/thread/local.rs:446:16
                           std::thread::local::LocalKey<T>::with::hd0b3959126c34ae3
                               at /rustc/9eb3afe9ebe9c7d2b84b71002d44f4a0edac95e0/library/std/src/thread/local.rs:422:9
                           tokio_executor::global::with_default::h37a6882753676a2a
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-executor-0.1.10/src/global.rs:190:5
                           tokio_threadpool::worker::Worker::do_run::{{closure}}::hf085a17262bdab46
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.18/src/worker/mod.rs:125:13
                           std::thread::local::LocalKey<T>::try_with::h950fa03ac57b299f
                               at /rustc/9eb3afe9ebe9c7d2b84b71002d44f4a0edac95e0/library/std/src/thread/local.rs:446:16
                           std::thread::local::LocalKey<T>::with::hb70fb4ab5e7e9da9
                               at /rustc/9eb3afe9ebe9c7d2b84b71002d44f4a0edac95e0/library/std/src/thread/local.rs:422:9
                           tokio_threadpool::worker::Worker::do_run::h6cd755f3ee1320fb
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.18/src/worker/mod.rs:116:9
                           tokio_threadpool::pool::Pool::spawn_thread::{{closure}}::h3252b07619505c46
                               at /home/dmdnode/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.18/src/pool/mod.rs:345:21
                           std::sys_common::backtrace::__rust_begin_short_backtrace::h5dacdbf32b05d787
                               at /rustc/9eb3afe9ebe9c7d2b84b71002d44f4a0edac95e0/library/std/src/sys_common/backtrace.rs:121:18
  15:     0x55be3c60b293 - std::thread::Builder::spawn_unchecked_::{{closure}}::{{closure}}::hc77fd360e874fd99
                               at /rustc/9eb3afe9ebe9c7d2b84b71002d44f4a0edac95e0/library/std/src/thread/mod.rs:558:17
                           <core::panic::unwind_safe::AssertUnwindSafe<F> as core::ops::function::FnOnce<()>>::call_once::h2185beb7115eb0af
                               at /rustc/9eb3afe9ebe9c7d2b84b71002d44f4a0edac95e0/library/core/src/panic/unwind_safe.rs:271:9
                           std::panicking::try::do_call::hb3ddaf96778e8429
                               at /rustc/9eb3afe9ebe9c7d2b84b71002d44f4a0edac95e0/library/std/src/panicking.rs:483:40
                           std::panicking::try::he141366359aeb8b1
                               at /rustc/9eb3afe9ebe9c7d2b84b71002d44f4a0edac95e0/library/std/src/panicking.rs:447:19
                           std::panic::catch_unwind::h19935dc880e1c695
                               at /rustc/9eb3afe9ebe9c7d2b84b71002d44f4a0edac95e0/library/std/src/panic.rs:140:14
                           std::thread::Builder::spawn_unchecked_::{{closure}}::h861244ed4e7a379a
                               at /rustc/9eb3afe9ebe9c7d2b84b71002d44f4a0edac95e0/library/std/src/thread/mod.rs:557:30
                           core::ops::function::FnOnce::call_once{{vtable.shim}}::heed0ba115bab04eb
                               at /rustc/9eb3afe9ebe9c7d2b84b71002d44f4a0edac95e0/library/core/src/ops/function.rs:250:5
  16:     0x55be3ca572f3 - <alloc::boxed::Box<F,A> as core::ops::function::FnOnce<Args>>::call_once::h3205ec2d7fc232c5
                               at /rustc/9eb3afe9ebe9c7d2b84b71002d44f4a0edac95e0/library/alloc/src/boxed.rs:1988:9
                           <alloc::boxed::Box<F,A> as core::ops::function::FnOnce<Args>>::call_once::h3bb5daec8177f56b
                               at /rustc/9eb3afe9ebe9c7d2b84b71002d44f4a0edac95e0/library/alloc/src/boxed.rs:1988:9
                           std::sys::unix::thread::Thread::new::thread_start::had7b8061e306bb5c
                               at /rustc/9eb3afe9ebe9c7d2b84b71002d44f4a0edac95e0/library/std/src/sys/unix/thread.rs:108:17
  17:     0x7f683a5b06db - start_thread
  18:     0x7f6839d3761f - clone
  19:                0x0 - <unknown>

2023-05-14 13:18:06  eth_accounts is deprecated and will be removed in future versions: Account management is being phased out see #9997 for alternatives.
2023-05-14 13:18:39  do_validator_engine_actions
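The trace shows the Prometheus handler blocking inside `pruning_info` while taking a shared lock that another thread already holds exclusively; `parking_lot` locks are not reentrant, so the handler waits forever. A minimal sketch of a non-blocking alternative, using `std::sync::RwLock` instead of `parking_lot` and a hypothetical stale-value fallback:

```rust
use std::sync::{RwLock, TryLockError};

// Sketch: a metrics handler that uses try_read and falls back to a stale
// value cannot block behind a thread holding the write lock.
fn read_metric_nonblocking(lock: &RwLock<u64>, stale: u64) -> u64 {
    match lock.try_read() {
        Ok(guard) => *guard,
        // A writer is active: report the stale value instead of blocking.
        Err(TryLockError::WouldBlock) => stale,
        Err(TryLockError::Poisoned(e)) => *e.into_inner(),
    }
}

fn main() {
    let lock = RwLock::new(42);
    assert_eq!(read_metric_nonblocking(&lock, 0), 42);
    // Simulate the client holding the write lock during the metrics request:
    let _writer = lock.write().unwrap();
    assert_eq!(read_metric_nonblocking(&lock, 0), 0); // stale value, no deadlock
}
```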

"protection from hang" could lead to slow sync

Wrong information about the available best block of a peer leads to that peer's block sync information being ignored.
It could be the case that "all" nodes send wrong information, so the node cannot process further blocks during block sync.

This is just a theory for now and requires confirmation.
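The suspected failure mode can be sketched as follows: if peers are considered for block sync only when their advertised best block is ahead of ours, then wrong (too low) advertisements from all peers leave no eligible peer and the sync stalls. The selection rule below is an assumption for illustration, not the actual sync code:

```rust
// Hypothetical peer selection: only peers advertising a best block ahead of
// ours are eligible as block sync sources.
fn eligible_peers(our_best: u64, advertised: &[(u32, u64)]) -> Vec<u32> {
    advertised
        .iter()
        .filter(|(_, best)| *best > our_best)
        .map(|(id, _)| *id)
        .collect()
}

fn main() {
    // Peers really at block 2920, but all advertise a stale 2700 while we
    // are at 2795: no eligible peer, sync is stuck.
    assert!(eligible_peers(2795, &[(1, 2700), (2, 2700)]).is_empty());
    // One correct advertisement is enough to make progress again.
    assert_eq!(eligible_peers(2795, &[(1, 2700), (2, 2920)]), vec![2]);
}
```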

block creation stuck without error message

An active validator (hbbft10) stopped participating in block production.

2023-09-19 03:27:10  Imported #74739 0x639c…edb4 (0 txs, 0.00 Mgas, 8 ms, 0.59 KiB)
2023-09-19 03:27:10  No session exists for peerId 46
2023-09-19 03:27:27   36/50 peers   4 MiB chain 0 bytes queue  RPC:  0 conn,    0 req/s,    0 µs
2023-09-19 03:28:02   35/50 peers   4 MiB chain 0 bytes queue  RPC:  0 conn,    0 req/s,    0 µs
2023-09-19 03:28:08  Block creation: Honeybadger epoch 74740, Transactions subset target size: 4, actual size: 0, from available 0.
2023-09-19 03:28:09  No session exists for peerId 24
2023-09-19 03:28:09  No session exists for peerId 40
2023-09-19 03:28:09  No session exists for peerId 38
2023-09-19 03:28:10  Block creation: Batch received for epoch 74740, total 0 contributions, with 0 unique transactions.
2023-09-19 03:28:10  Detected an attempt to send a hbbft contribution for block 74741 before the previous block was imported to the chain.

2023-09-19 03:28:10  Imported #74740 0x0a34…b018 (0 txs, 0.00 Mgas, 7 ms, 0.59 KiB)
2023-09-19 03:28:10  No session exists for peerId 5
2023-09-19 03:28:13  do_validator_engine_actions
2023-09-19 03:28:32   36/50 peers   4 MiB chain 0 bytes queue  RPC:  0 conn,    0 req/s,    0 µs
2023-09-19 03:28:45  Block creation: Honeybadger epoch 74741, Transactions subset target size: 4, actual size: 1, from available 1.
2023-09-19 03:28:45  No session exists for peerId 93
2023-09-19 03:28:46  Block creation: Batch received for epoch 74741, total 20 contributions, with 1 unique transactions.
2023-09-19 03:28:47  Imported #74741 0xfdcb…6de6 (1 txs, 0.21 Mgas, 9 ms, 0.76 KiB)
2023-09-19 03:29:02   34/50 peers   4 MiB chain 0 bytes queue  RPC:  0 conn,    0 req/s,    0 µs
2023-09-19 03:29:09  Block creation: Honeybadger epoch 74742, Transactions subset target size: 4, actual size: 0, from available 0.
2023-09-19 03:29:09  No session exists for peerId 27
2023-09-19 03:29:09  No session exists for peerId 12
2023-09-19 03:29:09  No session exists for peerId 73
2023-09-19 03:29:11  Block creation: Batch received for epoch 74742, total 0 contributions, with 0 unique transactions.
2023-09-19 03:29:12  Detected an attempt to send a hbbft contribution for block 74743 before the previous block was imported to the chain.
2023-09-19 03:29:12  Imported #74742 0x9cba…3408 (0 txs, 0.00 Mgas, 27 ms, 0.59 KiB)
2023-09-19 03:29:13  Block creation: Honeybadger epoch 74743, Transactions subset target size: 4, actual size: 0, from available 0.
2023-09-19 03:29:13  No session exists for peerId 14
2023-09-19 03:29:14  No session exists for peerId 11
2023-09-19 03:29:14  No session exists for peerId 73
2023-09-19 03:29:15  No session exists for peerId 73
2023-09-19 03:29:15  No session exists for peerId 70
2023-09-19 03:29:16  No session exists for peerId 36
2023-09-19 03:29:16  No session exists for peerId 38
2023-09-19 03:29:16  No session exists for peerId 87
2023-09-19 03:29:20  Sealing message for block #74744 could not be processed due to missing/mismatching network info.
2023-09-19 03:29:20  Sealing message for block #74744 could not be processed due to missing/mismatching network info.
2023-09-19 03:29:20  Sealing message for block #74744 could not be processed due to missing/mismatching network info.
2023-09-19 03:29:20  Sealing message for block #74744 could not be processed due to missing/mismatching network info.
2023-09-19 03:29:20  Sealing message for block #74744 could not be processed due to missing/mismatching network info.
2023-09-19 03:29:20  Sealing message for block #74744 could not be processed due to missing/mismatching network info.
2023-09-19 03:29:20  Sealing message for block #74744 could not be processed due to missing/mismatching network info.
2023-09-19 03:29:20  Sealing message for block #74744 could not be processed due to missing/mismatching network info.
2023-09-19 03:29:20  Sealing message for block #74744 could not be processed due to missing/mismatching network info.
2023-09-19 03:29:20  Sealing message for block #74744 could not be processed due to missing/mismatching network info.
2023-09-19 03:29:20  Sealing message for block #74744 could not be processed due to missing/mismatching network info.
2023-09-19 03:29:20  Sealing message for block #74744 could not be processed due to missing/mismatching network info.
2023-09-19 03:29:20  Sealing message for block #74744 could not be processed due to missing/mismatching network info.
2023-09-19 03:29:20  Sealing message for block #74744 could not be processed due to missing/mismatching network info.
2023-09-19 03:29:20  No session exists for peerId 13
2023-09-19 03:29:20  No session exists for peerId 43
2023-09-19 03:29:20  Block creation: Batch received for epoch 74743, total 0 contributions, with 0 unique transactions.
2023-09-19 03:29:20  adding reserved peer: enode://3e55dcaf18c9d00d2c55cf409cb756670c74add47d915deb1ebc684e3826e977823219eceb76d13811ff2ab13e5c62f1ae8740bf6b2f43b893d8a3b736254864@85.190.254.238:40301
2023-09-19 03:29:21  added reserved peer: ValidatorConnectionData { staking_address: 0xf3cadd08b874f64b922885a05b8fcd4a156b6e02, socket_addr: 85.190.254.238:40301, public_key: 3e..64, peer_string: "enode://3e55dcaf18c9d00d2c55cf409cb756670c74add47d915deb1ebc684e3826e977823219eceb76d13811ff2ab13e5c62f1ae8740bf6b2f43b893d8a3b736254864@85.190.254.238:40301", mining_address: 0xdaf1846c6821fca2a38f4ea102bd5a0ca46f8fa1 }
2023-09-19 03:29:21  adding reserved peer: enode://6f5736987891dca46a4e9e6afccd2980e49b2746382bd2a2a42661c32e93582d5a4c2697538d842abded125eaa51d99cae653c7b227fc175f6da7f885366009a@144.126.144.51:31314
2023-09-19 03:29:21  added reserved peer: ValidatorConnectionData { staking_address: 0x12c5f5b3ab11156c1e8d55c78ce32f14416e5723, socket_addr: 144.126.144.51:31314, public_key: 6f..9a, peer_string: "enode://6f5736987891dca46a4e9e6afccd2980e49b2746382bd2a2a42661c32e93582d5a4c2697538d842abded125eaa51d99cae653c7b227fc175f6da7f885366009a@144.126.144.51:31314", mining_address: 0x33c31926feddebfb9f2c495425cd98c882af79fe }
2023-09-19 03:29:21  adding reserved peer: enode://563150f60630d8bb4c5d8e094b5a626e8922cf01e7b56f7037d62d253597b0686bdcd1d2fb16cdbca5e2275185d4abc3d5118b5b1c44365cce9f13ddc02d8c0b@194.163.160.191:31311
2023-09-19 03:29:21  added reserved peer: ValidatorConnectionData { staking_address: 0x4a22c0512aacccf5b4cd7a53e15fb56bf2a73d0e, socket_addr: 194.163.160.191:31311, public_key: 56..0b, peer_string: "enode://563150f60630d8bb4c5d8e094b5a626e8922cf01e7b56f7037d62d253597b0686bdcd1d2fb16cdbca5e2275185d4abc3d5118b5b1c44365cce9f13ddc02d8c0b@194.163.160.191:31311", mining_address: 0x246cfddc104e4a25897b16999d8d086270c18292 }
2023-09-19 03:29:21  Imported #74743 0x07c1…7c2d (0 txs, 0.00 Mgas, 7 ms, 0.59 KiB)
2023-09-19 03:29:21  No session exists for peerId 13
2023-09-19 03:29:21  No session exists for peerId 18
2023-09-19 03:29:21  No session exists for peerId 43
2023-09-19 03:29:21  No session exists for peerId 48
2023-09-19 03:29:21  No session exists for peerId 67
2023-09-19 03:29:21  No session exists for peerId 101
2023-09-19 03:29:21  No session exists for peerId 52
2023-09-19 03:29:21  No session exists for peerId 89
2023-09-19 03:29:21  Block creation: Honeybadger epoch 74744, Transactions subset target size: 5, actual size: 5, from available 17.
2023-09-19 03:29:24  Sealing message for block #74745 could not be processed due to missing/mismatching network info.
2023-09-19 03:29:24  Sealing message for block #74745 could not be processed due to missing/mismatching network info.
2023-09-19 03:29:24  Sealing message for block #74745 could not be processed due to missing/mismatching network info.
2023-09-19 03:29:24  Sealing message for block #74745 could not be processed due to missing/mismatching network info.
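The "Transactions subset target size: 4, actual size: 0, from available 0" lines above reflect each validator proposing a bounded subset of its pending transactions as its contribution. A hedged illustration of how the "actual size" in the log relates to the target and the pool (the bound itself is an assumption, the real selection also randomizes which transactions are picked):

```rust
// Hypothetical illustration of the contribution log line: every validator
// proposes at most `target` transactions; the actual size is bounded by
// what is available in its pending pool.
fn contribution_size(target: usize, available: usize) -> usize {
    target.min(available)
}

fn main() {
    // "target size: 4, actual size: 0, from available 0"
    assert_eq!(contribution_size(4, 0), 0);
    // "target size: 5, actual size: 5, from available 17"
    assert_eq!(contribution_size(5, 17), 5);
    // "target size: 4, actual size: 1, from available 1"
    assert_eq!(contribution_size(4, 1), 1);
}
```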

syncing nodes communicate with the current hbbft set.

example on alpha testnet:

2023-07-10 20:06:10  Worker devp2p2 TRACE consensus  Cached Messages: Trying to send cached messages to 0xd66dfb161550df9dd259a6495dec9e80a8dc66b94b377c9b0339d960035c9084201ead161eddb81f96b3424d5e3fa4d33931e4c72eaa666a50db89e776285244
2023-07-10 20:06:11  Worker Client2 TRACE consensus  Received message of idx 17  Message { epoch: 380988, content: Subset(Message { proposer_id: fc..f4, content: Broadcast(Echo(Proof { #8, root_hash: 59b6..e9cf, value: 2024..6d1a, .. })) }) } from NodeId(0xd66d…5244)
2023-07-10 20:06:11  Worker Client1 TRACE consensus  Received message of idx 24  Message { epoch: 9160, content: Subset(Message { proposer_id: f5..51, content: Broadcast(Value(Proof { #24, root_hash: 3ebe..4c63, value: 83fc..e3d5, .. })) }) } from NodeId(0xf515…bb51)
2023-07-10 20:06:11  Worker Client3 TRACE consensus  Received message of idx 25  Message { epoch: 9160, content: Subset(Message { proposer_id: f5..51, content: Broadcast(Echo(Proof { #22, root_hash: 3ebe..4c63, value: e335..e829, .. })) }) } from NodeId(0xf515…bb51)
2023-07-10 20:06:13  Worker Client2 TRACE consensus  Received message of idx 24  Message { epoch: 345, content: Subset(Message { proposer_id: 9f..f6, content: Broadcast(Value(Proof { #24, root_hash: c7f4..b494, value: 1ded..8659, .. })) }) } from NodeId(0x9f49…b8f6)
2023-07-10 20:06:13  Worker Client3 TRACE consensus  Received message of idx 25  Message { epoch: 345, content: Subset(Message { proposer_id: 9f..f6, content: Broadcast(Echo(Proof { #13, root_hash: c7f4..b494, value: 3f09..1a88, .. })) }) } from NodeId(0x9f49…b8f6)

It looks like nodes that are currently syncing are sending hbbft messages.
This is unexpected behavior:
a node that is in the middle of a sync process should not send hbbft messages.
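A minimal sketch of the kind of guard that could prevent this, assuming a major-sync flag is available on the message dispatch path (the names below are illustrative, not the actual diamond-node API):

```rust
// Hypothetical sketch: drop outgoing hbbft messages while a major sync is
// in progress. Names are illustrative, not the real diamond-node API.
struct HbbftSender {
    major_syncing: bool,
}

impl HbbftSender {
    /// Returns true if the message was dispatched, false if it was dropped
    /// because the node is still syncing.
    fn try_send(&self, msg: &str) -> bool {
        if self.major_syncing {
            // A syncing node must not participate in consensus yet.
            return false;
        }
        // ... actual network dispatch would happen here ...
        let _ = msg;
        true
    }
}
```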

Test fail: test_moc_to_first_validator

The test probably fails for a known reason:

New validators are not treated as "active" immediately after they are added.

They need to send their availability announcement first, but the current simulated network implementation does not do that.

This is just a theory that still needs to be confirmed.
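As a toy model of the theory (hypothetical types, not the real staking contract logic): a validator only counts as active once it has announced availability, so a simulated network that skips the announcement would leave new validators inactive:

```rust
// Toy model of the theory: a freshly added validator is not active until it
// has announced its availability. Types and fields are hypothetical.
#[derive(Default)]
struct SimValidator {
    announced_availability: bool,
}

impl SimValidator {
    fn announce_availability(&mut self) {
        self.announced_availability = true;
    }

    fn is_active(&self) -> bool {
        self.announced_availability
    }
}
```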

prometheus: is_major_syncing

Wrong information about is_major_syncing could be the root cause of some of the problems we encounter.
To narrow that down, we need this information available in Grafana.
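A sketch of exposing the flag, rendering it by hand in the Prometheus text exposition format that Grafana can scrape (the real code would likely register a gauge with the node's existing metrics endpoint instead):

```rust
// Sketch: render is_major_syncing as a Prometheus gauge in text
// exposition format, so the value can be scraped and graphed in Grafana.
fn render_is_major_syncing(is_major_syncing: bool) -> String {
    format!(
        "# HELP is_major_syncing Whether the node is currently in a major sync.\n\
         # TYPE is_major_syncing gauge\n\
         is_major_syncing {}\n",
        if is_major_syncing { 1 } else { 0 }
    )
}
```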

Block information

There is no log output showing which block the node software is currently running on.
TODO:
Extend the log output and provide this information during hbbft initialization (which data was used to initialize hbbft).
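A sketch of the log line that could be emitted during hbbft initialization (the function and field names are assumptions, not the existing code):

```rust
// Sketch: build the log line emitted when hbbft is initialized, stating
// which block the engine was initialized from. Field names are assumptions.
fn hbbft_init_log_line(block_number: u64, block_hash: &str, num_validators: usize) -> String {
    format!(
        "Initializing hbbft from block #{} ({}) with {} validators",
        block_number, block_hash, num_validators
    )
}
```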
