Comments (5)
Some questions that should help solve this problem:
- Is your master up?
- Is there any worker connected to the master? How many?
When you receive the message "no decoder available", the server is telling you that no free worker is connected to the master at that moment. You should check why this is happening (maybe someone else is using your server).
from docker-kaldi-gstreamer-server.
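The first check above (is the master up?) can be scripted. A minimal sketch, assuming the master listens on its default port 8888 (adjust host and port to your setup):

```python
import socket

def master_is_up(host="localhost", port=8888, timeout=2.0):
    """Return True if a TCP connection to the master port succeeds.

    This only tells you the master is listening; whether a worker is
    connected still has to be checked in the master/worker logs.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

If this returns False, the "no decoder available" question is moot: nothing is listening on the port at all.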
Hi!
Today I ran into the same issue. The worker does not come up with this model.
root@b14b54d5a69c:/opt# cat worker.log
libudev: udev_has_devtmpfs: name_to_handle_at on /dev: Operation not permitted
libdc1394 error: Failed to initialize libdc1394
DEBUG 2019-01-23 15:42:39,754 Starting up worker
2019-01-23 15:42:39 - INFO: decoder2: Creating decoder using conf: {'post-processor': "perl -npe 'BEGIN {use IO::Handle; STDOUT->autoflush(1);} s/(.*)/\\1./;'", 'logging': {'version': 1, 'root': {'level': 'DEBUG', 'handlers': ['console']}, 'formatters': {'simpleFormater': {'datefmt': '%Y-%m-%d %H:%M:%S', 'format': '%(asctime)s - %(levelname)7s: %(name)10s: %(message)s'}}, 'disable_existing_loggers': False, 'handlers': {'console': {'formatter': 'simpleFormater', 'class': 'logging.StreamHandler', 'level': 'DEBUG'}}}, 'use-vad': False, 'decoder': {'ivector-extraction-config': 'de_350k_nnet3chain_tdnn1f_1024_sp_bi/ivector_extractor/ivector_extractor.conf', 'lattice-beam': 5.0, 'acoustic-scale': 1.0, 'do-endpointing': True, 'beam': 5.0, 'mfcc-config': 'de_350k_nnet3chain_tdnn1f_1024_sp_bi/conf/mfcc_hires.conf', 'traceback-period-in-secs': 0.25, 'nnet-mode': 3, 'endpoint-silence-phones': '1:2:3:4:5:6', 'word-syms': 'de_350k_nnet3chain_tdnn1f_1024_sp_bi/words.txt', 'num-nbest': 10, 'frame-subsampling-factor': 3, 'phone-syms': 'de_350k_nnet3chain_tdnn1f_1024_sp_bi/phones.txt', 'max-active': 10000, 'fst': 'de_350k_nnet3chain_tdnn1f_1024_sp_bi/HCLG.fst', 'use-threaded-decoder': True, 'model': 'de_350k_nnet3chain_tdnn1f_1024_sp_bi/final.mdl', 'chunk-length-in-secs': 0.25}, 'silence-timeout': 15, 'out-dir': 'tmp', 'use-nnet2': True}
2019-01-23 15:42:39 - INFO: decoder2: Setting decoder property: nnet-mode = 3
2019-01-23 15:42:39 - INFO: decoder2: Setting decoder property: ivector-extraction-config = de_350k_nnet3chain_tdnn1f_1024_sp_bi/ivector_extractor/ivector_extractor.conf
2019-01-23 15:42:39 - INFO: decoder2: Setting decoder property: lattice-beam = 5.0
2019-01-23 15:42:39 - INFO: decoder2: Setting decoder property: acoustic-scale = 1.0
2019-01-23 15:42:39 - INFO: decoder2: Setting decoder property: do-endpointing = True
2019-01-23 15:42:39 - INFO: decoder2: Setting decoder property: beam = 5.0
2019-01-23 15:42:39 - INFO: decoder2: Setting decoder property: mfcc-config = de_350k_nnet3chain_tdnn1f_1024_sp_bi/conf/mfcc_hires.conf
2019-01-23 15:42:39 - INFO: decoder2: Setting decoder property: traceback-period-in-secs = 0.25
2019-01-23 15:42:39 - INFO: decoder2: Setting decoder property: endpoint-silence-phones = 1:2:3:4:5:6
2019-01-23 15:42:39 - INFO: decoder2: Setting decoder property: word-syms = de_350k_nnet3chain_tdnn1f_1024_sp_bi/words.txt
ERROR: SymbolTable::ReadText: Can't open file de_350k_nnet3chain_tdnn1f_1024_sp_bi/words.txt
2019-01-23 15:42:39 - INFO: decoder2: Setting decoder property: num-nbest = 10
2019-01-23 15:42:39 - INFO: decoder2: Setting decoder property: frame-subsampling-factor = 3
2019-01-23 15:42:39 - INFO: decoder2: Setting decoder property: phone-syms = de_350k_nnet3chain_tdnn1f_1024_sp_bi/phones.txt
ERROR: SymbolTable::ReadText: Can't open file de_350k_nnet3chain_tdnn1f_1024_sp_bi/phones.txt
2019-01-23 15:42:39 - INFO: decoder2: Setting decoder property: max-active = 10000
2019-01-23 15:42:39 - INFO: decoder2: Setting decoder property: chunk-length-in-secs = 0.25
2019-01-23 15:42:39 - INFO: decoder2: Setting decoder property: fst = de_350k_nnet3chain_tdnn1f_1024_sp_bi/HCLG.fst
ERROR ([5.4.176~1-be967]:Input():kaldi-io.cc:756) Error opening input stream de_350k_nnet3chain_tdnn1f_1024_sp_bi/HCLG.fst
[ Stack-Trace: ]
kaldi::MessageLogger::HandleMessage(kaldi::LogMessageEnvelope const&, char const*)
kaldi::MessageLogger::~MessageLogger()
kaldi::Input::Input(std::string const&, bool*)
fst::ReadFstKaldiGeneric(std::string, bool)
g_object_set_property
PyEval_EvalFrameEx
PyEval_EvalFrameEx
.
.
.
python() [0x4b988b]
PyEval_EvalFrameEx
PyEval_EvalFrameEx
PyEval_EvalCodeEx
python() [0x50160f]
PyRun_FileExFlags
PyRun_SimpleFileExFlags
Py_Main
__libc_start_main
python() [0x497b8b]
2019-01-23 15:42:39 - INFO: decoder2: Setting decoder property: model = de_350k_nnet3chain_tdnn1f_1024_sp_bi/final.mdl
ERROR ([5.4.176~1-be967]:Input():kaldi-io.cc:756) Error opening input stream de_350k_nnet3chain_tdnn1f_1024_sp_bi/final.mdl
[ Stack-Trace: ]
kaldi::MessageLogger::HandleMessage(kaldi::LogMessageEnvelope const&, char const*)
kaldi::MessageLogger::~MessageLogger()
kaldi::Input::Input(std::string const&, bool*)
g_object_set_property
PyEval_EvalFrameEx
PyEval_EvalFrameEx
python() [0x4e4518]
.
.
.
python() [0x4b988b]
PyEval_EvalFrameEx
PyEval_EvalFrameEx
PyEval_EvalCodeEx
python() [0x50160f]
PyRun_FileExFlags
PyRun_SimpleFileExFlags
Py_Main
__libc_start_main
python() [0x497b8b]
2019-01-23 15:42:39 - INFO: decoder2: Created GStreamer elements
2019-01-23 15:42:39 - DEBUG: decoder2: Adding <__main__.GstAppSrc object at 0x7f80e31c9370 (GstAppSrc at 0x13f79b0)> to the pipeline
2019-01-23 15:42:39 - DEBUG: decoder2: Adding <__main__.GstDecodeBin object at 0x7f80e31c9320 (GstDecodeBin at 0x13fc060)> to the pipeline
2019-01-23 15:42:39 - DEBUG: decoder2: Adding <__main__.GstAudioConvert object at 0x7f80e31c9410 (GstAudioConvert at 0x14058f0)> to the pipeline
2019-01-23 15:42:39 - DEBUG: decoder2: Adding <__main__.GstAudioResample object at 0x7f80e31c92d0 (GstAudioResample at 0x11f1f70)> to the pipeline
2019-01-23 15:42:39 - DEBUG: decoder2: Adding <__main__.GstTee object at 0x7f80e31c93c0 (GstTee at 0x1413000)> to the pipeline
2019-01-23 15:42:39 - DEBUG: decoder2: Adding <__main__.GstQueue object at 0x7f80e31c94b0 (GstQueue at 0x1416170)> to the pipeline
2019-01-23 15:42:39 - DEBUG: decoder2: Adding <__main__.GstFileSink object at 0x7f80e31c9500 (GstFileSink at 0x141a400)> to the pipeline
2019-01-23 15:42:39 - DEBUG: decoder2: Adding <__main__.GstQueue object at 0x7f80e31c9550 (GstQueue at 0x1416460)> to the pipeline
2019-01-23 15:42:39 - DEBUG: decoder2: Adding <__main__.Gstkaldinnet2onlinedecoder object at 0x7f80e31c95a0 (Gstkaldinnet2onlinedecoder at 0x1434150)> to the pipeline
2019-01-23 15:42:39 - DEBUG: decoder2: Adding <__main__.GstFakeSink object at 0x7f80e31c95f0 (GstFakeSink at 0x1252c10)> to the pipeline
2019-01-23 15:42:39 - INFO: decoder2: Linking GStreamer elements
ERROR ([5.4.176~1-be967]:ReadConfigFile():parse-options.cc:469) Cannot open config file: de_350k_nnet3chain_tdnn1f_1024_sp_bi/conf/mfcc_hires.conf
[ Stack-Trace: ]
kaldi::MessageLogger::HandleMessage(kaldi::LogMessageEnvelope const&, char const*)
kaldi::MessageLogger::~MessageLogger()
kaldi::ParseOptions::ReadConfigFile(std::string const&)
void kaldi::ReadConfigFromFile<kaldi::MfccOptions>(std::string, kaldi::MfccOptions*)
kaldi::OnlineNnet2FeaturePipelineInfo::OnlineNnet2FeaturePipelineInfo(kaldi::OnlineNnet2FeaturePipelineConfig const&)
gst_pad_query
gst_pad_query_caps
gst_element_get_compatible_pad
gst_element_link_pads_full
.
.
.
python() [0x4b988b]
PyEval_EvalFrameEx
PyEval_EvalFrameEx
PyEval_EvalCodeEx
python() [0x50160f]
PyRun_FileExFlags
PyRun_SimpleFileExFlags
Py_Main
__libc_start_main
python() [0x497b8b]
terminate called after throwing an instance of 'std::runtime_error'
what():
root@b14b54d5a69c:/opt#
It seems that the worker cannot open any of the files referenced in the yaml file, even though the files are present in the mentioned subfolder.
Any idea?
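One way to confirm this kind of failure is to verify, from inside the container, that every path in the config actually resolves from the worker's working directory (relative paths in the yaml are resolved against it). A minimal sketch, using a hypothetical stdlib-only helper that scans a config file for path-like tokens and reports the ones that don't exist:

```python
import os
import re

def check_config_paths(conf_file, base_dir="."):
    """Return path-like tokens in conf_file that do not resolve to an
    existing file relative to base_dir (relative paths behave as if the
    worker were started from base_dir)."""
    with open(conf_file) as f:
        text = f.read()
    missing = []
    # Match tokens containing at least one slash, e.g. "models/final.mdl".
    for token in re.findall(r"[\w.-]+(?:/[\w.-]+)+", text):
        path = token if os.path.isabs(token) else os.path.join(base_dir, token)
        if not os.path.isfile(path):
            missing.append(token)
    return missing
```

Running it with `base_dir` set to the directory the worker starts in (here `/opt`) should report exactly the files behind the "Can't open file" errors in the log.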
Got it working. It seems to be an issue with the paths.
I edited the yaml file and replaced all paths with absolute paths starting with /opt/models/.
After that I restarted everything and the worker comes up. I have just done my first German STT using this. Great!
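The edit described above would look roughly like this (a sketch, assuming the model directory is mounted at /opt/models inside the container; the keys and file names follow the conf dump in the worker log):

```yaml
decoder:
  model: /opt/models/de_350k_nnet3chain_tdnn1f_1024_sp_bi/final.mdl
  fst: /opt/models/de_350k_nnet3chain_tdnn1f_1024_sp_bi/HCLG.fst
  word-syms: /opt/models/de_350k_nnet3chain_tdnn1f_1024_sp_bi/words.txt
  phone-syms: /opt/models/de_350k_nnet3chain_tdnn1f_1024_sp_bi/phones.txt
  mfcc-config: /opt/models/de_350k_nnet3chain_tdnn1f_1024_sp_bi/conf/mfcc_hires.conf
  ivector-extraction-config: /opt/models/de_350k_nnet3chain_tdnn1f_1024_sp_bi/ivector_extractor/ivector_extractor.conf
```

With absolute paths, the worker no longer depends on the directory it happens to be started from.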
It is working, but I had to struggle with this issue here:
alumae/kaldi-gstreamer-server#168
Commenting out the post-processor line in the yaml file seems to solve that issue.
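For reference, that workaround amounts to commenting out the line below (value copied from the conf dump in the worker log at the top of this thread):

```yaml
# Disabled to work around alumae/kaldi-gstreamer-server#168:
# post-processor: "perl -npe 'BEGIN {use IO::Handle; STDOUT->autoflush(1);} s/(.*)/\\1./;'"
```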