RNNLM model supports lm_num_threads and lm_provider settings (#173)

* rnnlm model inference supports a num_threads setting

* rnnlm params: decouple lm_num_threads and lm_provider from the Transducer settings

* fix Python csrc bug: argument-handling problems in offline-lm-config.cc and online-lm-config.cc

* set default values for lm_num_threads and lm_provider

---------

Co-authored-by: cuidongcai1035 <cuidongcai1035@wezhuiyi.com>
Author: keanu
Date: 2023-06-12 15:51:27 +08:00
Committed by: GitHub
Parent: 13b33fcc08
Commit: 1a1b9fd236
18 changed files with 67 additions and 31 deletions


@@ -20,7 +20,7 @@ class OnlineRnnLM : public OnlineLM {
public:
~OnlineRnnLM() override;
-  explicit OnlineRnnLM(const OnlineRecognizerConfig &config);
+  explicit OnlineRnnLM(const OnlineLMConfig &config);
std::pair<Ort::Value, std::vector<Ort::Value>> GetInitStates() override;
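The hunk above narrows the RNN LM constructor from the full recognizer config to an LM-only config, which is how the commit decouples lm_num_threads and lm_provider from the Transducer's settings. A minimal sketch of that decoupling pattern, with hypothetical struct and field names (not the actual sherpa-onnx API), might look like:

```cpp
#include <cassert>
#include <cstdint>
#include <string>

// Hypothetical LM sub-config: the LM carries its own thread count and
// execution provider, with defaults, instead of reusing the transducer's.
struct LMConfig {
  std::string model;
  int32_t lm_num_threads = 1;       // assumed default, independent of the transducer
  std::string lm_provider = "cpu";  // assumed default execution provider
};

// Hypothetical recognizer config: transducer settings live here, and the
// LM settings are a separate nested struct rather than being derived from them.
struct RecognizerConfig {
  int32_t num_threads = 4;          // transducer inference threads
  std::string provider = "cpu";
  LMConfig lm;                      // LM no longer inherits num_threads/provider
};

// Mirrors the diff: the LM component only needs the LM sub-config,
// not the whole recognizer config.
int32_t LmThreads(const LMConfig &config) { return config.lm_num_threads; }
```

With this layout, changing the transducer's `num_threads` or `provider` no longer affects the LM, which keeps its own defaults until explicitly overridden.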