This project is no longer actively maintained. While the code remains available for reference and use, no updates, bug fixes, or new features will be provided. Users are encouraged to seek alternative ...
Abstract: The Mixture of Experts (MoE) model is a promising approach for handling code-switching speech recognition (CS-ASR) tasks. However, the existing CS-ASR work on MoE has yet to leverage the ...
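Since the abstract names the Mixture of Experts (MoE) model as its core component, a minimal sketch of a generic MoE layer may help clarify the idea. This is an illustration only: the expert count, hidden sizes, and soft (dense) routing below are assumptions, not the configuration used in the paper's CS-ASR system.

```python
# Generic Mixture of Experts (MoE) layer sketch -- illustrative, not the paper's model.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    def __init__(self, d_model: int = 256, d_ff: int = 1024, num_experts: int = 4):
        super().__init__()
        # Each expert is an independent feed-forward block.
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_ff), nn.ReLU(), nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        ])
        # The router (gate) scores every expert for each frame.
        self.gate = nn.Linear(d_model, num_experts)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, d_model) encoder frames
        gate_probs = F.softmax(self.gate(x), dim=-1)                        # (B, T, E)
        expert_out = torch.stack([e(x) for e in self.experts], dim=-1)      # (B, T, D, E)
        # Soft mixture: weight each expert's output by its gate probability.
        return torch.einsum("btde,bte->btd", expert_out, gate_probs)

frames = torch.randn(2, 50, 256)   # dummy batch of acoustic encoder frames
out = MoELayer()(frames)
print(out.shape)                   # torch.Size([2, 50, 256])
```

In CS-ASR settings, experts are typically intended to specialize (e.g. per language), with the gate deciding per frame how to mix them; the dense softmax routing above is only one of several possible gating schemes.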