Popular repositories
german-moe-gpt-v8
Public · This repository contains the scripts and configurations for training the German 149.6M-parameter Mixture-of-Experts (MoE) language model.
Python
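The repository itself is not shown here, but to illustrate the kind of component such a Mixture-of-Experts language model is built around, below is a minimal sketch of a top-k gated MoE feed-forward layer in PyTorch. All class names, dimensions, and hyperparameters (e.g. `d_model`, `num_experts`, `top_k`) are illustrative assumptions and are not taken from the repository.

```python
# Minimal sketch of a top-k gated Mixture-of-Experts feed-forward layer (PyTorch).
# Names and hyperparameters are illustrative assumptions, not taken from german-moe-gpt-v8.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MoEFeedForward(nn.Module):
    """Routes each token to the top-k of num_experts small MLPs and mixes their outputs."""

    def __init__(self, d_model: int = 512, d_ff: int = 2048,
                 num_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(d_model, num_experts, bias=False)  # router logits per token
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, d_model) -> flatten tokens for per-token routing
        tokens = x.reshape(-1, x.size(-1))
        logits = self.gate(tokens)                           # (num_tokens, num_experts)
        weights, indices = logits.topk(self.top_k, dim=-1)   # choose top-k experts per token
        weights = F.softmax(weights, dim=-1)                 # normalize the selected gates

        out = torch.zeros_like(tokens)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = indices[:, slot] == e                 # tokens routed to expert e in this slot
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(tokens[mask])
        return out.reshape_as(x)


if __name__ == "__main__":
    layer = MoEFeedForward()
    y = layer(torch.randn(2, 16, 512))  # (batch=2, seq=16, d_model=512)
    print(y.shape)
```

In a transformer-style language model of this size, a layer like the above would typically replace the dense feed-forward block, so only the selected experts run per token while total parameter count grows with the number of experts.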