Sugoi Toolkit and Sugoi 14B/32B
Edit User-Settings.json to point to your local instance (LM Studio: localhost:1234, oobabooga: localhost:5000, or Ollama: localhost:11434).
lm_studio pathway + model name (e.g., lm_studio/sugoi14b), server http://127.0.0.1:1234/v1, and key sk-any-random-key.

oobabooga pathway + model name (e.g., oobabooga/sugoi14b), server http://127.0.0.1:5000, without any key.

ollama pathway + model name (e.g., ollama/sugoi14b), server http://127.0.0.1:11434, without any key.

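These local servers typically expose an OpenAI-compatible HTTP API, which is what the Sugoi Translation-API-Server talks to. Below is a minimal sketch (not part of the toolkit) for smoke-testing the backend from Python before wiring it into User-Settings.json; the base URL, the model id sugoi14b, and the placeholder key are assumptions taken from the LM Studio example above, so adjust them for oobabooga or Ollama.

```python
"""Smoke-test the local backend before pointing Sugoi at it (sketch only).

Assumptions: LM Studio is serving at http://127.0.0.1:1234/v1, the loaded model
is exposed under an id such as "sugoi14b", and a placeholder API key is accepted
(local servers generally do not validate it). Adjust for oobabooga or Ollama.
"""
import requests

BASE_URL = "http://127.0.0.1:1234/v1"   # LM Studio default from the settings above
API_KEY = "sk-any-random-key"           # placeholder key, not validated locally

headers = {"Authorization": f"Bearer {API_KEY}"}

# 1) Confirm the server is reachable and list the model ids it exposes.
models = requests.get(f"{BASE_URL}/models", headers=headers, timeout=10).json()
print("models:", [m["id"] for m in models.get("data", [])])

# 2) Minimal chat-completion request against that model.
resp = requests.post(
    f"{BASE_URL}/chat/completions",
    headers=headers,
    json={
        "model": "sugoi14b",  # assumption: use the id printed above
        "messages": [{"role": "user", "content": "こんにちは、世界。"}],
        "temperature": 0.1,
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```

If this round trip works, the same server URL, model name, and key go into User-Settings.json as described above.
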
Note: Sugoi uses an internal system prompt that is optimized for cloud LLMs (Translation-API-Server/LLM/Translator.py), so accuracy may drop with offline models.

Run Sugoi Toolkit
- Open Sugoi Toolkit and click on Sugoi Translator LLM

- Wait until the server is ready (an optional readiness check sketch follows this list)

- Copy your text

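If the toolkit seems to hang at the "wait until the server is ready" step, it can help to confirm that the LLM backend itself is up. The sketch below only polls the LM Studio endpoint from the configuration above (an assumption; swap the URL for oobabooga or Ollama); it does not talk to Sugoi's own Translation-API-Server, whose port is not covered here.

```python
"""Optional readiness check for the local LLM backend (sketch, not part of Sugoi).

Assumption: the backend is LM Studio at http://127.0.0.1:1234/v1; swap the URL
for oobabooga (port 5000) or Ollama (port 11434) if that is what you configured.
"""
import time

import requests

BACKEND_MODELS_URL = "http://127.0.0.1:1234/v1/models"  # assumed LM Studio backend


def wait_for_backend(url: str, timeout_s: float = 120.0) -> bool:
    """Poll the endpoint every 2 s; return True once it answers with HTTP 200."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        try:
            if requests.get(url, timeout=5).status_code == 200:
                return True
        except requests.ConnectionError:
            pass  # server not up yet, keep waiting
        time.sleep(2)
    return False


if __name__ == "__main__":
    ok = wait_for_backend(BACKEND_MODELS_URL)
    print("backend ready" if ok else "backend not reachable, check your local server")
```
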
Note: Streaming output is currently not supported by the Sugoi Japanese Translator frontend.