# OSSS.ai.config.llm_config

## load_llm_config(path=None)

Load JSON config for LLM settings.

Precedence:

1. explicit `path` arg (if provided)
2. `OSSS_LLM_CONFIG_PATH` env var (if set)
3. `/llm.json` (default)

Returns a dict. If missing/invalid, returns `{}`. Caches the result in-module.
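The documented precedence and caching behavior can be sketched roughly as follows. This is a minimal illustration, not the actual OSSS implementation; the `_DEFAULT_PATH` and `_cache` names are assumptions made for the sketch.

```python
import json
import os
from pathlib import Path

_DEFAULT_PATH = "/llm.json"  # assumed name for the documented default location
_cache = None  # assumed name for the in-module cache


def load_llm_config(path=None):
    """Load JSON config for LLM settings, caching the result in-module."""
    global _cache
    if _cache is not None:
        return _cache
    # Precedence: explicit arg > OSSS_LLM_CONFIG_PATH env var > default path.
    candidate = path or os.environ.get("OSSS_LLM_CONFIG_PATH") or _DEFAULT_PATH
    try:
        _cache = json.loads(Path(candidate).read_text())
    except (OSError, json.JSONDecodeError):
        # Missing or unparsable file: fall back to an empty dict.
        _cache = {}
    return _cache
```

Because the result is cached in-module, later calls return the first loaded dict even if a different `path` is passed; clear the cache (e.g. in tests) if a fresh load is needed.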