# OSSS.ai.llm.utils

## `call_llm_text(llm, prompt)` (async)

Call an LLM wrapper that might accept either:

- `messages: List[{"role": "...", "content": "..."}]` (preferred)
- `prompt: str` (fallback)

Returns the raw provider response (whatever the wrapper returns).
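The source for this function is not shown, so the following is only a minimal sketch of how such a dual-interface call might work. It assumes the wrapper is a callable that accepts either a `messages=` or a `prompt=` keyword argument, and that it may be either sync or async; the fallback-on-`TypeError` strategy is an illustration, not the actual OSSS implementation.

```python
import inspect
from typing import Any, Dict, List


async def call_llm_text(llm: Any, prompt: str) -> Any:
    """Call an LLM wrapper, preferring a messages-based chat interface.

    Hypothetical sketch: assumes ``llm`` is callable with either
    ``messages=`` (preferred) or ``prompt=`` (fallback).
    """
    messages: List[Dict[str, str]] = [{"role": "user", "content": prompt}]
    try:
        # Preferred: chat-style messages interface.
        result = llm(messages=messages)
    except TypeError:
        # Fallback: plain prompt string.
        result = llm(prompt=prompt)
    # Support both sync wrappers and async wrappers transparently.
    if inspect.isawaitable(result):
        result = await result
    # Return the raw provider response, whatever shape it takes.
    return result
```

Usage would look like `await call_llm_text(my_llm, "Hello")`; a caller wanting only text would still need to unwrap the provider-specific response object itself.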