Cortex Inference
Returns the LLMs available for the current session
GET /api/v2/cortex/models
Returns the LLMs available for the current session
Response

| Code | Description |
| --- | --- |
| 200 | OK |
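A minimal sketch of calling this endpoint with Python's `requests` library. The account URL, token, and authentication header shown here are placeholders and assumptions, not part of the endpoint specification; substitute the authentication scheme configured for your account.

```python
import requests

# Assumed placeholders: replace with your account URL and a valid token.
ACCOUNT_URL = "https://<account_identifier>.snowflakecomputing.com"
TOKEN = "<bearer_token>"

resp = requests.get(
    f"{ACCOUNT_URL}/api/v2/cortex/models",
    headers={
        # Header names and values are assumptions about the auth scheme in use.
        "Authorization": f"Bearer {TOKEN}",
        "Accept": "application/json",
    },
    timeout=30,
)
resp.raise_for_status()  # expect 200 OK on success
print(resp.json())       # LLMs available for the current session
```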
Perform LLM text completion inference
POST /api/v2/cortex/inference:complete
Perform LLM text completion inference, similar to snowflake.cortex.Complete.
Response

| Code | Description |
| --- | --- |
| 200 | OK |
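A minimal sketch of a completion request, assuming the request body mirrors the arguments of snowflake.cortex.Complete (a model name plus a list of messages). The body fields, model name, and authentication headers below are assumptions for illustration; confirm them against the endpoint's request schema and response format.

```python
import requests

# Assumed placeholders: replace with your account URL and a valid token.
ACCOUNT_URL = "https://<account_identifier>.snowflakecomputing.com"
TOKEN = "<bearer_token>"

# Body fields (model, messages) are assumptions modeled on snowflake.cortex.Complete.
payload = {
    "model": "mistral-large",
    "messages": [{"content": "Summarize the benefits of columnar storage."}],
}

resp = requests.post(
    f"{ACCOUNT_URL}/api/v2/cortex/inference:complete",
    headers={
        # Header names and values are assumptions about the auth scheme in use.
        "Authorization": f"Bearer {TOKEN}",
        "Content-Type": "application/json",
        "Accept": "application/json",
    },
    json=payload,
    timeout=60,
)
resp.raise_for_status()  # expect 200 OK on success
print(resp.text)         # completion output returned by the service
```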