config
optimus_dl.recipe.serve.config
ServeCommonConfig (dataclass)
ServeCommonConfig(checkpoint_path: str | None = None, model: Any = None, tokenizer: optimus_dl.modules.tokenizer.config.BaseTokenizerConfig = '???', device: str = 'auto')
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `checkpoint_path` | `str \| None` | Path to the model checkpoint | `None` |
| `model` | `Any` | Prebuilt model to serve (used instead of loading from a checkpoint) | `None` |
| `tokenizer` | `BaseTokenizerConfig` | Tokenizer configuration | `'???'` |
| `device` | `str` | Device to use (`cuda`, `cpu`, or `auto`) | `'auto'` |
Source code in optimus_dl/recipe/serve/config.py
ServeConfig (dataclass)
ServeConfig(serve: optimus_dl.recipe.serve.config.ServeRecipeConfig = &lt;dynamic&gt;, common: optimus_dl.recipe.serve.config.ServeCommonConfig = &lt;dynamic&gt;)
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `serve` | `ServeRecipeConfig` | `ServeRecipeConfig(port: int = 8000, host: str = '0.0.0.0')` | `<dynamic>` |
| `common` | `ServeCommonConfig` | `ServeCommonConfig(checkpoint_path: str \| None = None, model: Any = None, tokenizer: BaseTokenizerConfig = '???', device: str = 'auto')` | `<dynamic>` |
Source code in optimus_dl/recipe/serve/config.py
ServeRecipeConfig (dataclass)
ServeRecipeConfig(port: int = 8000, host: str = '0.0.0.0')
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `port` | `int` | Port to serve on | `8000` |
| `host` | `str` | Host to serve on | `'0.0.0.0'` |
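The defaults mean the server listens on all network interfaces (`0.0.0.0`) at port 8000. A small illustration, again using a hypothetical mirror of the documented dataclass rather than the library's own class:

```python
from dataclasses import dataclass


# Hypothetical mirror of ServeRecipeConfig as documented above.
@dataclass
class ServeRecipeConfig:
    port: int = 8000       # TCP port the server binds to
    host: str = "0.0.0.0"  # "0.0.0.0" = listen on all interfaces


# Override just the port; host keeps its default.
cfg = ServeRecipeConfig(port=9090)
bind = f"{cfg.host}:{cfg.port}"  # → "0.0.0.0:9090"
```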