Configuration management for ML Research Tools.
This module handles loading configuration from both a configuration file and command-line arguments. By default the configuration file is stored at ~/.config/ml_research_tools/config.yaml.
class ml_research_tools.core.config.LoggingConfig(level='INFO', file=None)[source]
Bases: object
Logging configuration.
- Parameters:
  level (str)
  file (str | None)
level: str = 'INFO'
file: Optional[str] = None
class ml_research_tools.core.config.RedisConfig(host='localhost', port=6379, db=0, password=None, ttl=604800, enabled=False, recache=False)[source]
Bases: object
Redis connection configuration.
- Parameters:
  host (str)
  port (int)
  db (int)
  password (str | None)
  ttl (int)
  enabled (bool)
  recache (bool)
host: str = 'localhost'
port: int = 6379
db: int = 0
password: Optional[str] = None
ttl: int = 604800
enabled: bool = False
recache: bool = False
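The fields above can be mirrored with a plain dataclass; the names and defaults come straight from the signature shown, but treat this as an illustrative sketch rather than the module's actual implementation. Note that the default ttl of 604800 seconds is seven days:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class RedisConfig:
    """Illustrative mirror of the RedisConfig signature above."""
    host: str = "localhost"
    port: int = 6379
    db: int = 0
    password: Optional[str] = None
    ttl: int = 604800        # cache lifetime in seconds: 604800 s = 7 days
    enabled: bool = False    # caching is off unless explicitly enabled
    recache: bool = False    # force refresh of cached entries

cfg = RedisConfig(host="cache.internal", enabled=True)
print(cfg.ttl // 86400)  # prints "7" (default TTL in days)
```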
class ml_research_tools.core.config.LLMConfig(base_url='https://api.openai.com/v1', model='gpt-3.5-turbo', max_tokens=None, temperature=0.01, top_p=1.0, retry_attempts=10, retry_delay=10, api_key=None, tier='standard')[source]
Bases: object
LLM (Language Model) API configuration.
- Parameters:
  base_url (str)
  model (str)
  max_tokens (int | None)
  temperature (float)
  top_p (float)
  retry_attempts (int)
  retry_delay (int)
  api_key (str | None)
  tier (str)
base_url: str = 'https://api.openai.com/v1'
model: str = 'gpt-3.5-turbo'
max_tokens: int | None = None
temperature: float = 0.01
top_p: float = 1.0
retry_attempts: int = 10
retry_delay: int = 10
api_key: Optional[str] = None
tier: str = 'standard'
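The retry_attempts and retry_delay fields suggest a simple retry loop around the API call. The helper below is a hypothetical sketch of that pattern, not the module's actual client code (call_with_retries and flaky are made-up names):

```python
import time

def call_with_retries(request, retry_attempts=10, retry_delay=10):
    """Hypothetical helper: retry a callable up to retry_attempts times,
    sleeping retry_delay seconds between failed attempts."""
    last_error = None
    for attempt in range(retry_attempts):
        try:
            return request()
        except Exception as exc:  # a real client would catch specific API errors
            last_error = exc
            if attempt < retry_attempts - 1:
                time.sleep(retry_delay)
    raise last_error

# Usage: a flaky call that succeeds on the third attempt.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient API error")
    return "ok"

print(call_with_retries(flaky, retry_attempts=5, retry_delay=0))  # prints "ok"
```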
class ml_research_tools.core.config.LLMPresets(default='standard', presets=<factory>)[source]
Bases: object
Collection of LLM configurations with presets and tiering.
- Parameters:
  default (str)
  presets (Dict[str, LLMConfig])
default: str = 'standard'
presets: Dict[str, LLMConfig]
__post_init__()[source]
Initialize with default presets if empty.
get_config(preset_name=None, tier=None)[source]
Get an LLM configuration by name or tier.
- Parameters:
  preset_name (Optional[str]) – Name of the preset to use (takes precedence over tier)
  tier (Optional[str]) – Tier of model to use (e.g., "standard", "premium")
- Return type:
  LLMConfig
- Returns:
  LLMConfig object
- Raises:
  ValueError – If no matching preset is found
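The lookup order described above (explicit preset first, then tier, then the default, with ValueError on a miss) can be sketched roughly as follows. The real method operates on LLMConfig objects, so this dict-based standalone version is a simplification:

```python
from typing import Dict, Optional

def get_config(presets: Dict[str, dict], default: str,
               preset_name: Optional[str] = None,
               tier: Optional[str] = None) -> dict:
    """Sketch of the lookup order: preset_name takes precedence over tier."""
    if preset_name is not None:
        if preset_name not in presets:
            raise ValueError(f"No preset named {preset_name!r}")
        return presets[preset_name]
    if tier is not None:
        # Fall back to the first preset whose tier matches.
        for config in presets.values():
            if config.get("tier") == tier:
                return config
        raise ValueError(f"No preset with tier {tier!r}")
    return presets[default]

presets = {
    "standard": {"model": "gpt-3.5-turbo", "tier": "standard"},
    "premium": {"model": "gpt-4", "tier": "premium"},
}
print(get_config(presets, "standard", tier="premium")["model"])  # prints "gpt-4"
```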
class ml_research_tools.core.config.Config(logging=<factory>, redis=<factory>, llm_presets=<factory>)[source]
Bases: object
Global application configuration.
- Parameters:
  logging (LoggingConfig)
  redis (RedisConfig)
  llm_presets (LLMPresets)
logging: LoggingConfig
redis: RedisConfig
llm_presets: LLMPresets
property llm: LLMConfig
Backward compatibility property to get the default LLM config.
classmethod from_dict(config_dict)[source]
Create a Config object from a dictionary.
- Parameters:
  config_dict (Dict[str, Any]) – Dictionary containing configuration values.
- Return type:
  Config
- Returns:
  Config object.
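A minimal sketch of the from_dict pattern, assuming each nested section of the dictionary is unpacked into the corresponding sub-config with defaults as fallback. Only the logging section is shown here; the real method would also handle redis and llm_presets:

```python
from dataclasses import dataclass, field
from typing import Any, Dict, Optional

@dataclass
class LoggingConfig:
    level: str = "INFO"
    file: Optional[str] = None

@dataclass
class Config:
    logging: LoggingConfig = field(default_factory=LoggingConfig)

    @classmethod
    def from_dict(cls, config_dict: Dict[str, Any]) -> "Config":
        # Unpack the nested "logging" section, falling back to defaults
        # when the section or a key is absent.
        return cls(logging=LoggingConfig(**config_dict.get("logging", {})))

cfg = Config.from_dict({"logging": {"level": "DEBUG"}})
print(cfg.logging.level)  # prints "DEBUG"
```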
ml_research_tools.core.config.load_config_file(config_file=None)[source]
Load configuration from file.
- Parameters:
  config_file (Optional[Path]) – Path to the configuration file. If None, uses the default.
- Return type:
  Dict[str, Any]
- Returns:
  Dictionary containing configuration values.
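A plausible sketch of load_config_file, assuming the default path from the module docstring and PyYAML for parsing. The real function's handling of a missing file may differ (here it returns an empty dict):

```python
from pathlib import Path
from typing import Any, Dict, Optional

# Default location, per the module docstring.
DEFAULT_CONFIG_PATH = Path("~/.config/ml_research_tools/config.yaml")

def load_config_file(config_file: Optional[Path] = None) -> Dict[str, Any]:
    """Resolve the path, then parse the YAML file into a dict."""
    path = (config_file or DEFAULT_CONFIG_PATH).expanduser()
    if not path.exists():
        return {}  # assumed fallback for a missing config file
    import yaml  # PyYAML, assumed dependency for a .yaml config
    with path.open() as fh:
        return yaml.safe_load(fh) or {}

print(load_config_file(Path("/nonexistent/config.yaml")))  # prints "{}"
```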
ml_research_tools.core.config.add_config_args(parser)[source]
Add configuration-related arguments to an argument parser.
- Parameters:
  parser (ArgumentParser) – Argument parser to add arguments to.
- Return type:
  ArgumentParser
- Returns:
  Updated argument parser.
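A hypothetical sketch of the add_config_args pattern with argparse. The actual option names the module registers are not documented here, so --config and --log-level are assumptions for illustration:

```python
import argparse

def add_config_args(parser: argparse.ArgumentParser) -> argparse.ArgumentParser:
    """Register configuration-related options (option names assumed)."""
    parser.add_argument("--config", type=str, default=None,
                        help="Path to the configuration file")
    parser.add_argument("--log-level", type=str, default=None,
                        help="Override the configured logging level")
    return parser

parser = add_config_args(argparse.ArgumentParser())
args = parser.parse_args(["--config", "custom.yaml", "--log-level", "DEBUG"])
print(args.config, args.log_level)  # prints "custom.yaml DEBUG"
```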
ml_research_tools.core.config.get_config(args=None)[source]
Get configuration from file and command-line arguments.
- Parameters:
  args (Optional[Namespace]) – Parsed command-line arguments. If None, only the config file is used.
- Return type:
  Tuple[Config, Path]
- Returns:
  Tuple of the Config object and the path of the configuration file that was used.
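Combining file values with command-line arguments implies a precedence rule; a common convention, assumed here, is that a CLI value, when actually given, wins over the file value. The merge_config helper below is a hypothetical sketch of that rule, not the module's implementation:

```python
from typing import Any, Dict

def merge_config(file_values: Dict[str, Any],
                 cli_overrides: Dict[str, Any]) -> Dict[str, Any]:
    """File values first; a CLI override replaces them only when set."""
    merged = dict(file_values)
    for key, value in cli_overrides.items():
        if value is not None:  # unset CLI flags leave the file value intact
            merged[key] = value
    return merged

file_values = {"log_level": "INFO", "redis_host": "localhost"}
cli_overrides = {"log_level": "DEBUG", "redis_host": None}
print(merge_config(file_values, cli_overrides))
# prints "{'log_level': 'DEBUG', 'redis_host': 'localhost'}"
```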