Redis caching utilities for ML Research Tools.

This module provides utilities for Redis caching, including a Redis cache manager class and functions for common caching operations.

ml_research_tools.cache.redis.create_redis_client(config)[source]#

Create and return a Redis client based on configuration.

Parameters:

config (RedisConfig) – Redis configuration from the Config object

Return type:

Optional[Redis]

Returns:

Redis client instance or None if disabled or connection failed
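
A minimal usage sketch (assuming, as in the RedisCache example below, that the Redis settings live under config.redis):

from ml_research_tools.cache.redis import create_redis_client
from ml_research_tools.config import get_config

config = get_config()
client = create_redis_client(config.redis)
if client is None:
    # Caching is disabled or the connection failed; proceed without caching.
    pass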

ml_research_tools.cache.redis.generate_cache_key(args=None, kwargs=None, prefix='')[source]#

Generate a unique cache key based on input parameters.

Parameters:
  • args (Any | None) – Positional arguments to include in the key generation

  • kwargs (dict[str, Any] | None) – Keyword arguments to include in the key generation

  • prefix (str) – Optional prefix for the key (e.g., function name)

Return type:

str

Returns:

A string key suitable for Redis
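
A minimal sketch, assuming args is passed as a tuple of positional arguments and kwargs as a dict, matching the signature above:

from ml_research_tools.cache.redis import generate_cache_key

key = generate_cache_key(
    args=(10, "linear"),
    kwargs={"epochs": 5},
    prefix="train_model",
)
# key is a string suitable for use as a Redis key.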

ml_research_tools.cache.redis.get_from_cache(redis_client, cache_key)[source]#

Retrieve data from Redis cache if available.

Parameters:
  • redis_client (Optional[Redis]) – Redis client instance or None

  • cache_key (str) – Unique cache key for the data

Return type:

Optional[bytes]

Returns:

Cached data as bytes or None if not found
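
A brief lookup sketch; the key and client here are illustrative, built with the helpers above:

from ml_research_tools.cache.redis import (
    create_redis_client,
    generate_cache_key,
    get_from_cache,
)
from ml_research_tools.config import get_config

client = create_redis_client(get_config().redis)
key = generate_cache_key(args=(42,), prefix="example")
raw = get_from_cache(client, key)
if raw is not None:
    # Cached data comes back as bytes; deserialize it as appropriate.
    pass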

ml_research_tools.cache.redis.save_to_cache(redis_client, cache_key, data, ttl)[source]#

Save data to Redis cache with the specified TTL.

Parameters:
  • redis_client (Optional[Redis]) – Redis client instance or None

  • cache_key (str) – Unique cache key for the data

  • data (bytes) – Data to cache (as bytes)

  • ttl (int) – Time-to-live in seconds (0 for no expiration)

Return type:

bool

Returns:

True if successfully cached, False otherwise
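
The corresponding write path, as a sketch; pickle is used here purely for illustration, since the function expects bytes:

import pickle

from ml_research_tools.cache.redis import (
    create_redis_client,
    generate_cache_key,
    save_to_cache,
)
from ml_research_tools.config import get_config

client = create_redis_client(get_config().redis)
key = generate_cache_key(args=(42,), prefix="example")
payload = pickle.dumps({"results": [1, 2, 3]})
ok = save_to_cache(client, key, payload, ttl=3600)  # ttl=0 would store without expiration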

class ml_research_tools.cache.redis.RedisCache(config)[source]#

Bases: object

Redis cache manager for ML Research Tools.

This class provides a simple interface for Redis caching operations, including serialization and deserialization of complex Python objects.

Example

from ml_research_tools.config import get_config
from ml_research_tools.cache import RedisCache

config = get_config()
cache = RedisCache(config.redis)

# Cache a Python object
data = {"results": [1, 2, 3]}
cache.set("my_key", data)

# Get it back
retrieved = cache.get("my_key")

Initialize Redis cache manager.

Parameters:

config (RedisConfig) – Redis configuration from Config object

__init__(config)[source]#

Initialize Redis cache manager.

Parameters:

config (RedisConfig) – Redis configuration from Config object

property enabled: bool#

Return whether caching is enabled.

property recache: bool#

Return whether recaching is enabled (existing cached values are ignored and regenerated).

get(key, default=None)[source]#

Get value from cache by key.

Parameters:
  • key (str) – Cache key

  • default (Optional[T]) – Default value to return if key not found

Return type:

Optional[T]

Returns:

Cached value or default
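
For illustration, a lookup that falls back to an empty list when the key is absent (cache constructed as in the class example above):

results = cache.get("my_key", default=[])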

set(key, value, ttl=None)[source]#

Set value in cache with optional TTL.

Parameters:
  • key (str) – Cache key

  • value (Any) – Value to cache (can be any pickle-serializable object)

  • ttl (Optional[int]) – Time-to-live in seconds (None uses config default)

Return type:

bool

Returns:

True if successfully cached, False otherwise
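
A short sketch; the TTL value here is arbitrary and overrides the configured default:

cache.set("my_key", {"results": [1, 2, 3]}, ttl=600)  # expire after 10 minutes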

delete(key)[source]#

Delete key from cache.

Parameters:

key (str) – Cache key to delete

Return type:

bool

Returns:

True if key was deleted, False otherwise
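
For example, removing the entry written above:

cache.delete("my_key")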

exists(key)[source]#

Check if key exists in cache.

Parameters:

key (str) – Cache key to check

Return type:

bool

Returns:

True if key exists, False otherwise
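
A typical check-then-set sketch; compute_results stands in for any user-defined function:

if not cache.exists("my_key"):
    cache.set("my_key", compute_results())  # compute_results is a hypothetical helper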

clear(pattern='*')[source]#

Clear cache keys matching pattern.

Parameters:

pattern (str) – Redis key pattern to match (default: all keys)

Return type:

bool

Returns:

True if successful, False otherwise
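
For example, assuming keys were written with an "expensive_computation" prefix (the exact key layout depends on how the keys were generated):

cache.clear(pattern="expensive_computation*")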

ml_research_tools.cache.redis.cached(prefix='', ttl=None, key_fn=None)[source]#

Decorator to cache function results in Redis.

Parameters:
  • prefix (str) – Prefix for cache keys

  • ttl (Optional[int]) – Time-to-live in seconds (None uses config default)

  • key_fn (Optional[Callable[..., str]]) – Custom function to generate cache key (if None, uses generate_cache_key)

Return type:

Callable[[Callable[..., R]], Callable[..., R]]

Returns:

Decorator function

Example

from ml_research_tools.cache.redis import cached
from ml_research_tools.config import get_config

config = get_config()

@cached(prefix="expensive_computation", ttl=3600)
def expensive_computation(a, b, c):
    # ... some expensive calculation; a placeholder is shown so the example runs
    result = a * b + c
    return result
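
A further sketch showing a custom key_fn, assuming it is called with the wrapped function's arguments; my_key_fn is hypothetical:

def my_key_fn(dataset, *args, **kwargs):
    # Build a key from the first positional argument only.
    return f"expensive_computation:{dataset}"

@cached(prefix="expensive_computation", key_fn=my_key_fn)
def summarize(dataset, epochs=10):
    return {"dataset": dataset, "epochs": epochs}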