Python module
lora_config
MAX LoRA configuration.
LoRAConfig
class max.pipelines.lib.lora_config.LoRAConfig(enable_lora: 'bool' = False, lora_paths: 'list[str]' = <factory>, max_lora_rank: 'int' = 16, max_num_loras: 'int' = 1, _config_file_section_name: 'str' = 'lora_config')
Parameters:
enable_lora
enable_lora: bool = False
Enables LoRA on the server.
help()
static help()
Returns a dictionary of this config class's options and their descriptions.
lora_paths
lora_paths: list[str]
List of statically defined LoRA paths.
max_lora_rank
max_lora_rank: int = 16
Maximum rank allowed for any LoRA adapter.
max_num_loras
max_num_loras: int = 1
The maximum number of active LoRAs in a batch.
This controls how many LoRA adapters can be active simultaneously during inference. Lower values reduce memory usage but limit concurrent adapter usage.
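The sketch below constructs a LoRAConfig and prints its documented options. It assumes the import path shown in the class signature above; the adapter paths are hypothetical placeholders, not real checkpoints.

```python
# A minimal sketch, assuming LoRAConfig is importable from the module
# documented above. The adapter paths below are hypothetical placeholders.
from max.pipelines.lib.lora_config import LoRAConfig

config = LoRAConfig(
    enable_lora=True,
    # Hypothetical adapter locations; point these at real LoRA weights.
    lora_paths=["/models/loras/adapter_a", "/models/loras/adapter_b"],
    max_lora_rank=16,  # must be >= the rank of every adapter in lora_paths
    max_num_loras=2,   # at most two adapters active in a single batch
)

# help() is a static method that returns a dict mapping each config
# option to its description.
for option, description in LoRAConfig.help().items():
    print(f"{option}: {description}")
```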