Model loading
The LlmTabularModel is designed to work with Hugging Face (HF) models using an adapter checkpoint provided by Aindo. The model is loaded using the LlmTabularModel.load() method.
In the simplest use case, the user only needs to specify the path to the checkpoint file and provide a Hugging Face authentication token. If no local base model path is given, the module automatically downloads the required base model from the Hugging Face Hub (huggingface.co).
from aindo.rdml.synth.llm import LlmTabularModel
model = LlmTabularModel.load(
    ckpt_path="path/to/ckpt",
    auth_token="hf_...",
)
Note: Some models are gated, meaning access is restricted until you accept their license agreement on huggingface.co. Make sure to accept the license agreement for the model you want to use before loading it.
Authentication token as an environment variable
For security, it is recommended to set the Hugging Face authentication token as an environment variable instead of passing it directly in the code.
In a shell, define the HF_TOKEN environment variable:
export HF_TOKEN="hf_..."
Then, in any Python program launched from the same shell, the model can be loaded without providing the token:
from aindo.rdml.synth.llm import LlmTabularModel
model = LlmTabularModel.load(ckpt_path="path/to/ckpt")
This prevents exposing credentials in the codebase.
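When the token must be passed explicitly, it can still be read from the environment rather than hard-coded. A minimal sketch, assuming the HF_TOKEN convention above (the get_hf_token helper is ours, not part of the library):

```python
import os

def get_hf_token(var: str = "HF_TOKEN") -> str:
    """Return the Hugging Face token from the environment, failing early if unset."""
    token = os.environ.get(var)
    if not token:
        raise RuntimeError(f"{var} is not set; export it in your shell first.")
    return token
```

The returned value can then be passed as auth_token to LlmTabularModel.load(), which fails fast with a clear message when the variable is missing instead of triggering an authentication error later.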
Offline usage
If internet access is restricted, or if the user prefers to use a local copy of the base model, a local path to the base model can be passed via the model_path argument.
from aindo.rdml.synth.llm import LlmTabularModel
model = LlmTabularModel.load(
    ckpt_path="path/to/ckpt",  # Path to the adapter checkpoint file provided by Aindo
    model_path="path/to/model",  # Path to a local copy of the base model
)