• Small Language Models (SLMs) are the subset of neural language models small enough to be used, and even fine-tuned, on a single machine. Notable examples include BERT and its variants, as well as modestly sized sequence-to-sequence models such as BART and T5
  • SLMs can be easily fine-tuned and tailored to specific use cases with parameter-efficient methods such as LoRA and QLoRA, which train small adapter weights while keeping the pretrained weights frozen
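The low-rank adapter idea behind LoRA and QLoRA can be sketched in a few lines. This is a minimal NumPy illustration, not a real fine-tuning setup: the layer sizes, rank, and scaling value are hypothetical, and a frozen weight matrix W stands in for a pretrained layer. Only the two small matrices A and B would be trained.

```python
import numpy as np

# Hypothetical shapes for illustration only (not from a real model).
d_out, d_in, rank = 768, 768, 8
rng = np.random.default_rng(0)

W = rng.standard_normal((d_out, d_in))        # frozen pretrained weight
A = rng.standard_normal((rank, d_in)) * 0.01  # trainable down-projection
B = np.zeros((d_out, rank))                   # trainable up-projection, initialized to zero
alpha = 16                                    # LoRA scaling hyperparameter (assumed value)

def adapted_forward(x):
    # Adapted layer output: W x + (alpha / rank) * B (A x).
    # Because B starts at zero, the adapter is a no-op before training.
    return W @ x + (alpha / rank) * (B @ (A @ x))

# Trainable parameters shrink from d_out * d_in to rank * (d_in + d_out).
print(W.size)           # 589824 full parameters
print(A.size + B.size)  # 12288 adapter parameters
```

For these assumed shapes the adapter trains roughly 2% of the layer's parameters; QLoRA additionally stores the frozen base weights in 4-bit precision, which is what makes single-machine fine-tuning of larger models practical.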