
T5-XXL Encoder

This repo contains copies of the T5-XXL text encoder in several quantization formats, intended for use in InvokeAI. A rough example of producing or loading each variant is sketched after the contents list.

Contents:

  • bfloat16/ - T5-XXL encoder cast to bfloat16. Copied from here.
  • bnb_llm_int8/ - T5-XXL encoder quantized using bitsandbytes LLM.int8() quantization.
  • optimum_quanto_qfloat8/ - T5-XXL encoder quantized using optimum-quanto qfloat8 quantization.
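The snippet below is a minimal sketch of how variants like these can be produced with 🤗 Transformers, bitsandbytes, and optimum-quanto. It is not the exact script used to create the files in this repo, and the source repo id `google/t5-v1_1-xxl` is an assumption; substitute the checkpoint you actually want to convert.

```python
# Minimal sketch -- assumed source checkpoint, not the script used for this repo.
import torch
from transformers import BitsAndBytesConfig, T5EncoderModel
from optimum.quanto import freeze, qfloat8, quantize

SOURCE = "google/t5-v1_1-xxl"  # assumption: original full-precision T5-XXL encoder

# bfloat16/ -- plain cast to bfloat16 on load.
encoder_bf16 = T5EncoderModel.from_pretrained(SOURCE, torch_dtype=torch.bfloat16)

# bnb_llm_int8/ -- bitsandbytes LLM.int8() quantization applied at load time.
encoder_int8 = T5EncoderModel.from_pretrained(
    SOURCE,
    quantization_config=BitsAndBytesConfig(load_in_8bit=True),
)

# optimum_quanto_qfloat8/ -- qfloat8 weight quantization with optimum-quanto.
encoder_qfloat8 = T5EncoderModel.from_pretrained(SOURCE, torch_dtype=torch.bfloat16)
quantize(encoder_qfloat8, weights=qfloat8)
freeze(encoder_qfloat8)
```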