Making LLMs lighter with AutoGPTQ and transformers

huggingface.co

Hugging Face transformers now officially supports AutoGPTQ. This is a pretty huge deal and signals much wider adoption of quantized models, which is great for everyone!
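
For anyone curious what this looks like in practice, here's a rough sketch of pulling a pre-quantized GPTQ checkpoint straight through transformers. The model ID is just an example repo from the Hub, and you'll also need the auto-gptq and optimum packages installed alongside transformers:

```python
# Minimal sketch: loading a pre-quantized GPTQ model via transformers.
# Assumes: pip install transformers auto-gptq optimum
# The model ID below is only an example GPTQ checkpoint from the Hub.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TheBloke/Llama-2-7B-Chat-GPTQ"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Generation works as usual -- the quantization is transparent to the rest of the API.
inputs = tokenizer("Quantization makes LLMs", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

If I'm reading the announcement right, you can also quantize your own model by passing a GPTQConfig (e.g. bits=4 plus a calibration dataset) as quantization_config to from_pretrained, rather than only consuming checkpoints others have quantized.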