Introducing MPT-7B: A New Standard for Open-Source, Commercially Usable LLMs
The brain is not a computer! It doesn't perform operations. It doesn't optimize anything.
A huge day for open source! 🔥 You can now load models from @huggingface in 4-bit precision using the load_in_4bit flag and the bitsandbytes library, with no performance degradation.
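For context, this refers to 4-bit quantization support in Hugging Face transformers backed by bitsandbytes. A minimal sketch of what loading looks like, assuming recent versions of transformers, accelerate, and bitsandbytes are installed and that the model ID below is just an example:

```python
# Minimal sketch: load a Hugging Face causal LM in 4-bit precision.
# Assumes transformers, accelerate, and bitsandbytes are installed;
# "facebook/opt-350m" is only an example model ID.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "facebook/opt-350m"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    load_in_4bit=True,   # quantize weights to 4-bit at load time
    device_map="auto",   # place layers automatically (a GPU is required for 4-bit)
)

inputs = tokenizer("Hello, my name is", return_tensors="pt").to(model.device)
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=20)[0]))
```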
The next wave of website consumer class actions involving chatbots emerges from California
GitHub - jupediaz/chatgpt-prompt-splitter: ChatGPT PROMPTs Splitter. A tool for safely processing prompts in chunks of up to 15,000 characters per request.
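The idea behind such a splitter is straightforward; here is a rough sketch (not the repo's actual code) of breaking a long prompt into pieces under a character limit so each fits in a single ChatGPT message:

```python
# Rough sketch (not the repo's implementation): split a long prompt into
# chunks no larger than max_chars so each fits in one ChatGPT message.
def split_prompt(text: str, max_chars: int = 15000) -> list[str]:
    chunks = []
    while text:
        chunks.append(text[:max_chars])
        text = text[max_chars:]
    return chunks

parts = split_prompt("some very long prompt text " * 2000)
for i, part in enumerate(parts, 1):
    print(f"--- Part {i}/{len(parts)} ({len(part)} chars) ---")
```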
GitHub - Anil-matcha/Website-to-Chatbot: ChatGPT for every website. Instantly answer your visitors' questions with a personalized chatbot trained on your website content.
To de-risk AI, the government must accelerate knowledge production
New WizardLM model, now in 13B! Trained on 250k 'evolved instructions' derived from ShareGPT and reported to match or beat GPT-4 on multiple benchmarks (not all, of course :) )
Yann LeCun and Andrew Ng: Why the 6-month AI Pause is a Bad Idea