According to https://huggingface.co/bigscience/bloom, it has 70 layers, so that's not a perfect split, but even so it should still fit. - Source: 10 months ago
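The "not a perfect split" arithmetic in the comment above can be sketched in a few lines. This is a hypothetical illustration only: the function name `split_layers` and the device count of 8 are assumptions for the example, not something stated in the comment.

```python
# Hypothetical sketch: distributing BLOOM's 70 transformer layers across
# several devices as evenly as possible (8 devices is an assumed example count).

def split_layers(num_layers: int, num_devices: int) -> list[int]:
    """Return how many layers each device gets, as evenly as possible."""
    base, extra = divmod(num_layers, num_devices)
    # The first `extra` devices each take one additional layer.
    return [base + (1 if i < extra else 0) for i in range(num_devices)]

counts = split_layers(70, 8)
print(counts)  # → [9, 9, 9, 9, 9, 9, 8, 8]: not perfectly even, but close
```

Since 70 is not divisible by 8, some devices end up holding one more layer than others, which is what makes the split imperfect but still workable.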
It's notable that Hugging Face's BLOOM (https://huggingface.co/bigscience/bloom) might already be compliant (ignoring the 'member states' requirement, which I'm sure they could meet easily enough; it's about disclosing the EU member states where the model is on the market, so simply listing all EU countries in a doc somewhere may suffice). Some may be tempted to point at the lower scores of OpenAI, Google and... - Source: Hacker News / 10 months ago
I've been reading a lot of the latest posts on Hacker News, but I'm a little lost on where to actually start. I'd like to run something on my local MacBook Pro to play around with. Does anyone have a recommendation, like a top 5 list of links, who to follow, and what to play with? So far I've seen things like https://huggingface.co/bigscience/bloom. - Source: Hacker News / 11 months ago
They do, and they did. It's called BLOOM and it comes in sizes up to 176B parameters. See: https://huggingface.co/bigscience/bloom. - Source: Hacker News / 11 months ago
ChatGPT is a GPT-3.5 model according to the OpenAI docs, so it has 175 billion parameters. As for the architecture, people have said BLOOM is likely the nearest alternative to the GPT-3/3.5 models. - Source: 11 months ago
Do you know of an article comparing BLOOM to other products?
Suggest a link to a post with product alternatives.
This is an informative page about BLOOM. You can review and discuss the product here. The primary details have not been verified within the last quarter, and they might be outdated. If you think we are missing something, please use the means on this page to comment or suggest changes. All reviews and comments are highly encouraged and appreciated, as they help everyone in the community make an informed choice. Please always be kind and objective when evaluating a product and sharing your opinion.