Bloomberg & JHU’s BloombergGPT: ‘A Best-in-Class LLM for Financial NLP’
In the new paper BloombergGPT: A Large Language Model for Finance, a research team from Bloomberg and Johns Hopkins University presents BloombergGPT, a 50-billion-parameter language model trained on a 700-billion-token dataset that significantly outperforms existing benchmark models on financial NLP tasks.