AI (Artificial Intelligence) today is a Russian nesting doll of technologies that drive how AI is used, modified, and even self-generated. Yes, AI tools are actively used to generate better AI tools, a process driven by generative transformer models. A key technology is the LLM (Large Language Model), which is trained on huge corpora of billions of carefully curated records and in turn yields models with hundreds of billions, even trillions, of learned parameters. So AI is exactly that: billions of training records spawning trillions of parameters, wired together by one carefully crafted driver, the transformer architecture (a short code sketch after the list below makes this concrete). Before you get lost in the numbers, here are some must-view presentations that succinctly summarize the impact of generative AI on not just computing but living in general:
1) Large Language Models and The End of Programming – Harvard Lecture by Matt Walsh
2) Outrageous State of the Art of AI Directions – NVIDIA CEO Jensen Huang Leaves Everyone SPEECHLESS
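To make the corpus-to-parameters pipeline concrete, here is a minimal sketch, assuming the Hugging Face `transformers` library and PyTorch are installed. It loads GPT-2, a small, freely downloadable transformer, counts its learned parameters, and generates a few tokens; the model choice and prompt are illustrative, not anything prescribed above.

```python
# Minimal sketch: an LLM is, concretely, a file of learned parameters
# produced by training a transformer on a huge text corpus.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # a small, freely downloadable transformer model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Count the learned parameters: GPT-2 holds ~124 million; the frontier
# models discussed below scale this same structure into the billions.
n_params = sum(p.numel() for p in model.parameters())
print(f"{model_name} holds {n_params:,} learned parameters")

# Generate a continuation: the transformer's attention layers turn
# those parameters into next-token predictions.
inputs = tokenizer("Large language models are", return_tensors="pt")
outputs = model.generate(
    **inputs, max_new_tokens=20, pad_token_id=tokenizer.eos_token_id
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The same two calls, `from_pretrained` and `generate`, work for any causal model hosted on the Hugging Face hub; only the download size and hardware requirements change as the parameter counts grow.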
First, consider the fast-changing nature of the latest generative models, as shown in this Wikipedia report, which has some of the most current data on what's moving [and with great pace] in LLMs. However, just as WordPress leads the way with innovative page builders, there are 6 massive $3B tools which use LLMs in state-of-the-art AI development. But first, let's look at the top 6 LLMs (a hands-on sketch follows the list):
- GPT-3 and GPT-4 from OpenAI: GPT-3 has 175 billion parameters, while OpenAI has not disclosed GPT-4's size (it is widely reported to exceed a trillion). Both handle dozens of natural languages and a dozen or more code languages.
- LLaMA from Meta, with up to 65 billion parameters (70 billion for LLaMA 2), trained on text in roughly 20 natural languages plus code.
- PaLM 2 from Google: its size is undisclosed, though its predecessor PaLM had 540 billion parameters; it covers over 100 natural languages plus code languages.
- BERT from Google, with 340 million parameters (BERT-large); its multilingual variant covers 104 languages.
- BLOOM from BigScience, with 176 billion parameters, covering 46 natural languages and 13 code languages.
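As a quick hands-on check of the smallest entry on the list, here is a minimal sketch, again assuming the Hugging Face `transformers` library and PyTorch are installed. `bert-base-multilingual-cased` is the publicly hosted multilingual BERT checkpoint; the sample sentences are arbitrary.

```python
# Minimal sketch: load multilingual BERT and inspect its scale and
# language coverage.
from transformers import AutoModel, AutoTokenizer

model_name = "bert-base-multilingual-cased"  # trained on 104 languages
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

# Print the parameter count: the multilingual base model is far
# smaller than the billion-parameter models listed above.
print(f"parameters: {sum(p.numel() for p in model.parameters()):,}")

# One shared tokenizer handles text from any of its training languages.
for text in ["Hello world", "Bonjour le monde", "こんにちは世界"]:
    print(text, "->", tokenizer.tokenize(text))
```

Running this makes the scale gap on the list tangible: BERT downloads and runs on a laptop in seconds, while the frontier models above require clusters of accelerators just to hold their parameters in memory.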