You're correct to question these figures. The statement about GPT-4 having "170 trillion parameters and a 45 GB training dataset" appears to be inaccurate based on available information.
OpenAI has not publicly disclosed GPT-4's parameter count; its technical report deliberately omits architecture details. Industry estimates do circulate, but they tend to fall in the range of 1-2 trillion parameters, so the 170 trillion figure is roughly two orders of magnitude too high.
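For a rough sense of why 170 trillion is implausible, here is a back-of-envelope sketch (an estimate, not an official specification; the parameter counts other than GPT-3's published 175 billion are assumptions) of the memory needed just to store the weights at fp16 precision:

```python
# Back-of-envelope check: memory needed just to store model weights in fp16
# (2 bytes per parameter). Parameter counts are illustrative estimates, not
# official OpenAI figures, except GPT-3's published 175 billion.
BYTES_PER_PARAM_FP16 = 2

def weight_storage_tb(num_params: float) -> float:
    """Terabytes of storage for the weights alone at fp16 precision."""
    return num_params * BYTES_PER_PARAM_FP16 / 1e12

for label, params in [
    ("Claimed: 170 trillion", 170e12),
    ("Common estimate: ~1.8 trillion", 1.8e12),
    ("GPT-3 (published): 175 billion", 175e9),
]:
    print(f"{label:32s} -> ~{weight_storage_tb(params):7.2f} TB of fp16 weights")
```

At 170 trillion parameters the weights alone would occupy about 340 TB, far outside the range of any model deployed today, versus roughly 3.6 TB for the commonly estimated 1-2 trillion range.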
Regarding the training dataset, 45 GB would be extremely small for a model of GPT-4's scale. For context, GPT-3, an earlier and smaller model, drew on roughly 45 TB of compressed plaintext from Common Crawl before filtering (about 570 GB after filtering), so the 45 GB claim may simply be a unit error for 45 TB.
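A similar sanity check applies to the dataset claim. Assuming roughly 4 bytes of raw text per token (a common rule of thumb, not a measured constant), 45 GB works out to only about 11 billion tokens, while GPT-3 alone was reported to train on roughly 300 billion tokens:

```python
# Rough token-count sanity check. BYTES_PER_TOKEN is an assumed average
# (~4 bytes of English text per BPE token), not a measured constant.
BYTES_PER_TOKEN = 4

def approx_tokens(dataset_bytes: float) -> float:
    """Approximate number of training tokens in a raw-text dataset."""
    return dataset_bytes / BYTES_PER_TOKEN

claimed_tokens = approx_tokens(45e9)   # the questionable "45 GB" figure
gpt3_tokens = 300e9                    # tokens GPT-3 reportedly trained on

print(f"45 GB of text  ~ {claimed_tokens / 1e9:,.0f} billion tokens")
print(f"GPT-3 training ~ {gpt3_tokens / 1e9:,.0f} billion tokens "
      f"({gpt3_tokens / claimed_tokens:.0f}x larger)")
```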
These figures should be reviewed for accuracy, as they appear to contain errors that could mislead readers about the technical specifications of GPT-4.
Sources
Community | A Network Engineers Guide to Generative AI
