1 answer
Hi,
Anthropic's Claude currently has an output limit of a few thousand tokens, but it can accept very large inputs (up to a 200-page book).
Claude 2 can take up to 100,000 tokens in each prompt, meaning it can work over hundreds of
pages of text, or even an entire book. Claude 2 can also write longer documents—on the order
of a few thousand tokens—compared to its prior version, giving you even greater ways to develop
generative AI applications using Amazon Bedrock.
To work around this output limit, your chatbot has to make a sequence of queries to Claude and then concatenate the answers before presenting them to the user: "Write documentation about feature X" + "Write documentation about feature Y" + etc.
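A minimal sketch of that sequence-of-queries approach, assuming one prompt per feature. The `invoke_model` function below is a hypothetical stand-in for a real Amazon Bedrock call (in practice you would use the `bedrock-runtime` client's `invoke_model` API via boto3):

```python
def invoke_model(prompt: str) -> str:
    # Hypothetical placeholder for a real Bedrock call, e.g.:
    #   bedrock = boto3.client("bedrock-runtime")
    #   bedrock.invoke_model(modelId="anthropic.claude-v2", body=...)
    return f"[generated text for: {prompt}]"

def generate_long_document(features: list[str]) -> str:
    # One query per feature keeps each individual answer within
    # Claude's few-thousand-token output limit; the concatenated
    # document can grow far beyond it.
    sections = [
        invoke_model(f"Write documentation about feature {f}")
        for f in features
    ]
    return "\n\n".join(sections)

doc = generate_long_document(["X", "Y", "Z"])
```

Each call stays under the output cap independently, so the total document length is bounded only by how many queries you are willing to make.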
Best,
Didier