What do I do when I notice AWS Bedrock Anthropic models getting cut off before completing?

I have noticed over the last two days that the Anthropic models (all three) are getting cut off and not returning the entire completion. This happens both via the API and in the Playground.

Here is an example of the same prompt getting cut off in the AWS Bedrock Playground and via an AWS Bedrock API call, compared with the same prompt in the Anthropic Console, which returns the entire completion.

AWS Bedrock Playground (cut off): [screenshot]

AWS Bedrock API call (cut off): [screenshot]

Official Anthropic Claude Console (works): [screenshot]

Asked 8 months ago · 550 views
2 Answers

Hi,

Did you set max tokens to its maximum (2048) when calling via the API? The default value is lower (300, if I recall correctly).
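For reference, here is a minimal sketch of setting that limit explicitly in the request body, assuming the Claude v2 text-completion format on Bedrock via boto3 (the model ID, region, and prompt are placeholders):

```python
import json

import boto3

# Bedrock runtime client; the region is an assumption, adjust to yours.
bedrock_runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

# Claude text-completion request body: set max_tokens_to_sample explicitly
# instead of relying on the lower default, so long completions are not truncated.
body = json.dumps({
    "prompt": "\n\nHuman: Summarize the attached report in detail.\n\nAssistant:",
    "max_tokens_to_sample": 2048,
    "temperature": 0.5,
    "stop_sequences": ["\n\nHuman:"],
})

response = bedrock_runtime.invoke_model(
    modelId="anthropic.claude-v2",
    body=body,
)

result = json.loads(response["body"].read())
print(result["completion"])
# "stop_sequence" means the model finished naturally; "max_tokens" means it was cut off.
print("stop_reason:", result.get("stop_reason"))
```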

Best

Didier

AWS
Expert
Answered 8 months ago

Hi Didier,

Thank you for the response. I am not noticing any cutoffs anymore. I wish I had written down word counts so that we had a decent approximation of the token counts, because you raise a great point. I don't believe we were hitting the limits (I have other examples that fall well below the 300-token mark), but the good news is that the Playground is working well in all of my tests today. In the latter example, my code was also not receiving the last streaming chunk, which contained the stop sequence; that is fixed as well.
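For anyone hitting the same streaming problem, here is a rough sketch of reading the response stream to exhaustion so the final chunk (the one carrying the stop reason) is not dropped, again assuming the Claude v2 text-completion format via boto3:

```python
import json

import boto3

bedrock_runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

body = json.dumps({
    "prompt": "\n\nHuman: Write a long product description for a hiking backpack.\n\nAssistant:",
    "max_tokens_to_sample": 2048,
})

response = bedrock_runtime.invoke_model_with_response_stream(
    modelId="anthropic.claude-v2",
    body=body,
)

completion_parts = []
# Iterate the event stream all the way to the end; stopping early drops the
# last chunk, which is the one that contains the stop_reason.
for event in response["body"]:
    if "chunk" not in event:
        continue
    chunk = json.loads(event["chunk"]["bytes"])
    completion_parts.append(chunk.get("completion", ""))
    if chunk.get("stop_reason"):
        print("stop_reason:", chunk["stop_reason"])

print("".join(completion_parts))
```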

Thanks again.

Answered 8 months ago
  • If it's helpful, a good rule of thumb is that 3 words equal about 4 tokens. So the formulas you can use are (a short sketch of both estimates follows these comments):

    token count = word count * 1.33
    word count = token count * 0.75

  • Hello there!

    I'm facing the same issue with the AI21 Labs Jurassic-2 Ultra model. Can you help me out? How did you solve this problem, and how do you use the stop sequences?

    Best, Hasan.
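Regarding the rule of thumb in the first comment, here is a tiny illustration of both estimates (the helper names are made up for the example):

```python
def estimate_tokens(word_count: int) -> int:
    """Rough token estimate: about 4 tokens for every 3 words."""
    return round(word_count * 4 / 3)

def estimate_words(token_count: int) -> int:
    """Rough word estimate: about 3 words for every 4 tokens."""
    return round(token_count * 3 / 4)

# A ~225-word completion is roughly 300 tokens, i.e. right at the old default limit,
# while max_tokens_to_sample=2048 allows roughly 1,536 words.
print(estimate_tokens(225))   # ~300
print(estimate_words(2048))   # ~1536
```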
