Bedrock Titan Model Streaming /invoke-with-response-stream


Same question as this post from another user, from 6 months ago: https://repost.aws/questions/QU9Rswl22UQ2a2YoADVsad8g/how-to-parse-response-for-invoke-with-response-stream-api-from-bedrock-when-calling-the-api-directly. I am calling the API directly to get a streaming response. How can I decode the output manually? I keep getting this error:

UnicodeDecodeError: 'utf-8' codec can't decode byte 0xb4 in position 9: invalid start byte

ane
Asked 2 months ago · 493 views
1 Answer

For Cohere, this post should help: https://repost.aws/questions/QU4x7qfzsnSZmGG4xVyvSx0Q/why-streaming-is-not-supported-for-titan-and-cohere-model-on-bedrock.

In the case of Titan, I am not sure whether a response stream is supported. In any case, here is an example of how a response stream can be parsed; I hope something like this helps:

        # Request body for the model (this example uses Claude-style parameters;
        # Titan text models use a different request/response schema).
        payload = {
            "prompt": prompt,
            "max_tokens_to_sample": 1000,
            "top_k": 50,
            "temperature": 0.1
        }

        # Invoke the model with a streaming response via the boto3 bedrock-runtime client.
        response_stream = self.bedrock_runtime.invoke_model_with_response_stream(
            accept="application/json",
            body=bytes(json.dumps(payload), "utf-8"),
            contentType="application/json",
            modelId=self.model_id,
        )

        # The body is an EventStream; iterate over events as they arrive.
        stream = response_stream.get('body')

        output = []
        if stream:
            for event in stream:
                chunk = event.get('chunk')
                if chunk:
                    # Each chunk's "bytes" field is UTF-8 encoded JSON from the model.
                    chunk_obj = json.loads(chunk.get('bytes').decode())
                    text = chunk_obj['completion']
                    output.append(text)

        complete_output = ''.join(output)
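
If the model is a Titan text model, the request and response keys differ from the Claude-style example above. Below is a minimal standalone sketch, assuming the Titan text schema ("inputText" and "textGenerationConfig" in the request, "outputText" in each streamed chunk) and the model ID "amazon.titan-text-express-v1"; both are assumptions, so check the schema documented for the exact model you are calling.

    import json

    import boto3

    # Assumed Titan text model ID; substitute the model you are actually invoking.
    MODEL_ID = "amazon.titan-text-express-v1"

    bedrock_runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

    # Titan text models take "inputText" plus a "textGenerationConfig" block.
    payload = {
        "inputText": "Explain event streams in one sentence.",
        "textGenerationConfig": {
            "maxTokenCount": 512,
            "temperature": 0.1,
            "topP": 0.9,
        },
    }

    response = bedrock_runtime.invoke_model_with_response_stream(
        modelId=MODEL_ID,
        body=json.dumps(payload),
        accept="application/json",
        contentType="application/json",
    )

    # Each streamed chunk is UTF-8 JSON; Titan puts the generated text in "outputText".
    pieces = []
    for event in response["body"]:
        chunk = event.get("chunk")
        if chunk:
            chunk_obj = json.loads(chunk["bytes"].decode("utf-8"))
            pieces.append(chunk_obj.get("outputText", ""))

    print("".join(pieces))

The parsing loop is the same as in the example above; only the request body and the JSON key read from each chunk change.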
EXPERT
Answered 2 months ago
EXPERT
Reviewed 2 months ago
  • Thanks for the reply; however, this is using the library. If you call the endpoint directly, the response isn't as expected (see the event-stream parsing sketch below).
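
On calling the endpoint directly: the HTTP response body is an event-stream payload (application/vnd.amazon.eventstream), not plain JSON, which is why decoding the raw bytes as UTF-8 fails with the UnicodeDecodeError above. Below is a minimal, best-effort sketch of decoding it without the SDK. The frame layout and the parse_event_stream helper are my assumptions rather than something taken from the service documentation (CRC checks are skipped), so treat it as a starting point.

    import base64
    import json
    import struct

    def parse_event_stream(raw: bytes):
        """Best-effort parser for application/vnd.amazon.eventstream bytes.

        Assumed frame layout (big-endian): 4-byte total length, 4-byte headers
        length, 4-byte prelude CRC, headers, JSON payload, 4-byte message CRC.
        CRCs are not verified here.
        """
        chunks = []
        offset = 0
        while offset + 12 <= len(raw):
            total_len, headers_len = struct.unpack_from(">II", raw, offset)
            payload_start = offset + 12 + headers_len
            payload_end = offset + total_len - 4  # drop the trailing message CRC
            payload = raw[payload_start:payload_end]
            try:
                event = json.loads(payload)
            except ValueError:
                event = {}
            # Chunk events wrap the model output as base64 in a "bytes" field.
            if "bytes" in event:
                chunks.append(json.loads(base64.b64decode(event["bytes"])))
            offset += total_len
        return chunks

    # Usage (assuming `resp` is a raw HTTP response object, e.g. from requests):
    # for chunk in parse_event_stream(resp.content):
    #     print(chunk.get("outputText", ""))  # Titan-style key; adjust per model

If adding botocore as a dependency is acceptable, its built-in event-stream deserialization (which the boto3 example in the answer relies on) is the simpler and more robust option.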
