Bedrock Titan Model Streaming /invoke-with-response-stream


Same question as this post from 6 months ago from another user: https://repost.aws/questions/QU9Rswl22UQ2a2YoADVsad8g/how-to-parse-response-for-invoke-with-response-stream-api-from-bedrock-when-calling-the-api-directly I am making the API call directly to get a streaming response. How can I decode the output manually? I keep getting this error:

UnicodeDecodeError: 'utf-8' codec can't decode byte 0xb4 in position 9: invalid start byte

ane
Asked 2 months ago · 490 views
1 Answer

For Cohere this post would help: https://repost.aws/questions/QU4x7qfzsnSZmGG4xVyvSx0Q/why-streaming-is-not-supported-for-titan-and-cohere-model-on-bedrock.

In the case of Titan I am not sure whether response streaming is supported. In any case, below is an example of how a response stream can be parsed with boto3; I hope something like this helps.

        # Request body. Note: these keys (prompt, max_tokens_to_sample, top_k)
        # follow the Anthropic Claude schema; Titan expects a different body
        # and returns different chunk keys.
        payload = {
            "prompt": prompt,
            "max_tokens_to_sample": 1000,
            "top_k": 50,
            "temperature": 0.1
        }

        response_stream = self.bedrock_runtime.invoke_model_with_response_stream(
            accept="application/json",
            body=json.dumps(payload).encode("utf-8"),
            contentType="application/json",
            modelId=self.model_id,
        )

        # The response body is an event stream; each event wraps one JSON chunk.
        stream = response_stream.get('body')

        output = []
        if stream:
            for event in stream:
                chunk = event.get('chunk')
                if chunk:
                    chunk_obj = json.loads(chunk.get('bytes').decode())
                    text = chunk_obj['completion']  # 'outputText' for Titan models
                    output.append(text)

        complete_output = ''.join(output)
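
If Titan streaming is available in your region, the same loop should work; only the request body and the chunk key change. A rough standalone sketch, assuming the amazon.titan-text-express-v1 model ID and the documented Titan inputText / textGenerationConfig / outputText keys:

    import json
    import boto3

    def stream_titan(prompt: str, region: str = "us-east-1") -> str:
        bedrock_runtime = boto3.client("bedrock-runtime", region_name=region)
        # Titan-style request body (keys assumed from the Titan text model schema)
        payload = {
            "inputText": prompt,
            "textGenerationConfig": {
                "maxTokenCount": 1000,
                "temperature": 0.1,
                "topP": 0.9
            },
        }
        response = bedrock_runtime.invoke_model_with_response_stream(
            modelId="amazon.titan-text-express-v1",   # assumed model ID
            accept="application/json",
            contentType="application/json",
            body=json.dumps(payload).encode("utf-8"),
        )
        pieces = []
        for event in response["body"]:
            chunk = event.get("chunk")
            if chunk:
                chunk_obj = json.loads(chunk["bytes"].decode("utf-8"))
                # Titan streaming chunks carry the text under 'outputText'
                pieces.append(chunk_obj.get("outputText", ""))
        return "".join(pieces)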
EXPERT
Answered 2 months ago
EXPERT
Reviewed 2 months ago
  • Thanks for the reply; however, this is using the library. If you call the endpoint directly, the response isn't as expected.
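
If the endpoint is called directly (a SigV4-signed HTTP request rather than the SDK), the response body is not plain JSON but application/vnd.amazon.eventstream framing, which is why a straight UTF-8 decode raises UnicodeDecodeError. A minimal sketch of parsing that framing, assuming botocore is installed (used here only for its frame parser) and that each event payload is JSON carrying a base64-encoded "bytes" field, as described in the linked thread:

    import base64
    import json

    from botocore.eventstream import EventStreamBuffer  # frame parser only, no API call

    def decode_raw_stream(raw_body: bytes) -> str:
        """Parse a raw application/vnd.amazon.eventstream response body.

        raw_body is assumed to hold the bytes returned by a direct (signed)
        HTTP call to /invoke-with-response-stream.
        """
        buffer = EventStreamBuffer()
        buffer.add_data(raw_body)

        pieces = []
        for message in buffer:  # each message is one framed event
            payload = json.loads(message.payload.decode("utf-8"))
            # The model chunk is assumed to be base64-encoded inside the "bytes" field.
            chunk = json.loads(base64.b64decode(payload["bytes"]))
            # Chunk key depends on the model: 'outputText' for Titan, 'completion' for Claude.
            pieces.append(chunk.get("outputText") or chunk.get("completion", ""))
        return "".join(pieces)

Using the boto3 client as in the answer above avoids hand-parsing the framing entirely, since the SDK decodes the event stream and the chunk bytes for you.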
