
Questions tagged with AWS Elemental MediaLive


Low Latency Live streaming with CMAF

Hello,

We are trying to achieve low-latency live HLS streaming. Our current setup is fairly common:

1. On-premises OBS Studio, with encoding tuned for zero latency
2. Upstream via RTMP push to AWS Elemental MediaLive
3. MediaLive output group to MediaStore, repeating the same zero-latency encoding settings
4. MediaStore
5. CloudFront CDN with MediaStore hooked up as the origin
6. HLS.js player tuned to minimize the playback buffer (a tuning sketch follows below)

There are great articles from Amazon experts on how low/ultra-low latency can be achieved. This is the one we used to tune our current setup: https://aws.amazon.com/blogs/media/part-3-how-to-compete-with-broadcast-latency-using-current-adaptive-bitrate-technologies/

Here is another great article we found: https://aws.amazon.com/blogs/media/part-2-getting-started-with-ultra-low-latency-using-aws-elemental-live/

As far as we can see, the Common Media Application Format (CMAF) might help achieve ultra-low latency, since it makes it possible to operate on smaller addressable pieces of media (chunks/fragments) rather than traditional segments, which allows media to move through the streaming pipeline earlier. That solution is based on an on-premises AWS Elemental Live encoder, which we currently don't have. Our understanding is that it's quite expensive and we probably can't afford it :(

Besides the ordinary HLS Segment Length setting (BTW, we use 1 second as of now), Elemental Live also provides some interesting settings:

- Chunked Encoding = true
- Chunk Length = 200 ms
- Fragment Length = 1 sec
- Chunked Transfer = true

There are no such "advanced" settings in AWS MediaLive or AWS MediaPackage, but we see that:

1. In AWS MediaLive, the HLS container option for a channel output can be set to "fMP4 HLS": "Fmp4 hls – Choose this type of container if you want to package the streams (encodes) as fragmented MP4." https://docs.aws.amazon.com/medialive/latest/ug/hls-container.html
2. In AWS MediaPackage, there is a Common Media Application Format endpoint type: https://docs.aws.amazon.com/mediapackage/latest/ug/endpoints-cmaf.html?icmpid=docs_elemental_mediapackage_helppanel

We tried to configure both (1) fMP4 HLS as the container for output to AWS MediaStore and (2) a CMAF endpoint for MediaPackage (sketches of both follow below), but did not see any difference in the resulting end-to-end latency.

The question is: does it make sense to keep experimenting with these configuration options to achieve lower latency? Since the settings mentioned above don't exist in AWS MediaLive or AWS MediaPackage, our feeling is that we went the wrong way. But it's always better to ask, so I'm asking :)

Thanks for reading, and thanks in advance for your help,
Igor

Edited by: ibotov on Apr 6, 2021 7:48 AM
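For reference, the player-side tuning looks roughly like this. It is a minimal sketch assuming hls.js v1.x; the values are illustrative, and the CloudFront URL is a placeholder.

```
import Hls from "hls.js";

// Keep the player close to the live edge with a small forward buffer.
// All values are illustrative and should be tuned against the real stream.
const hls = new Hls({
  lowLatencyMode: true,       // fetch partial segments when the playlist advertises them
  liveSyncDuration: 2,        // target distance from the live edge, in seconds
  liveMaxLatencyDuration: 6,  // if we drift further back than this, jump forward
  maxBufferLength: 4,         // short forward buffer
  backBufferLength: 10,       // keep little behind the playhead
});

hls.loadSource("https://dxxxxxxxxxxxxxx.cloudfront.net/live/index.m3u8"); // placeholder URL
hls.attachMedia(document.querySelector("video") as HTMLVideoElement);
```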
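For option (1), the container switch amounts to replacing standardHlsSettings with fmp4HlsSettings inside the output's hlsSettings. Here is a sketch of the relevant fragment of the channel's encoder settings, written as a TypeScript object literal with field names as in the MediaLive API; the nameModifier value is illustrative:

```
// Fragment of outputs[].outputSettings.hlsOutputSettings in a MediaLive channel.
// The fMP4 container is selected by using fmp4HlsSettings in place of
// standardHlsSettings (which produces MPEG-TS segments).
const hlsOutputSettings = {
  nameModifier: "_fmp4", // illustrative
  hlsSettings: {
    fmp4HlsSettings: {
      audioRenditionSets: "program_audio",
    },
  },
};
```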
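And for option (2), creating the CMAF endpoint programmatically looks roughly like the sketch below (AWS SDK for JavaScript v3). The channel and endpoint IDs are placeholders, and the 1-second segment duration mirrors the HLS segment length we use elsewhere in the pipeline:

```
import {
  MediaPackageClient,
  CreateOriginEndpointCommand,
} from "@aws-sdk/client-mediapackage";

const client = new MediaPackageClient({ region: "us-east-1" });

// ChannelId and Id are placeholders for an existing MediaPackage channel.
await client.send(
  new CreateOriginEndpointCommand({
    ChannelId: "live-channel",
    Id: "live-cmaf",
    CmafPackage: {
      SegmentDurationSeconds: 1,
      HlsManifests: [{ Id: "hls", PlaylistWindowSeconds: 60 }],
    },
  })
);
```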
8 answers · 0 votes · 2 views · ibotov · asked 9 months ago

Object does not match "h265" error when creating a MediaLive channel

Hi,

I'm trying to create a MediaLive channel with these input specifications:

- Input codec: HEVC
- Input resolution: UHD
- Max input bitrate: MAX_20_MBPS

The channel's input is RTMP, and it outputs HLS to a CDN. I have only one output group, whose codec settings are H265. When I try to save the channel I get the error:

videoDescriptions[0].codecSettings Object does not match "h265"

I've checked the settings and the actual JSON that's being sent a few times, and I can't seem to find anything wrong. Could this be an issue with the validation, or is there something else that I'm missing? Here's the JSON that's being sent and fails with the validation error:

```
{
  "name": "RTMPtoHLS4KHEVC",
  "inputAttachments": [{ "inputAttachmentName": "RTMP Push", "inputId": "3754057", "inputSettings": { "sourceEndBehavior": "CONTINUE", "inputFilter": "AUTO", "filterStrength": 1, "deblockFilter": "DISABLED", "denoiseFilter": "DISABLED" } }],
  "inputSpecification": { "codec": "HEVC", "resolution": "UHD", "maximumBitrate": "MAX_20_MBPS" },
  "destinations": [{ "settings": [{ "url": "https://post.ioio-i.akamaihd.net/814841/testStream01_2/4ktesting/master" }], "id": "3nbjai" }],
  "encoderSettings": {
    "audioDescriptions": [{ "name": "2160p_audio_settings", "codecSettingsChoice": "AAC", "languageCodeControl": "FOLLOW_INPUT", "audioTypeControl": "FOLLOW_INPUT", "codecSettings": { "aacSettings": { "codingMode": "CODING_MODE_2_0", "profile": "LC", "bitrate": "96000", "sampleRate": 48000, "inputType": "NORMAL", "rawFormat": "NONE", "spec": "MPEG4", "rateControlMode": "CBR" } } }],
    "outputGroups": [{ "name": "HLS", "outputGroupSettingsChoice": "hls_group", "outputGroupSettings": { "hlsGroupSettings": { "inputLossAction": "EMIT_OUTPUT", "directoryStructure": "SINGLE_DIRECTORY", "segmentsPerSubdirectory": 10000, "outputSelection": "MANIFESTS_AND_SEGMENTS", "mode": "LIVE", "tsFileMode": "SEGMENTED_FILES", "streamInfResolution": "INCLUDE", "manifestDurationFormat": "FLOATING_POINT", "segmentLength": 6, "indexNSegments": 10, "keepSegments": 21, "segmentationMode": "USE_SEGMENT_DURATION", "iFrameOnlyPlaylists": "DISABLED", "programDateTime": "EXCLUDE", "programDateTimePeriod": 600, "clientCache": "ENABLED", "codecSpecification": "RFC_6381", "manifestCompression": "NONE", "redundantManifest": "DISABLED", "ivInManifest": "INCLUDE", "ivSource": "FOLLOWS_SEGMENT_NUMBER", "captionLanguageSetting": "OMIT", "timedMetadataId3Frame": "PRIV", "timedMetadataId3Period": 10, "destination": { "destinationRefId": "3nbjai" }, "hlsCdnSettings": { "hlsAkamaiSettings": { "connectionRetryInterval": 1, "numRetries": 10, "filecacheDuration": 300, "restartDelay": 15, "httpTransferMode": "CHUNKED" } } } }, "outputs": [{ "outputName": "yru1d", "outputSettings": { "hlsOutputSettings": { "nameModifier": "_$w$x$h$_$rc$", "hlsSettings": { "standardHlsSettings": { "audioRenditionSets": "program_audio", "m3u8Settings": { "programNum": 1, "audioFramesPerPes": 4, "pmtPid": "480", "videoPid": "481", "pcrControl": "PCR_EVERY_PES_PACKET", "audioPids": "492-498", "scte35Behavior": "NO_PASSTHROUGH", "scte35Pid": "500", "timedMetadataBehavior": "NO_PASSTHROUGH", "timedMetadataPid": "502" } } } } }, "videoDescriptionName": "2160p_video_settings", "audioDescriptionNames": ["2160p_audio_settings"], "outputSettingsChoice": "hls" }] }],
    "videoDescriptions": [{ "name": "2160p_video_settings", "scalingBehavior": "DEFAULT", "sharpness": 50, "respondToAfd": "NONE", "width": 3840, "height": 2160, "codecSettings": { "h265Settings": { "parNumerator": 1, "parDenominator": 1, "afdSignaling": "NONE", "rateControlMode": "QVBR", "bitrate": 1000000, "scanType": "PROGRESSIVE", "gopSize": 3, "gopSizeUnits": "SECONDS", "gopClosedCadence": 1, "sceneChangeDetect": "ENABLED", "profile": "MAIN", "tier": "HIGH", "level": "H265_LEVEL_AUTO", "adaptiveQuantization": "HIGH", "flickerAq": "ENABLED", "colorMetadata": "INSERT", "lookAheadRateControl": "HIGH", "alternativeTransferFunction": "OMIT", "timecodeInsertion": "DISABLED", "maxBitrate": 11500000, "qvbrQualityLevel": 10 } } }],
    "timecodeConfig": { "source": "EMBEDDED" },
    "globalConfiguration": { "inputEndAction": "NONE", "outputTimingSource": "INPUT_CLOCK", "supportLowFramerateInputs": "DISABLED", "outputLockingMode": "PIPELINE_LOCKING", "inputLossBehavior": { "repeatFrameMsec": 1000, "blackFrameMsec": 10000, "inputLossImageType": "SLATE", "inputLossImageColor": "000000", "inputLossImageSlate": { "uri": "https://live-test-bucket-tokyo.s3-ap-northeast-1.amazonaws.com/smpte-color-bars-1080p.png" } } }
  },
  "requestId": "1569585158382",
  "logLevel": "DISABLED",
  "channelClass": "SINGLE_PIPELINE",
  "roleArn": "arn:aws:iam::679554989631:role/live-qa-stg-develop-RoleMediaLive-PXCTCRBTSREG"
}
```
2 answers · 0 votes · 0 views · ChristoChristoph · asked 2 years ago