MediaLive: two overlays (GOP-aligned)

Hello,

What I want to achieve is:

  1. input: single video stream + two PNG images to overlay, image 'A' and image 'B'.
  2. output: two groups of live streams, group 'A' and 'B'.

Group 'A' would contain all bitrate renditions of the input video stream with image 'A' overlaid on top of each; similarly, group 'B' would contain the same renditions but with image 'B' overlaid.

Now, an important requirement is that everything should be GOP-aligned so that, later in the chain, we are able to follow a segment of the 1080p rendition from group 'A' with a segment of the 576p rendition from group 'B' on the fly.

Now, what I understand is that I can use two separate MediaLive channels. The first would produce group 'A' and the second group 'B'. As I understand it, MediaLive guarantees that the individual renditions coming out of a single channel are GOP-aligned. However, there seems to be no such guarantee that the outputs of two separate channels are aligned with each other. Is that correct?

If yes, then maybe I could somehow have a single MediaLive channel produce both groups 'A' and 'B'? In that case, however, I cannot see a way to configure this single channel to produce two groups of outputs with a different overlay on each.

Edited by: Koltunski on Jul 28, 2020 7:06 AM

Edited by: Koltunski on Jul 28, 2020 7:07 AM

asked 4 years ago · 385 views
11 Answers

Hi Koltunski,

Thank you for asking this question on the MediaLive forums. Unfortunately, there is not currently a way to apply two different images to different outputs of a single MediaLive channel. To do this you will need to create two channels, one per image overlay. To overcome the timecode and GOP alignment issues you are looking to avoid, I would suggest using MediaConnect to send your source to the two MediaLive channels, and setting Epoch Locking as the output locking mode in MediaLive, which will attempt to synchronize each output to the Unix Epoch.

Also, when using Epoch Locking there are two things to note: your source must carry timecode for this to work, and SCTE-35 messaging is not supported.

Zach

Edited by: Zacharyb-AWS on Jul 28, 2020 1:12 PM

answered 4 years ago

Thank you Zachary for your in-depth answer.

One more question: you say that for 'Epoch Locking' to work the input video must have a 'timecode'.
What kind of 'timecode' are you referring to?

answered 4 years ago

Hi Koltunski,

Thank you for the follow-up question. Video timecode is time metadata assigned to a specific frame or a specific point in a video, used as a reference point for editing and synchronization.

Zach

answered 4 years ago

Koltunski

Please see the reference to embedded timecode in https://docs.aws.amazon.com/medialive/latest/ug/timecode.html

answered 4 years ago

Hello,

I came back and I am trying to implement the pipeline described by Zachary in the second message of this thread.

So, again, what I am trying to achieve is: a single video input, two transcoding pipelines, each one overlaying an image on top of the video, and then weaving the two resulting HLS streams together by taking every other segment from each. For this to work, the two pipelines must obviously be GOP-aligned.

So far I've got two MediaLive channels and two inputs. Each input pulls the same HLS video source. Each channel overlays a different image on top of the video; both send their output to a single MediaStore container. I stream like that for a while, stop, go to the container, and edit the M3U8 manifests:

(...)
#EXTINF:10,
overlay_a_720p30_00326.ts
#EXTINF:10,
overlay_b_720p30_00327.ts
#EXTINF:10,
overlay_a_720p30_00328.ts
#EXTINF:10,
overlay_b_720p30_00329.ts
(...)

so that the two streams, 'overlay_a' and 'overlay_b', are woven together. I use Epoch Locking, and I inject embedded timecodes during the transcode.
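For reference, the manual manifest rewrite described above can be scripted. A minimal sketch (the function name is mine; it naively assumes both playlists list the same number of segments with identical durations, which the epoch-locked setup is supposed to guarantee):

```python
def weave_manifests(manifest_a: str, manifest_b: str) -> str:
    """Interleave two HLS media playlists: even-indexed segments
    from A, odd-indexed segments from B, keeping A's header."""
    def parse(text):
        lines = text.strip().splitlines()
        header, segs, i = [], [], 0
        # copy header lines until the first #EXTINF
        while i < len(lines) and not lines[i].startswith("#EXTINF"):
            header.append(lines[i]); i += 1
        # collect (#EXTINF, segment-uri) pairs
        while i + 1 < len(lines):
            if lines[i].startswith("#EXTINF"):
                segs.append((lines[i], lines[i + 1])); i += 2
            else:
                i += 1
        return header, segs

    header, segs_a = parse(manifest_a)
    _, segs_b = parse(manifest_b)
    out = list(header)
    for idx, (a, b) in enumerate(zip(segs_a, segs_b)):
        extinf, uri = a if idx % 2 == 0 else b
        out.extend([extinf, uri])
    return "\n".join(out) + "\n"

a = """#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXTINF:10,
overlay_a_720p30_00326.ts
#EXTINF:10,
overlay_a_720p30_00327.ts
"""
print(weave_manifests(a, a.replace("overlay_a", "overlay_b")))
```

No #EXT-X-DISCONTINUITY tags are inserted, on the assumption that both encodes share identical parameters and alignment, as in the setup described in this thread.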

When I then watch such a 'woven' stream, it is clear that the two pipelines are many seconds apart. Thinking about it, that is not surprising: they are probably already apart at the input, and all of that timecode and locking later on cannot rectify the initial desynchronization.

Zachary says that one has to 'use MediaConnect to send your source to the two MediaLive channels'.

The problem is, I cannot see any way to create one MediaLive input of type 'MediaConnect' and then connect that one input to two MediaLive channels. The UI only lets me use each input once.

answered 3 years ago

Hi LKoltunski,

Your description indicates that Epoch locking is not presently working. There are two additional requirements for Epoch locking to work successfully:

  1. The input timecode must be in UTC format.
  2. There cannot be any "complex" frame rate conversions happening in the channel. In other words, you can't take a 29.97 input and output 30.00 fps. Even multiples are OK.
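Rule 2 above can be sanity-checked with a little rational arithmetic. The following is my own heuristic interpretation of "even multiples are OK" (not an official MediaLive check): a conversion is "simple" when one rate is an integer multiple of the other.

```python
from fractions import Fraction

def is_simple_conversion(in_fps: Fraction, out_fps: Fraction) -> bool:
    """Heuristic: a conversion is 'simple' if the output rate is an
    integer multiple (or integer divisor) of the input rate."""
    ratio = out_fps / in_fps
    return ratio.denominator == 1 or ratio.numerator == 1

NTSC_2997 = Fraction(30000, 1001)                            # 29.97 fps
print(is_simple_conversion(NTSC_2997, Fraction(60000, 1001)))  # doubling: True
print(is_simple_conversion(NTSC_2997, Fraction(30, 1)))        # 29.97 -> 30.00: False
```

29.97 to 30.00 gives a ratio of 1001/1000, which is exactly the kind of "complex" conversion that breaks Epoch locking.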

When Epoch locking is working properly you will see segment numbers that are based on Epoch time. In other words, the segment number will be the Epoch timestamp divided by the exact segment duration.
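That naming rule can be written down directly. A small sketch (the function name is mine; 10 s is the segment duration used elsewhere in this thread):

```python
import time

def expected_segment_number(epoch_seconds: float, duration: int = 10) -> int:
    """Under Epoch locking, segment N covers the interval
    [N * duration, (N + 1) * duration) in Unix time."""
    return int(epoch_seconds) // duration

# A channel started right now would be writing roughly this segment number:
print(expected_segment_number(time.time()))
```

So two epoch-locked channels fed the same source should independently arrive at the same segment numbers, which is what makes the woven playlist line up.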

With respect to the MediaConnect question, you need to create additional inputs with the same MediaConnect flow ARNs.
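Creating two inputs that share one flow can also be scripted via the MediaLive `CreateInput` API (e.g. boto3's `medialive` client). A sketch of the request parameters only; the ARNs and input names below are placeholders:

```python
# Parameters for two MediaLive CreateInput calls that reference the
# same MediaConnect flow (placeholder ARNs, adjust to your account).
FLOW_ARN = "arn:aws:mediaconnect:eu-west-1:123456789012:flow:EXAMPLE:source"
ROLE_ARN = "arn:aws:iam::123456789012:role/MediaLiveAccessRole"

def input_params(name: str) -> dict:
    return {
        "Name": name,
        "Type": "MEDIACONNECT",
        "MediaConnectFlows": [{"FlowArn": FLOW_ARN}],
        "RoleArn": ROLE_ARN,
    }

# One input per channel; both point at the same flow:
params_a = input_params("overlay-a-input")
params_b = input_params("overlay-b-input")
# e.g. boto3.client("medialive").create_input(**params_a)
```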

Please advise if you have any other questions,

Regards,
Steve

Edited by: AWSsteve on Feb 11, 2021 11:04 AM

AWS
Steve_W
answered 3 years ago

Thanks Steve!

So you're saying that even without MediaConnect, the following pipeline

        ---> MLive input -> MLive transcode (insert timecodes & Epoch) -> MStore container --\
    HLS                                                                                       --> weave into one HLS
        ---> MLive input -> MLive transcode (insert timecodes & Epoch) -> MStore container --/

is supposed to stay synchronized, if Epoch locking works?

Edited by: LKoltunski on Feb 11, 2021 12:01 PM

Edited by: LKoltunski on Feb 11, 2021 12:02 PM

Edited by: LKoltunski on Feb 11, 2021 12:03 PM

answered 3 years ago

If you can supply the required number of streams to feed both MediaLive channels, yes you can do it that way. But if you only have a single source, MediaConnect can distribute the single source to the two MediaLive channels by creating two MediaLive inputs referencing the same MediaConnect flow(s).

The key for Epoch locking to work is that every MediaLive pipeline is receiving the same source with the same embedded timecode (in UTC).

Note that another way to confirm which timecode is being received is that MediaLive will emit a log entry in the as-run log stream for each channel showing the initial timecode.

Edited by: AWSsteve on Feb 11, 2021 12:05 PM

AWS
Steve_W
answered 3 years ago

I came back to report mixed results.

About two weeks ago, I followed what Steve suggested and I managed to create this:

https://4n42nbhrddyctj.data.mediastore.eu-west-1.amazonaws.com/tears_abab.m3u8

(the above is a manually rewritten manifest, basically 'tears_a.m3u8' and 'tears_b.m3u8' weaved together in order to see if they are time-synchronized).

If you watch this with VLC, you'll see 10-second-long segments, the odd ones with a large 'A' overlaid and the even ones with a large 'B'. The transition between 'A' and 'B' is seamless. Epoch Locking works; I can also tell by the way the segments are named ('tears_a_720p30_161339652.ts' - the '161339652' is the Epoch divided by the segment duration, just as Steve said).
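The naming can be checked directly by reversing Steve's formula. A quick sketch (segment duration assumed to be the 10 s used above):

```python
from datetime import datetime, timezone

SEGMENT_DURATION = 10  # seconds, matching the #EXTINF:10 entries earlier

segment_number = 161339652  # from tears_a_720p30_161339652.ts
epoch_seconds = segment_number * SEGMENT_DURATION
print(datetime.fromtimestamp(epoch_seconds, tz=timezone.utc))
# a mid-February 2021 timestamp, consistent with when this test was run
```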

So success!

However, today I was trying to re-create this pipeline, to no avail. Whatever I try, the segments which land in MediaStore are named like 'tears_a_720p30_00012.ts' - '00012' instead of the Epoch-based number - and from this alone I can tell that the synchronization is not going to work. Indeed, when I manually rewrite the manifest into an 'abab' one just like above, I can see that the 'a_000N' and 'b_000N' segments are a few seconds apart. So Epoch Locking is not working at all.

I am at a loss trying to figure out why.

Here's what I do:

  1. create an input of type 'HLS' , pull https://ntv1.akamaized.net/hls/live/2014075/NASA-NTV1-HLS/master_2000.m3u8 (NASA TV)
  2. Notice that the above source is at 29.97 fps = 30000/1001. It also (as far as I can tell) does not include timecode info.
  3. create a MediaLive channel, set the above as its input, and configure it to add timecode from SYSTEMCLOCK (Channel → General Settings → Timecode Configuration → SYSTEMCLOCK). The only point of this channel is to add the timecode.
  4. configure it to output to rtp://IP:PORT, where IP and PORT are taken from the MediaConnect flow from point 5.
  5. create a MediaConnect flow, take note of its input IP and PORT, and configure its whitelist CIDR block so that the RTP from the previous channel gets through.
  6. create two more Inputs from the MediaConnect flow.
  7. create two more MediaLive channels. Configure each one to take its input from the just created two inputs of type MediaConnect.
  8. configure the 2 channels to send their output to appropriate mediastoressl:// ... location
  9. configure their template to be 'Live Event (HLS)', which creates an HLS output with 4 renditions.
  10. set their class to 'SINGLE_PIPELINE'
  11. Channel → General Settings → Global Configuration → Output Locking Mode → set this to 'Epoch Locking'.
  12. in each output rendition's settings page, make sure that 'Frame Rate' is set to 'SPECIFIED' and Num/Denom is 30000/1001
  13. in each of the two channels' General Settings → Timecode Configuration, set the timecode source to 'EMBEDDED' (meaning: take the timecode from the stream, because it's supposed to be there, added by the first MediaLive channel)
  14. again for each of the two MediaLive channels, create a Schedule of type 'Static image activate' and overlay 'A.png' (first) and 'B.png' (second)
  15. create a MediaStore container (in point 8 we gave this as a location of the output)
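The channel-level settings in steps 11-13 correspond to fields in the MediaLive CreateChannel encoder settings. A partial sketch of the relevant fragments (field names from the MediaLive API; illustrative, not a complete channel definition):

```python
# Fragment of MediaLive EncoderSettings covering steps 11 and 13
# (illustrative, not a complete CreateChannel request).
encoder_settings_fragment = {
    "GlobalConfiguration": {
        "OutputLockingMode": "EPOCH_LOCKING",  # step 11
    },
    "TimecodeConfig": {
        "Source": "EMBEDDED",  # step 13: read timecode from the stream
    },
}

# Per-rendition H264 video codec settings for step 12:
h264_framerate_fragment = {
    "FramerateControl": "SPECIFIED",
    "FramerateNumerator": 30000,
    "FramerateDenominator": 1001,  # 29.97 fps, matching the source
}
```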

The above is, AFAIK, exactly how I was creating this before. Only now it doesn't work.

Any advice?

answered 3 years ago

LKoltunski

Given that the workflow doesn't seem to work, the issue is most likely that the source does not contain timecode. Step 3 of your procedure doesn't state that you enabled timecode insertion in the output. In the UDP output group, set Video Settings → Codec Settings → Timecode → Timecode Insertion to PIC_TIMING_SEI.
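In API terms, this fix touches the first (timecode-inserting) channel's settings. A sketch of the relevant fragments, using the MediaLive API field names (illustrative, not a full channel definition):

```python
# Settings for the first (timecode-stamping) channel, as dict fragments
# in the MediaLive API's field names.
timecode_config = {"Source": "SYSTEMCLOCK"}  # step 3: stamp wall-clock timecode

h264_settings_fragment = {
    # Carry the timecode in the bitstream as SEI picture timing,
    # so downstream channels can read it with Source: EMBEDDED.
    "TimecodeInsertion": "PIC_TIMING_SEI",
}
```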

If you still have issues after this change, please either open a ticket in Support Center providing the Channel ARNs for all three channels (fastest response), or private-message the info to me, and we will investigate further.

Regards

answered 3 years ago

Is there any more news on this topic? I am having the same issue. Only on the first try was the segment sequence number EPOCH/segment_duration. After I edited the MediaLive channel (added another resolution to the output group), the sequence numbers started from 00001.ts again.

answered 2 years ago
