How to play video from MediaLive through UDP?
**On AWS, how do you play video from MediaLive through the UDP output group?** For my use case, I'm building a live-stream pipeline that takes an MPEG-2 transport stream from MediaLive, processes it through a UDP server (configured as an output group), and is then consumed by a web client that plays it in an HTML5 video element. The problem is: the data is flowing, but the video isn't rendering. Previously, my output group was set to AWS MediaPackage, but because I need the ability to read and modify frames in the live stream, I'm trying to feed it through UDP. Is setting the output group to UDP the right approach? The documentation is a bit sparse here. I'm wondering if there are resources or examples where others were able to play video this way as opposed to HLS/DASH.
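One thing worth checking (a sketch, not a definitive fix): browsers cannot play a raw MPEG-2 transport stream, so even if the UDP packets arrive, an HTML5 `<video>` element won't render them. A common pattern is a small relay that repackages the TS into HLS for a browser player such as hls.js. A minimal ffmpeg sketch, assuming the relay host receives the MediaLive UDP output on port 5000 and serves `/var/www/live` over HTTP (both are assumptions):

```
# Listen on the UDP port the MediaLive UDP output group points at, and
# repackage the transport stream as short-segment HLS that an HTML5
# player (e.g. hls.js) can consume.
ffmpeg -i 'udp://0.0.0.0:5000?fifo_size=1000000&overrun_nonfatal=1' \
  -c copy \
  -f hls -hls_time 2 -hls_list_size 6 -hls_flags delete_segments \
  /var/www/live/stream.m3u8
```

With this in place the UDP output group stays as-is and the web client plays `stream.m3u8` instead of the raw UDP feed.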
Problem with AWS Elemental MediaLive Workflow Wizard
I'm getting some errors when I try to run a live stream using MediaLive. In MediaLive, I have an MP4 input (stored in a public S3 bucket) and MediaPackage configured as the destination. I created three endpoints for MediaPackage (HLS, DASH, and CMAF). I used the MediaLive Workflow Wizard to set up the workflow as shown below: ![Enter image description here](https://repost.aws/media/postImages/original/IMPOxB55T7Q6mCv_vMdG_HJQ) When I started the workflow, I got the following alerts from the MediaLive channel: ![Enter image description here](https://repost.aws/media/postImages/original/IM-I4xQ8m3QVibv4orZ1Z2tw) Also, when I tried to open the HLS playlist or DASH MPD, I got a **404 error**. For debugging purposes, I've given full access to the IAM role associated with the channel. Any idea what I'm missing?
Is it possible to use an MP3 or WAV file as a source file in AWS MediaPackage?
I'm building an audio streaming platform and want to protect the audio files with DRM. I found an AWS service called MediaPackage, but after reading through the documentation, all I can see is packaging for video only. The question is: is it possible to use AWS MediaPackage to package audio-only files?
How to publish live stream to MediaPackage using FFmpeg?
I want to push the stream to MediaPackage with FFmpeg instead of pushing to MediaLive. I tried the command ``` ffmpeg -re -stream_loop -1 -i input.mp4 -c copy -f hls 'https://username:password@mypackage_domain.com/xxx/channel/main.m3u8' ``` The push seems to succeed, but I can't play the stream from the endpoint, and there are no ingress metrics.
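A hedged suggestion: MediaPackage's HLS ingest is WebDAV-style, i.e. it expects the playlist and each segment to be delivered as HTTP PUT requests with digest authentication. By default ffmpeg's HLS muxer does not use PUT or persistent connections, so forcing them is the usual first fix. A sketch reusing the URL from the question (the segment duration and playlist length are assumptions):

```
# Force HTTP PUT (WebDAV-style ingest) and persistent connections so
# ffmpeg delivers the playlist and each .ts segment the way the
# MediaPackage ingest endpoint expects.
ffmpeg -re -stream_loop -1 -i input.mp4 \
  -c copy \
  -f hls -hls_time 6 -hls_list_size 6 \
  -method PUT -http_persistent 1 \
  'https://username:password@mypackage_domain.com/xxx/channel/main.m3u8'
```

If the ingress metrics still stay flat, it's worth double-checking that the URL is the channel's ingest endpoint from the MediaPackage console rather than a playback endpoint.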
Push browser camera to AWS RTMP server
How to convert MediaStore lifecycle policy to s3 lifecycle rule
We're starting to test S3 as a live-stream origin for our regular-latency live workflows. Currently we have a number of MediaStore containers with finely tuned lifecycle policies, and I'm trying to convert some of those policies to S3 lifecycle rules. The logic seems similar, which isn't surprising since MediaStore is built on top of S3. However, I couldn't find any examples in the S3 lifecycle configuration [doc](https://docs.aws.amazon.com/AmazonS3/latest/userguide/intro-lifecycle-rules.html) that address the unique needs of live streaming workflows. Basically, I'm trying to expire different media fragments and manifest files at varying intervals for different use cases. I'll keep experimenting with the Lifecycle Configuration XML, but it would be nice to have a quickstart guide. Seems like a good topic for an AWS Media Blog post from Media Service folks looking for extra credit ;-)
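As a starting point, the closest S3 equivalent is a per-prefix lifecycle configuration, usable with `aws s3api put-bucket-lifecycle-configuration`. A sketch, assuming hypothetical `live/segments/` and `live/manifests/` prefixes; one caveat worth knowing up front is that S3 lifecycle expiration is expressed in whole days (minimum one day), so MediaStore policies that expired transient fragments after minutes or seconds can't be reproduced exactly with lifecycle rules alone:

```
{
  "Rules": [
    {
      "ID": "expire-media-fragments",
      "Filter": { "Prefix": "live/segments/" },
      "Status": "Enabled",
      "Expiration": { "Days": 1 }
    },
    {
      "ID": "expire-manifests",
      "Filter": { "Prefix": "live/manifests/" },
      "Status": "Enabled",
      "Expiration": { "Days": 7 }
    }
  ]
}
```

For sub-day fragment expiry, an out-of-band cleanup (e.g. a scheduled deletion job) would be needed on top of the lifecycle rules.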
How to use Low-Latency HLS with MediaPackage?
I want to build a solution for live and on-demand video streaming. My custom service will receive RTMP streams, produce HLS and dump it to S3/stream it directly to MediaPackage. To make the latency lower, I want my custom service to output Low-Latency HLS (community, not the Apple one). I want to achieve it by using segments with ~2s duration and EXT-X-PREFETCH tag. - Will MediaPackage be able to process this? - If so, will the HLS outputted by MediaPackage also be "low-latency"? - Can I use MediaPackage to produce Low-Latency HLS for me (instead of implementing this in my custom service)?
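For reference, the community-LHLS output described above would look roughly like this playlist (a sketch following the community LHLS draft; segment names are illustrative). Whether MediaPackage passes the `EXT-X-PREFETCH` tag through is exactly the open question here; the sketch only shows the intended input format:

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:2
#EXT-X-MEDIA-SEQUENCE:100
#EXTINF:2.000,
segment100.ts
#EXTINF:2.000,
segment101.ts
#EXT-X-PREFETCH:segment102.ts
```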
Small Scale VOD Streaming - Am I looking in the right place?
Hi, I am learning AWS as I go... I work for a small company that would like to sell video subscriptions. I am looking into MediaConvert, MediaPackage and MediaStore with the plan to embed the output on our website in a secure way (users unable to download). If I had to guess, I would say we have about 20 hours of videos and would stream to about 30 users a month. Am I looking in the right spot or is this overkill?
VOD MediaPackage with CloudFront
I have a MediaPackage configuration with a CloudFront CDN, and a few questions:

1. Is HLS packaging done during the ingest process when the playable URL is obtained (if yes, where are the manifest and .ts chunks stored?), or is it done when a playback request is received?
2. Once playback has happened for the first time, are the manifest and .ts chunks cached in CloudFront, or is packaging done for every playback request?
3. I have configured CDN cache settings but am not sure whether playback is served from the CDN cache or from the origin (MediaPackage). How can I verify the configuration is correct?
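On the last point, a quick way to check (a sketch; the CloudFront domain and path below are placeholders): CloudFront reports cache status in the `X-Cache` response header, so inspecting the headers of a manifest or segment request shows whether it was served from cache or went back to the MediaPackage origin.

```
# "X-Cache: Hit from cloudfront"  -> served from the CDN cache
# "X-Cache: Miss from cloudfront" -> fetched from the MediaPackage origin
curl -sI 'https://dxxxxxxxxxxxx.cloudfront.net/out/v1/xxxx/index.m3u8' | grep -i '^x-cache'
```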
Elemental MediaPackage MPD or HLS playback from middle
I have a requirement to resume video playback from the position where the viewer left off (i.e. store the last position and continue from there on the next play). I have tried multiple approaches, but none of them worked. Can someone please guide me? I have HLS and MPD URLs and am using Elemental MediaPackage with CloudFront to stream the video.
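Neither the HLS/MPD manifests nor MediaPackage store a viewer's resume point; the player has to persist the current position itself and seek on the next load. A minimal sketch in plain JavaScript, assuming an HTML5 `<video>` element and `localStorage` (the helper names and key scheme are made up for illustration); note that seeking only works within the available window, so on a live stream with a short DVR window the saved position may no longer be seekable:

```javascript
// `storage` is any string map -- pass window.localStorage in the browser.
// The "pos:" key prefix is a hypothetical naming scheme.
function savePosition(storage, videoId, seconds) {
  storage['pos:' + videoId] = String(seconds);
}

function loadPosition(storage, videoId) {
  const v = storage['pos:' + videoId];
  return v === undefined ? 0 : Number(v);
}

// Browser wiring (assumes a <video> element playing the HLS/MPD URL):
//   video.addEventListener('timeupdate',
//     () => savePosition(localStorage, id, video.currentTime));
//   video.addEventListener('loadedmetadata',
//     () => { video.currentTime = loadPosition(localStorage, id); });
```

The same two event hooks work whether the stream is played natively or through a library such as hls.js or dash.js, since both drive a standard `<video>` element.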
When will audio-only and video description be supported?
I would like to know when DV will be supported (with the CHARACTERISTICS="public.accessibility" tag in the HLS manifest), and when we will be able to add an audio-only file in our SMIL so the track appears in the manifest. Something like this:

```
<?xml version="1.0" encoding="utf-8"?>
<smil>
  <body>
    <switch>
      <video name="file_HD.mp4" systemLanguage="fra" audioName="Fr" includeAudio="true"/>
      <video name="file_SD.mp4" systemLanguage="fra" audioName="Fr" includeAudio="false"/>
      <audio name="file_audioCorr.mp4" systemLanguage="eng" audioName="English" includeAudio="true"/> <!-- audio-only file -->
    </switch>
  </body>
</smil>
```

These are basic features that are not currently included.