Amazon Elastic Transcoder


We launched Amazon Elastic Transcoder with an initial set of features and a promise to iterate quickly based on customer feedback. You've supplied us with plenty of feedback (primarily via the Elastic Transcoder Forum), and we have a set of powerful enhancements ready as a result.
Here's what's new:
  • Apple HTTP Live Streaming (HLS) Support. Amazon Elastic Transcoder can create HLS-compliant pre-segmented files and playlists for delivery to compatible players on iOS and Android devices, set-top boxes, and web browsers. You can use our new system-defined HLS presets to transcode an input file into adaptive-bitrate filesets targeting multiple devices, resolutions, and bitrates. You can also create your own presets.
  • WebM Output Support. Amazon Elastic Transcoder can now transcode content into VP8 video and Vorbis audio, for playback in browsers, like Firefox, that do not natively support H.264 and AAC.
  • MPEG2-TS Output Container Support. Amazon Elastic Transcoder can now transcode content into an MPEG2-TS transport stream containing H.264 video and AAC audio, a combination commonly used in broadcast systems.
  • Multiple Outputs Per Job. Amazon Elastic Transcoder can now produce multiple renditions of the same input from a single transcoding job. For example, with a single job you can create H.264, HLS and WebM versions of the same video for delivery to multiple platforms, which is easier than creating multiple jobs and saves you time.
  • Automatic Video Bit Rate Optimization. With this feature, Amazon Elastic Transcoder will automatically adjust the bit rate in order to optimize the visual quality of your transcoded output. This takes the guesswork out of choosing the right bit rate for your video content.
  • Enhanced Aspect Ratio and Sizing Policies. You can use these new settings in transcoding presets to precisely control scaling, cropping, matting and stretching options to get the output that you expect regardless of how the input is formatted.
  • Enhanced S3 Options for Output Videos. Amazon Elastic Transcoder now enables you to set S3 Access Control Lists (ACLs) and storage type options without needing to use the Amazon S3 API or console. With this feature, your files are created with the right permissions in place, ready for delivery to end users.
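To show how the multiple-outputs and HLS features fit together, here is a minimal sketch of a single CreateJob request that produces two pre-segmented HLS renditions plus a WebM rendition of the same input. The request field names (`PipelineId`, `Outputs`, `SegmentDuration`, `Playlists`) follow the Elastic Transcoder CreateJob API; the pipeline ID, S3 keys, and preset IDs below are placeholders you would replace with your own values.

```python
def build_multi_output_job(pipeline_id, input_key, key_prefix):
    """Build one Elastic Transcoder job request that emits HLS and WebM
    renditions of the same input (placeholder IDs and keys throughout)."""
    hls_outputs = [
        # SegmentDuration (seconds) requests pre-segmented MPEG2-TS output.
        {"Key": "hls/1m", "PresetId": "HLS_1M_PRESET_ID", "SegmentDuration": "10"},
        {"Key": "hls/2m", "PresetId": "HLS_2M_PRESET_ID", "SegmentDuration": "10"},
    ]
    webm_output = {"Key": "video.webm", "PresetId": "WEBM_PRESET_ID"}
    return {
        "PipelineId": pipeline_id,
        "Input": {"Key": input_key},       # container/codec are auto-detected
        "OutputKeyPrefix": key_prefix,     # prepended to every output key
        "Outputs": hls_outputs + [webm_output],
        # An adaptive-bitrate master playlist that references the HLS outputs.
        "Playlists": [{
            "Name": "index",
            "Format": "HLSv3",
            "OutputKeys": [o["Key"] for o in hls_outputs],
        }],
    }

job = build_multi_output_job("PIPELINE_ID", "inputs/movie.mp4", "outputs/movie/")
print(len(job["Outputs"]))  # three renditions from one job

# Submitting requires an AWS account and a configured pipeline:
# import boto3
# client = boto3.client("elastictranscoder")
# response = client.create_job(**job)
```

Because all three renditions come from one job, the input is read and decoded once, which is where the time savings over separate jobs comes from.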
