Version: 2019.1

VideoPlayer

class in UnityEngine.Video / Inherits from: Behaviour / Implemented in: UnityEngine.VideoModule


Description

Plays video content onto a target.

Content can be either an imported VideoClip asset or a URL such as file:// or http://. The video is projected onto one of the supported targets, such as a camera background or a RenderTexture. If the content includes transparency, this transparency is preserved in the target, allowing objects behind the video to remain visible. When VideoPlayer.source is set to URL, the audio and video properties of the content being played are only initialized once the VideoPlayer preparation is completed. This can be tested with VideoPlayer.isPrepared.
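For example, the following minimal sketch prepares a URL source explicitly and only starts playback once the prepareCompleted event fires, instead of polling isPrepared (the URL is a placeholder, not a real server):

using UnityEngine;
using UnityEngine.Video;

public class PrepareUrlExample : MonoBehaviour
{
    void Start()
    {
        var videoPlayer = gameObject.AddComponent<VideoPlayer>();
        videoPlayer.playOnAwake = false;

        // Placeholder URL; properties such as width, height and frameRate
        // are only valid once preparation has completed.
        videoPlayer.source = VideoSource.Url;
        videoPlayer.url = "http://myserver.com/mymovie.mp4";

        videoPlayer.prepareCompleted += OnPrepared;
        videoPlayer.Prepare();
    }

    void OnPrepared(VideoPlayer vp)
    {
        // vp.isPrepared is now true, so the content description is available.
        Debug.Log("Prepared: " + vp.width + "x" + vp.height + " @ " + vp.frameRate + " fps");
        vp.Play();
    }
}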

Movie File Format Support Notes

The VideoPlayer uses native audio and video decoding libraries. It is your responsibility to use videos that match the requirements of the target platform. The VideoClipImporter offers an option to transcode VideoClip assets into one of the H.264, H.265 or VP8 video codecs, along with a few options to experiment with, such as resolution. The matching audio codec is used for audio tracks: AAC for H.264/H.265 and Vorbis for VP8.

See Also: VideoClipImporter.SetTargetSettings and VideoImporterTargetSettings.enableTranscoding.
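As a rough, editor-only illustration of these APIs, the sketch below enables transcoding to H.264 for a clip's Android settings. The asset path, platform name and codec choice are assumptions for the example; check the VideoClipImporter documentation for the exact fields available in your Unity version.

#if UNITY_EDITOR
using UnityEditor;
using UnityEngine;

public static class TranscodeExample
{
    [MenuItem("Examples/Enable Clip Transcoding")]
    static void EnableTranscoding()
    {
        // Hypothetical asset path; replace with a clip in your project.
        const string path = "Assets/Videos/MyClip.mp4";

        var importer = (VideoClipImporter)AssetImporter.GetAtPath(path);
        if (importer == null)
        {
            Debug.LogError("No VideoClipImporter found at " + path);
            return;
        }

        // Start from the default settings, enable transcoding to H.264,
        // and apply them as the Android override.
        VideoImporterTargetSettings settings = importer.defaultTargetSettings;
        settings.enableTranscoding = true;
        settings.codec = VideoCodec.H264;

        importer.SetTargetSettings("Android", settings);
        importer.SaveAndReimport();
    }
}
#endif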

You may choose to omit this transcoding and instead use videos that you already know are supported by the target systems, keeping finer control over the encoding process by using an external program. Over time, the VideoClipImporter editor will provide guidelines and warnings to help you make proper format and encoding choices.

For now, vendor recommendations must be followed; they are especially constraining on older mobile platforms. For example, videos found on the web will often require inspection and manipulation before they can be used reliably in a game running on multiple devices. The following are examples of recommendations and known limitations:

* Android: Supported Media Formats. See additional notes below.
* Windows: Supported Media Formats, H.265
* iPhone 6-7: Compare iPhone Models (see TV and Video)
* UWP: Supported Codecs

The best natively supported video codec for hardware acceleration is H.264, with VP8 being a software decoding solution that can be used when required. On Android, VP8 is also supported through native libraries and as such may be hardware-assisted depending on the model. H.265 is also available for hardware acceleration where the device supports it. Key values to look for in your encoding parameters:

* Video Codec: H.264, H.265 or VP8.
* Resolution: For example: 1280 x 720.
* Profile: Applicable for H.264/H.265. The profile is a set of capabilities and constraints; vendors often specify this, such as "Baseline" or "Main". See H.264 or H.265.
* Profile Level: Applicable for H.264/H.265. Within a given profile, the level specifies certain performance requirements, such as "Baseline 3.1". See H.264 and H.265.
* Audio Codec: Typically AAC (with mp4 videos using H.264/H.265) or Vorbis (with webm videos using VP8).
* Audio Channels: depends on platform. For example, the Android recommendation is for stereo files, but many devices will accept 5.1.

Android Notes

* Support for resolutions above 640 x 360 is not available on all devices. Runtime checks are done to verify this, and failures will prevent the movie from being played.
* For Jelly Bean/MR1, movies above 1280 x 720 or with more than 2 audio tracks will not be played due to bugs in the OS libraries.
* For Lollipop and above, any resolution or number of audio channels may be attempted, but will be constrained by device capabilities.
* The Vulkan graphics API is not yet supported.
* Format compatibility issues are reported in the adb logcat output and are always prefixed with AndroidVideoMedia (see the error-logging sketch after this list).
* Also pay attention to device-specific error messages located near Unity's error messages: they are not available to the engine, but often explain what the compatibility issue is.
* Playback from asset bundles is only supported for uncompressed bundles, read directly from disk.
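As one way of surfacing these problems (an assumption about your project setup, not part of the platform notes themselves), the sketch below subscribes to VideoPlayer.errorReceived so that playback errors appear alongside Unity's log output, which on Android ends up in adb logcat:

using UnityEngine;
using UnityEngine.Video;

public class VideoErrorLogger : MonoBehaviour
{
    void Start()
    {
        var videoPlayer = GetComponent<VideoPlayer>();

        // errorReceived reports problems such as HTTP connection failures or
        // unsupported formats; the message also shows up in the device log.
        videoPlayer.errorReceived += OnVideoError;
    }

    void OnVideoError(VideoPlayer source, string message)
    {
        Debug.LogError("VideoPlayer error on " + source.name + ": " + message);
    }
}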

H.265 Compatibility Notes

OSX
Requirements: SDK 10.13+
* Hardware encoding: 6th Generation Intel Core processor
* Software encoding: All Macs
* Hardware decoding: 6th Generation Intel Core processor
* Software decoding: All Macs

Windows
Requirements: Windows 10 + HEVC extensions
* Encoder: HEVC extension (hardware only)
* Decoder: HEVC extension (hardware + software support)

iOS
Requirements: SDK 11.0+
* Hardware decoding: A9 chip
* Software decoding: All iOS devices

tvOS
Requirements: SDK 11.0+

XBox
Requirements: See here

UWP
Requirements: Windows 10 + See here

Android
Requirements: 5.0+ See here

Note: Where H.265 support is indicated, it is not necessarily supported by all devices within the device family.


The following demonstrates a few features of the VideoPlayer:

// Examples of VideoPlayer function

using UnityEngine;

public class Example : MonoBehaviour
{
    void Start()
    {
        // Will attach a VideoPlayer to the main camera.
        GameObject camera = GameObject.Find("Main Camera");

        // VideoPlayer automatically targets the camera backplane when it is added
        // to a camera object, no need to change videoPlayer.targetCamera.
        var videoPlayer = camera.AddComponent<UnityEngine.Video.VideoPlayer>();

        // Play on awake defaults to true. Set it to false to prevent the url set
        // below from auto-starting playback, since we're in Start().
        videoPlayer.playOnAwake = false;

        // By default, VideoPlayers added to a camera will use the far plane.
        // Let's target the near plane instead.
        videoPlayer.renderMode = UnityEngine.Video.VideoRenderMode.CameraNearPlane;

        // This will cause our Scene to be visible through the video being played.
        videoPlayer.targetCameraAlpha = 0.5F;

        // Set the video to play. URL supports local absolute or relative paths.
        // Here, using absolute.
        videoPlayer.url = "/Users/graham/movie.mov";

        // Skip the first 100 frames.
        videoPlayer.frame = 100;

        // Restart from beginning when done.
        videoPlayer.isLooping = true;

        // Each time we reach the end, we slow down the playback by a factor of 10.
        videoPlayer.loopPointReached += EndReached;

        // Start playback. This means the VideoPlayer may have to prepare (reserve
        // resources, pre-load a few frames, etc.). To better control the delays
        // associated with this preparation, one can use videoPlayer.Prepare() along
        // with its prepareCompleted event.
        videoPlayer.Play();
    }

    void EndReached(UnityEngine.Video.VideoPlayer vp)
    {
        vp.playbackSpeed = vp.playbackSpeed / 10.0F;
    }
}

MovieTexture Migration Notes

Since the VideoPlayer is fundamentally different from the legacy movie playback solution, MovieTexture, we provide a few examples to help you migrate a project from MovieTexture to the new VideoPlayer solution.

Playing the Movie Example:

MovieTexture:

using UnityEngine;

public class PlayMovieMT : MonoBehaviour
{
    public AudioClip movieAudioClip;
    public MovieTexture movieTexture;

    void Start()
    {
        var audioSource = gameObject.AddComponent<AudioSource>();
        audioSource.clip = movieAudioClip;
    }

    void Update()
    {
        if (Input.GetButtonDown("Jump"))
        {
            var audioSource = GetComponent<AudioSource>();
            GetComponent<Renderer>().material.mainTexture = movieTexture;

            if (movieTexture.isPlaying)
            {
                movieTexture.Pause();
                audioSource.Pause();
            }
            else
            {
                movieTexture.Play();
                audioSource.Play();
            }
        }
    }
}

VideoPlayer:

using UnityEngine;

public class PlayMovieVP : MonoBehaviour
{
    public UnityEngine.Video.VideoClip videoClip;

    void Start()
    {
        var videoPlayer = gameObject.AddComponent<UnityEngine.Video.VideoPlayer>();
        var audioSource = gameObject.AddComponent<AudioSource>();

        videoPlayer.playOnAwake = false;
        videoPlayer.clip = videoClip;
        videoPlayer.renderMode = UnityEngine.Video.VideoRenderMode.MaterialOverride;
        videoPlayer.targetMaterialRenderer = GetComponent<Renderer>();
        videoPlayer.targetMaterialProperty = "_MainTex";
        videoPlayer.audioOutputMode = UnityEngine.Video.VideoAudioOutputMode.AudioSource;
        videoPlayer.SetTargetAudioSource(0, audioSource);
    }

    void Update()
    {
        if (Input.GetButtonDown("Jump"))
        {
            var vp = GetComponent<UnityEngine.Video.VideoPlayer>();

            if (vp.isPlaying)
                vp.Pause();
            else
                vp.Play();
        }
    }
}

Downloading a Movie Example:

MovieTexture:

using UnityEngine;
using UnityEngine.Networking;
using System.Collections;

public class DownloadMovieMT : MonoBehaviour
{
    void Start()
    {
        StartCoroutine(GetMovieTexture());
    }

    IEnumerator GetMovieTexture()
    {
        using (var uwr = UnityWebRequestMultimedia.GetMovieTexture("http://myserver.com/mymovie.ogv"))
        {
            yield return uwr.SendWebRequest();

            if (uwr.isNetworkError || uwr.isHttpError)
            {
                Debug.LogError(uwr.error);
                yield break;
            }

            MovieTexture movie = DownloadHandlerMovieTexture.GetContent(uwr);

            GetComponent<Renderer>().material.mainTexture = movie;
            movie.loop = true;
            movie.Play();
        }
    }
}

VideoPlayer:

using UnityEngine;

public class DownloadMovieVP : MonoBehaviour
{
    void Start()
    {
        var vp = gameObject.AddComponent<UnityEngine.Video.VideoPlayer>();
        vp.url = "http://myserver.com/mymovie.mp4";

        vp.isLooping = true;
        vp.renderMode = UnityEngine.Video.VideoRenderMode.MaterialOverride;
        vp.targetMaterialRenderer = GetComponent<Renderer>();
        vp.targetMaterialProperty = "_MainTex";

        vp.Play();
    }
}

Static Properties

* controlledAudioTrackMaxCount: Maximum number of audio tracks that can be controlled. (Read Only)

Properties

* aspectRatio: Defines how the video content will be stretched to fill the target area.
* audioOutputMode: Destination for the audio embedded in the video.
* audioTrackCount: Number of audio tracks found in the currently configured data source. (Read Only)
* canSetDirectAudioVolume: Whether direct-output volume controls are supported for the current platform and video format. (Read Only)
* canSetPlaybackSpeed: Whether the playback speed can be changed. (Read Only)
* canSetSkipOnDrop: Whether frame-skipping to maintain synchronization can be controlled. (Read Only)
* canSetTime: Whether current time can be changed using the time or timeFrames property. (Read Only)
* canSetTimeSource: Whether the time source followed by the VideoPlayer can be changed. (Read Only)
* canStep: Returns true if the VideoPlayer can step forward through the video content. (Read Only)
* clip: The clip being played by the VideoPlayer.
* clockTime: The clock time that the VideoPlayer follows to schedule its samples, expressed in seconds. (Read Only)
* controlledAudioTrackCount: Number of audio tracks that this VideoPlayer will take control of.
* externalReferenceTime: Reference time of the external clock the VideoPlayer uses to correct its drift.
* frame: The frame index of the currently available frame in VideoPlayer.texture.
* frameCount: Number of frames in the current video content. (Read Only)
* frameRate: The frame rate of the clip or URL in frames/second. (Read Only)
* height: The height of the images in the VideoClip or URL, in pixels. (Read Only)
* isLooping: Determines whether the VideoPlayer restarts from the beginning when it reaches the end of the clip.
* isPaused: Whether playback is paused. (Read Only)
* isPlaying: Whether content is being played. (Read Only)
* isPrepared: Whether the VideoPlayer has successfully prepared the content to be played. (Read Only)
* length: The length of the VideoClip or URL, in seconds. (Read Only)
* pixelAspectRatioDenominator: Denominator of the pixel aspect ratio (num:den) for the VideoClip or URL. (Read Only)
* pixelAspectRatioNumerator: Numerator of the pixel aspect ratio (num:den) for the VideoClip or URL. (Read Only)
* playbackSpeed: Factor by which the basic playback rate will be multiplied.
* playOnAwake: Whether the content will start playing back as soon as the component awakes.
* renderMode: Where the video content will be drawn.
* sendFrameReadyEvents: Enables the frameReady events (see the frame-capture sketch after this list).
* skipOnDrop: Whether the VideoPlayer is allowed to skip frames to catch up with current time.
* source: The source that the VideoPlayer uses for playback.
* targetCamera: Camera component to draw to when VideoPlayer.renderMode is set to either VideoRenderMode.CameraFarPlane or VideoRenderMode.CameraNearPlane.
* targetCamera3DLayout: Type of 3D content contained in the source video media.
* targetCameraAlpha: Overall transparency level of the target camera plane video.
* targetMaterialProperty: Material texture property which is targeted when VideoPlayer.renderMode is set to Video.VideoTarget.MaterialOverride.
* targetMaterialRenderer: Renderer which is targeted when VideoPlayer.renderMode is set to Video.VideoTarget.MaterialOverride.
* targetTexture: RenderTexture to draw to when VideoPlayer.renderMode is set to Video.VideoTarget.RenderTexture.
* texture: Internal texture in which video content is placed. (Read Only)
* time: The presentation time of the currently available frame in VideoPlayer.texture.
* timeReference: The clock that the VideoPlayer observes to detect and correct drift.
* timeSource: [NOT YET IMPLEMENTED] The source used by the VideoPlayer to derive its current time.
* url: The file or HTTP URL that the VideoPlayer reads content from.
* waitForFirstFrame: Determines whether the VideoPlayer will wait for the first frame to be loaded into the texture before starting playback when VideoPlayer.playOnAwake is on.
* width: The width of the images in the VideoClip or URL, in pixels. (Read Only)
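As an illustration of how sendFrameReadyEvents, frameReady and texture fit together, here is a minimal sketch that copies each decoded frame into a RenderTexture as it becomes available; the destination field is an assumption about your setup, assigned in the Inspector:

using UnityEngine;
using UnityEngine.Video;

public class FrameCapture : MonoBehaviour
{
    // Hypothetical destination texture, assigned in the Inspector.
    public RenderTexture destination;

    void Start()
    {
        var videoPlayer = GetComponent<VideoPlayer>();

        // frameReady events are disabled by default and must be enabled explicitly.
        videoPlayer.sendFrameReadyEvents = true;
        videoPlayer.frameReady += OnFrameReady;
    }

    void OnFrameReady(VideoPlayer source, long frameIndex)
    {
        // source.texture holds the frame that was just made available.
        Graphics.Blit(source.texture, destination);
    }
}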

Public Methods

* EnableAudioTrack: Enable/disable audio track decoding. Only effective when the VideoPlayer is not currently playing.
* GetAudioChannelCount: The number of audio channels in the specified audio track (see the audio-track sketch after this list).
* GetAudioLanguageCode: Returns the language code, if any, for the specified track.
* GetAudioSampleRate: Gets the audio track sampling rate in Hertz.
* GetDirectAudioMute: Gets the direct-output audio mute status for the specified track.
* GetDirectAudioVolume: Returns the direct-output volume for the specified track.
* GetTargetAudioSource: Gets the AudioSource that will receive audio samples for the specified track if VideoPlayer.audioOutputMode is set to VideoAudioOutputMode.AudioSource.
* IsAudioTrackEnabled: Whether decoding for the specified audio track is enabled. See VideoPlayer.EnableAudioTrack for the distinction with mute.
* Pause: Pauses the playback and leaves the current time intact.
* Play: Starts playback.
* Prepare: Initiates playback engine preparation.
* SetDirectAudioMute: Sets the direct-output audio mute status for the specified track.
* SetDirectAudioVolume: Sets the direct-output audio volume for the specified track.
* SetTargetAudioSource: Sets the AudioSource that will receive audio samples for the specified track if this audio target is selected with VideoPlayer.audioOutputMode.
* StepForward: Advances the current time by one frame immediately.
* Stop: Stops the playback and sets the current time to 0.
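The following sketch ties several of these methods together: once the player is prepared, it logs each audio track's language code, channel count and sample rate, then sets the first track's direct-output volume. The use of direct audio output and the volume value are assumptions about your setup; direct-output control is not supported on every platform, hence the canSetDirectAudioVolume check.

using UnityEngine;
using UnityEngine.Video;

public class AudioTrackInspector : MonoBehaviour
{
    void Start()
    {
        var videoPlayer = GetComponent<VideoPlayer>();
        videoPlayer.audioOutputMode = VideoAudioOutputMode.Direct;
        videoPlayer.prepareCompleted += OnPrepared;
        videoPlayer.Prepare();
    }

    void OnPrepared(VideoPlayer vp)
    {
        // Track information is only valid once preparation has completed.
        for (ushort i = 0; i < vp.audioTrackCount; ++i)
        {
            Debug.Log("Track " + i +
                      ": language=" + vp.GetAudioLanguageCode(i) +
                      ", channels=" + vp.GetAudioChannelCount(i) +
                      ", sampleRate=" + vp.GetAudioSampleRate(i));
        }

        // Direct-output volume control may not be supported everywhere.
        if (vp.audioTrackCount > 0 && vp.canSetDirectAudioVolume)
            vp.SetDirectAudioVolume(0, 0.5f);

        vp.Play();
    }
}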

Events

* clockResyncOccurred: Invoked when the VideoPlayer clock is synced back to its VideoTimeReference.
* errorReceived: Errors such as HTTP connection problems are reported through this callback.
* frameDropped: [NOT YET IMPLEMENTED] Invoked when the video decoder does not produce a frame as per the time source during playback.
* frameReady: Invoked when a new frame is ready.
* loopPointReached: Invoked when the VideoPlayer reaches the end of the content to play.
* prepareCompleted: Invoked when the VideoPlayer preparation is complete.
* seekCompleted: Invoked after a seek operation completes.
* started: Invoked immediately after Play is called.

Delegates

* ErrorEventHandler: Delegate type for VideoPlayer events that contain an error message.
* EventHandler: Delegate type for all parameterless events emitted by VideoPlayers.
* FrameReadyEventHandler: Delegate type for VideoPlayer events that carry a frame number.
* TimeEventHandler: Delegate type for VideoPlayer events that carry a time position.

Inherited Members

Properties

* enabled: Enabled Behaviours are Updated, disabled Behaviours are not.
* isActiveAndEnabled: Has the Behaviour had active and enabled called?
* gameObject: The game object this component is attached to. A component is always attached to a game object.
* tag: The tag of this game object.
* transform: The Transform attached to this GameObject.
* hideFlags: Should the object be hidden, saved with the Scene or modifiable by the user?
* name: The name of the object.

Public Methods

* BroadcastMessage: Calls the method named methodName on every MonoBehaviour in this game object or any of its children.
* CompareTag: Is this game object tagged with tag?
* GetComponent: Returns the component of Type type if the game object has one attached, null if it doesn't.
* GetComponentInChildren: Returns the component of Type type in the GameObject or any of its children using depth first search.
* GetComponentInParent: Returns the component of Type type in the GameObject or any of its parents.
* GetComponents: Returns all components of Type type in the GameObject.
* GetComponentsInChildren: Returns all components of Type type in the GameObject or any of its children.
* GetComponentsInParent: Returns all components of Type type in the GameObject or any of its parents.
* SendMessage: Calls the method named methodName on every MonoBehaviour in this game object.
* SendMessageUpwards: Calls the method named methodName on every MonoBehaviour in this game object and on every ancestor of the behaviour.
* GetInstanceID: Returns the instance id of the object.
* ToString: Returns the name of the GameObject.

Static Methods

* Destroy: Removes a GameObject, component or asset.
* DestroyImmediate: Destroys the object obj immediately. You are strongly recommended to use Destroy instead.
* DontDestroyOnLoad: Do not destroy the target Object when loading a new Scene.
* FindObjectOfType: Returns the first active loaded object of Type type.
* FindObjectsOfType: Returns a list of all active loaded objects of Type type.
* Instantiate: Clones the object original and returns the clone.

Operators

* bool: Does the object exist?
* operator !=: Compares if two objects refer to a different object.
* operator ==: Compares two object references to see if they refer to the same object.
