Characteristics of Video Files | Techbirds
General parameters of a video:
(1) Video bit rate
(2) Data rate
(3) Video frame rate
(4) Audio bit rate
(5) Size in bits
(6) Resolution (w x h)
(7) Video codec
(8) Audio codec
(9) Container (file type)
Question: What is the difference between the bit rate of a video and the data rate of a video?
The video bit rate is the number of bits of video (frame motion) processed per second.
The audio bit rate is the number of bits of audio processed per second.
Bit rate is directly proportional to video quality.
The data rate of a video is the overall bit rate of the audio and video streams combined. (We cannot simply add the two bit rates to get the data rate, because the container format adds some overhead.)
Data rate is also called the overall bit rate.
Data rate = size of file / time duration
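The formula above can be sketched in Python. The clip size and duration below are hypothetical numbers chosen only to illustrate the unit conversions:

```python
def data_rate_kbps(file_size_bytes, duration_seconds):
    """Overall bit rate (data rate) in kilobits per second."""
    bits = file_size_bytes * 8          # bytes -> bits
    return bits / duration_seconds / 1000  # bits/sec -> kilobits/sec

# Hypothetical example: a 15 MB clip that plays for 120 seconds.
print(data_rate_kbps(15_000_000, 120))  # 1000.0 kbps
```

This is the same division as "size of file / time duration", just with the bytes-to-kilobits conversion made explicit.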
Data rate of video is affected by various parameters. These are:
- video bit rate
- video codec
- audio bit rate
- audio codec
Question: Does the data rate affect the quality of a video?
Yes. Data rate is directly proportional to video quality.
Question: Does the data rate have any effect on streaming a video over the internet?
Yes. For smooth, uninterrupted playback over the internet, the bandwidth of the connection should be greater than the data rate of the video.
Question: Is there any relation between the resolution and the data rate of a video?
Resolution is the height and width of the video in pixels.
Number of pixels in a frame = pixels in width X pixels in height
Each pixel takes a fixed number of bits.
That means resolution also determines the size of the video.
If your video has a high resolution, each frame carries more pixels, so it needs a higher data rate to maintain good quality.
We can say that for a given resolution, the data rate should be large enough; and for a given data rate, the resolution should be kept low enough.
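The resolution-to-size relationship above can be made concrete with a small sketch. The 24 bits per pixel figure assumes uncompressed 8-bit RGB and is an illustrative assumption, not a property of any particular codec:

```python
def raw_frame_bits(width, height, bits_per_pixel=24):
    # Assumes uncompressed 8-bit RGB (24 bits per pixel).
    return width * height * bits_per_pixel

sd = raw_frame_bits(320, 240)
hd = raw_frame_bits(640, 480)
# Doubling both dimensions quadruples the pixel count, so the same
# perceived quality needs roughly 4x the data rate.
print(hd // sd)  # 4
```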
Question: What are the factors affecting Resolution of a video?
Number of pixels in frame width
Number of pixels in frame height
Size of a pixel or pixel aspect ratio (PAR)
Question: What is data rate of a video?
Data rate is the overall bit rate: a combination of the video stream and the audio stream in your file, with the majority usually coming from the video stream.
File size = data rate x duration (using consistent units, e.g. a rate in kilobits per second and a duration in seconds gives a size in kilobits)
When we say that ESPN distributes its video at 600 Kbps (600 Kbps is the bandwidth),
This means that each one-second chunk of audio and video comprises about 600 kilobits of data.
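Plugging the formula into code makes the unit handling explicit. The 10-minute duration below is a hypothetical example; the 600 Kbps rate comes from the ESPN figure in the text:

```python
def file_size_megabytes(data_rate_kbps, duration_seconds):
    kilobits = data_rate_kbps * duration_seconds
    return kilobits / 8 / 1000  # kilobits -> kilobytes -> megabytes

# A 10-minute (600-second) stream at 600 Kbps:
print(file_size_megabytes(600, 600))  # 45.0 MB
```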
Data rate is the most important factor in streaming video quality. That's because all streaming codecs use what's called "lossy" compression, which means that the more you compress, the more quality you lose.
Note: When you stream video, the data rate of the encoded file must be somewhat smaller than the bandwidth capacity of the remote viewer; otherwise, the video will frequently stop playing.
Data rate < BW
Question: What is Resolution of a video?
Resolution is the height and width of the video in pixels.
A 320×240 video has 76,800 pixels in each frame, while a 640×480 video file has 307,200, or four times more.
That means you have to apply four times the compression to achieve the same data rate.
Number of pixels in a frame = pixels in width X pixels in height.
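The "four times the compression" claim can be checked numerically. The 30 fps frame rate and 24 bits per pixel below are illustrative assumptions; the 600 Kbps target echoes the earlier ESPN figure:

```python
def required_compression_ratio(width, height, fps, target_kbps,
                               bits_per_pixel=24):
    """Ratio of raw (uncompressed RGB) bit rate to the target bit rate."""
    raw_bps = width * height * bits_per_pixel * fps
    return raw_bps / (target_kbps * 1000)

small = required_compression_ratio(320, 240, 30, 600)
large = required_compression_ratio(640, 480, 30, 600)
print(round(large / small))  # 4
```

At the same target data rate, the 640x480 file needs four times the compression ratio of the 320x240 file, exactly as the pixel counts predict.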
Question: What is the relation between the data rate and the resolution of a video?
Data rate and resolution (i.e. the number of pixels) of a video are related when making quality-related streaming decisions.
Many videos are originally stored at 720x480 (SD) or 1920x1080 (HD), but they are downsampled to a relatively small resolution for streaming, for example 640x480.
This is because as the number of pixels per frame increases, you have to apply a higher data rate to maintain the same quality.
When producing streaming video, you have two options. Option 1 is to choose a data rate, then produce at the highest resolution that looks good at the data rate.
Option 2 is to choose the desired resolution, then produce at the data rate necessary to produce good quality at that resolution. The key point is that you should always consider one when discussing the other.
Just for the record, note that the most common video resolutions for 4:3 video are
640×480, 440×330, 400×300, 320×240, 240×180 and 160×120.
The most common resolutions for widescreen 16:9 videos are 640×360, 480×270 and 320×180.
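The resolutions listed above all follow from the aspect ratio, so they can be generated rather than memorized. This is a minimal sketch using integer division, which matches the listed values exactly:

```python
def resolutions_for_widths(widths, aspect_w, aspect_h):
    """Frame heights implied by an aspect ratio at given frame widths."""
    return [(w, w * aspect_h // aspect_w) for w in widths]

print(resolutions_for_widths([640, 480, 320], 16, 9))
# [(640, 360), (480, 270), (320, 180)]
print(resolutions_for_widths([640, 320], 4, 3))
# [(640, 480), (320, 240)]
```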
Question: What is audio channel?
One way to look at it is the number of speakers.
A “channel” in audio is just one separate stream of audio information. Mono audio sources have one channel. Stereo sources have two (left and right).
The new terminology, “x.x”, indicates the number of full-range audio channels and the number of subwoofer channels, and is pronounced (for example) “Five Dot One.”
5.1 audio has five normal audio channels (Left, Center, Right, Left Surround, Right Surround) and one subwoofer channel (LFE) carrying extended bass. It’s intended to be reproduced using six speakers.
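The "x.x" notation above is simple enough to parse mechanically. This sketch handles only the two-part form described in the text (it would not cover extended layouts such as "7.1.4"):

```python
def speaker_count(layout):
    """Total speakers in an 'x.x' layout string, e.g. '5.1' -> 6."""
    full_range, subwoofer = (int(n) for n in layout.split("."))
    return full_range + subwoofer

print(speaker_count("5.1"))  # 6 (five full-range + one subwoofer)
print(speaker_count("2.0"))  # 2 (plain stereo)
```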
Factors affecting a video:
- Input parameters that are taken for a video:
(Video bit rate, audio bit rate, video codec, audio codec, resolution (DAR + PAR))
- Output parameters that are dependent on input parameters:
(Data rate, file size)
- Parameters that are dependent on output parameters:
(Streaming / Buffering speed)
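The input-to-output dependency above can be sketched as two small functions. The 2% container overhead is an illustrative assumption; the real figure depends on the container and codecs used:

```python
def overall_bit_rate_kbps(video_kbps, audio_kbps, overhead_fraction=0.02):
    """Data rate approximated as video + audio streams plus a
    hypothetical 2% container overhead."""
    return (video_kbps + audio_kbps) * (1 + overhead_fraction)

def file_size_mb(data_rate_kbps, duration_seconds):
    """File size (an output parameter) follows from the data rate."""
    return data_rate_kbps * duration_seconds / 8 / 1000

rate = overall_bit_rate_kbps(500, 100)      # hypothetical input bit rates
print(round(rate))                          # 612 kbps
print(round(file_size_mb(rate, 60), 2))     # 4.59 MB for one minute
```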
When producing files for delivery via streaming, you should:
- Encode at a data rate that’s comfortably less than the bandwidth of most target viewers.
- Determine if there are any streaming server specific requirements for these files.
- Encode using CBR data rate control.
Question: What are key frames in video and why we should use it?
A key frame in animation and filmmaking is a drawing that defines the starting and ending points of any smooth transition. Frames are filled with in-betweens of two key frames. In video compression, a key frame, also known as an Intra Frame, is a frame in which a complete image is stored in the data stream.
In video compression, only changes that occur from one frame to the next are stored in the data stream, in order to greatly reduce the amount of information that must be stored. This technique capitalizes on the fact that most video sources have only small changes in the image from one frame to the next. Whenever a drastic change to the image occurs, such as when switching from one camera shot to another, or at a scene change, a key frame must be created.
Because video compression only stores incremental changes between frames (except for key frames), it is not possible to fast forward or rewind to any arbitrary spot in the video stream. That is because the data for a given frame only represents how that frame was different from the preceding frame. For that reason it is beneficial to include key frames at arbitrary intervals while encoding video.
For example, a key frame may be output once for every 10 seconds of video, which allows seeking within the video stream at a minimum granularity of 10 seconds. The downside is that the resulting video stream will be larger in size, because many key frames are added that were not necessary for the visual representation of the frame.
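The 10-second example above can be turned into a quick calculation. This sketch assumes key frames at a fixed interval, counting the mandatory one at the start of the stream:

```python
def keyframe_count(duration_seconds, keyframe_interval_seconds):
    """Number of key (intra) frames at a fixed interval, including
    the complete frame at the very start of the stream."""
    return duration_seconds // keyframe_interval_seconds + 1

# A 5-minute (300-second) video with a key frame every 10 seconds:
print(keyframe_count(300, 10))  # 31
```

Halving the interval doubles the seekable positions but also roughly doubles the number of large, fully stored frames, which is the size trade-off the text describes.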