Discussion in 'Et Cetera, Et Cetera' started by B_Marius567, Nov 30, 2008.
Why is Blu-ray 1080p only 24fps when 1080i is 60fps?
24fps seems too slow!!!
I believe most films are shot at 24fps.
1080P has to do with the lines of vertical resolution not the frames per second.
1080i frames are half-frames ( interlaced )
1080p frames are full frames ( progressive ) :smile:
and they have the same resolution of 1920 x 1080 :smile:
Frame rates in film and television
There are three main frame rate standards in the TV and movie-making business.
60i (actually 59.94, or 60 x 1000/1001 to be more precise; 60 interlaced fields = 29.97 frames) is the standard video field rate per second that has been used for NTSC television since 1941, whether from a broadcast signal, rented DVD, or home camcorder. (When NTSC color was introduced, the older rate of 60 fields per second was reduced by a factor of 1000/1001 to avoid interference between the chroma subcarrier and the broadcast sound carrier.)
50i (50 interlaced fields = 25 frames) is the standard video field rate per second for PAL and SECAM television.
30p, or 30-frame progressive, is a noninterlaced format and produces video at 30 frames per second. Progressive (noninterlaced) scanning mimics a film camera's frame-by-frame image capture and gives clarity for high speed subjects and a cinematic-like appearance. Shooting in 30p mode offers video with no interlace artifacts. The widescreen film process Todd-AO used this frame rate in 1954–1956. For video, this frame rate originated in the 1940s for recording current events or recording shows that were not shot in 24p film. However, this frame rate became popular in the 1980s, with the popularity of music videos.
The 24p frame rate is also a noninterlaced format, and is now widely adopted by those planning on transferring a video signal to film. But film- and video-makers turn to 24p for the "cine"-look even if their productions are not going to be transferred to film, simply because of the "look" of the frame rate. When transferred to NTSC television, the rate is effectively slowed to 23.976 fps, and when transferred to PAL or SECAM it is sped up to 25 fps. 35 mm movie cameras use a standard exposure rate of 24 frames per second, though many cameras offer rates of 23.976 fps for NTSC television and 25 fps for PAL/SECAM. The 24 fps rate became the de facto standard for sound motion pictures in the mid-1920s.
25p is a video format which runs twenty-five progressive frames per second. This framerate is derived from the PAL television standard of 50i (or 50 interlaced fields per second). While 25p captures only half the motion that normal 50i PAL registers, it yields a higher vertical resolution on moving subjects. It is also better suited to progressive-scan output (e.g. on LCD displays, computer monitors and projectors) because the interlacing is absent. Like 24p, 25p is often used to achieve a "cine" look.
50p and 60p are progressive formats used in high-end HDTV systems. While they are not technically part of the ATSC or DVB broadcast standards, they are rapidly gaining ground in the areas of set-top boxes and video recordings.
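The arithmetic behind those rates is easy to check. A quick sketch (the 1000/1001 factor and the film-transfer conversions are the ones described above):

```python
# NTSC rates: the original 60 fields/s was slowed by 1000/1001
# when color was introduced, to avoid carrier interference.
ntsc_field_rate = 60 * 1000 / 1001     # ~59.94 fields per second
ntsc_frame_rate = ntsc_field_rate / 2  # ~29.97 frames per second

# 24p film transferred to television:
film_on_ntsc = 24 * 1000 / 1001        # slowed to ~23.976 fps for NTSC
film_on_pal = 25                       # simply run ~4% faster for PAL/SECAM

print(round(ntsc_field_rate, 2))   # 59.94
print(round(ntsc_frame_rate, 2))   # 29.97
print(round(film_on_ntsc, 3))      # 23.976
```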
Actually, the 1080 refers to the vertical resolution (scan lines), not the horizontal.
Full-HD screens typically have a native resolution of 1920x1080 pixels.
I may buy a Samsung Blu-ray player soon; they are going down in price.
I have to rent the Blu-ray movies from Netflix. I am not paying $26.00 a movie!!!!
actually I thought standard frame rate was 29.97 or 30 frames per second
That's the standard NTSC framerate for broadcast signals here in the USA.
16 and 35mm cameras have been shooting film at 24 fps since the 1920s. Digital cameras have continued to use this 24p standard in the modern era.
I'm guessing that the digital video recorders from a few years back used less than 24 fps, because when you moved the camera you got jerky motion, unlike older camcorders that used tape.
We're talking about professional movie cameras of the sort used to film the commercial features that you might find made into the discs you would play in a Blu-Ray device.
i stands for INTERLACED. Interlaced video means the picture only displays HALF of its lines at a time.
Just like NTSC Video... the frame rate is divided into FIELDS... First the set displays EVERY OTHER LINE ... the odd numbered lines of resolution.
Then the set displays the Even numbered lines of resolution.
That is, 1080i takes each frame of actual video and shows you only 540 lines at a time... first the odd lines, then the even lines, each for 1/60th of a second. ( approx. )
Because of the persistence of vision, you are not aware of the fact that the image you perceive is actually only 30 frames of actual motion, split into 60 fields.
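That odd/even split can be sketched in a few lines of Python. Here a hypothetical 1080-line frame is represented just by its line indices:

```python
# A 1080-line frame, represented here only by its line indices.
frame = list(range(1080))

# Interlacing transmits the frame as two separate 540-line fields.
odd_field = frame[0::2]    # every other line, starting from the first
even_field = frame[1::2]   # the remaining alternate lines

print(len(odd_field), len(even_field))  # 540 540
```

Each field arrives 1/60th of a second apart, so a full frame still takes 1/30th of a second to build up.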
1080p: the p stands for progressive... that means the display shows full 1080-line frames, one after the other (24 of them per second for Blu-ray movies).
24 frames per second is the speed of movie film, and most HD DVDs of movies do not have more than 24 frames per second of actual information.
NTSC television is 29.97 fps ( I think)
Newer Samsung sets and others have image interpolation processors in which the set actually displays 120 frames per second. It simply INVENTS the intermediate frames that are not in the actual signal to make fast motion smoother.
The new 120-frame sets are also capable of displaying REAL 3D when used with shutter glasses and gaming systems supporting shutter glasses, in which case the set sends 60 full frames per second to each eye.
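The "invented" in-between frames can be illustrated with a toy example. Real 120Hz sets use motion estimation, not a simple blend, but a naive per-pixel average shows the idea of synthesizing a frame that was never in the signal:

```python
# Two consecutive frames, each a flat list of pixel brightness values
# (made-up numbers, purely for illustration).
frame_a = [0, 100, 200]
frame_b = [50, 150, 250]

# A naive "invented" in-between frame: the per-pixel average.
# Real sets estimate motion vectors instead of blending like this.
interpolated = [(a + b) / 2 for a, b in zip(frame_a, frame_b)]

print(interpolated)  # [25.0, 125.0, 225.0]
```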
In other words... even television actually displays less than 30 fps.
It just shows those 30 frames as 60 fields, half the lines at a time.
1080i can be displayed satisfactorily on sets with slower LCD switching times since the LCDs have twice as much time to switch to the next image. ( while the odd lines are switching, the even lines are still on... and vice versa. )
For 1080p you want a set with switching times under 6ms, since the entire picture has to switch between each frame... slower switching times might create a noticeable flicker.
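A rough back-of-the-envelope shows the timing budget behind that 6ms figure, assuming a 60 frames-per-second signal:

```python
# At 60 full frames per second, each frame is on screen for:
frame_period_ms = 1000 / 60          # ~16.67 ms

# A 6 ms panel (the response time discussed above) spends this
# fraction of each frame period in transition between images:
switch_ms = 6
fraction_in_transition = switch_ms / frame_period_ms

print(round(frame_period_ms, 2))         # 16.67
print(round(fraction_in_transition, 2))  # 0.36
```

So even a "fast" 6ms panel spends over a third of each 60p frame period switching, which is why slower panels start to visibly smear or flicker.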
If I understand properly:
720p = 60 FPS
1080i/p = 30 FPS
720p is better for gaming, while 1080p is best for computers, movies, and other things that don't rely on response time nearly as much.
No, you don't understand it properly.
720 and 1080 are resolution specifications. They refer only to the number of scan lines contained in a video signal; they have nothing to do with the framerate.
DVD movies (including HD and Blu-Ray) are produced by cameras that record 24 frames per second. Gaming consoles and computer systems can deliver video signals with considerably higher framerates, depending on the graphical horsepower of the system. For example, my gaming rig will run CounterStrike at 1280x1024 resolution (very close to full HD) at over 200 frames per second, and it will drive the game at 1024x768 resolution at over 300fps...but it's all pointless, because my LCD monitor can only run at 60Hz. So any framerate greater than 60 is wasted.
The point here is that the maximum resolution rating of a TV or monitor has nothing to do with a video signal's framerate. They're two entirely independent concepts.
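The point about wasted frames can be put in one line of code. The numbers here are the hypothetical ones from the post above:

```python
def displayed_fps(render_fps, refresh_hz):
    # A monitor can never show more frames than its refresh rate;
    # anything the GPU renders beyond that is simply never displayed.
    return min(render_fps, refresh_hz)

print(displayed_fps(300, 60))  # 60 -- the other 240 rendered frames are wasted
print(displayed_fps(45, 60))   # 45 -- here the GPU is the bottleneck instead
```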
Where frame rate matters is in OTHER applications.
200 fps in some games might allow for slow motion replay, for example. Also- higher frame rates mean your system is not going to be taxed to produce sound effects, music backgrounds etc.
For full-motion true 3D display, your system must be capable of a full 60 frames per second for TWO separate views of the scene ( if the system uses interlaced fields ), one for each eye.
Shutter glasses plug into your gaming system. Each lens is able to turn opaque for 1/120th of a second, alternating between the left eye and right eye.
The TV displays a field for the right eye- while the left lens is shuttered...and then displays a field for the left eye while the right eye is shuttered.
This creates a true 3D visual input-
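The alternation schedule can be sketched as follows. The even-slots-left convention here is chosen purely for illustration; real systems synchronize the glasses via an emitter:

```python
# Which eye sees each 1/120 s interval under alternate-frame 3D.
def eye_for_slot(slot):
    # Convention assumed for this sketch: even slots -> left eye,
    # odd slots -> right eye. While one eye's lens is open, the
    # other lens is shuttered opaque.
    return "left" if slot % 2 == 0 else "right"

schedule = [eye_for_slot(s) for s in range(4)]
print(schedule)  # ['left', 'right', 'left', 'right']
```

Over a full second, each eye gets 60 of the 120 slots, which matches the 60 full frames per eye described above.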
1080i only displays 540 lines per field, two fields per frame, for smooth motion at 30 FPS (the approximate TV standard frame rate). By interlacing frames, the LCDs can switch fairly slowly and still produce seamless motion... however, freeze-framing might only display one field... half the full resolution.
1080p produces a full 1080 lines for every frame. It requires much faster LCD switching to look smooth, and it offers full resolution on freeze frame.
720p is a progressive ( not interlaced ) signal with 720 lines of resolution.
Some people prefer its appearance because it is halfway between full high-definition video and standard video... thus higher-definition signals get DOWNSAMPLED to display and lower-resolution signals get UPSAMPLED to display.
Upsampling the 525 lines of NTSC standard video to 1080 can create noticeable artifacting.
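One way to see where those artifacts come from: NTSC's 525 total lines carry roughly 480 visible lines, and 480 into 1080 is a non-integer factor of 2.25. A nearest-neighbor scaler (a toy sketch; real scalers interpolate, but the uneven line mapping is the same) has to repeat some source lines more often than others:

```python
from collections import Counter

src_lines = 480                  # visible lines of standard-def video
dst_lines = 1080
scale = dst_lines / src_lines    # 2.25 -- not a whole number

# Nearest-neighbor mapping: which source line feeds each output line.
mapping = [int(y / scale) for y in range(dst_lines)]

# Count how many output lines each source line is stretched across.
repeats = Counter(mapping)
print(sorted(set(repeats.values())))  # [2, 3] -- some lines doubled, some tripled
```

That uneven 2x/3x repetition is one source of the visible artifacting on upscaled standard-definition material.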
It takes less processing power, and much cheaper video screens, to display 720 lines per frame than it does 1080 lines.
But, really... 1080 is the new standard for hi definition video... might as well get a set that displays all 1080.
And 1080p is widely regarded as the best LOOKING and best performing.