
Bitrate change with FPS?

When shooting with the X5, does the bitrate change with the frame rate?
If it stays constant, wouldn't it be wiser to shoot in 24p rather than 30p? More data per frame, right?
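Rough arithmetic behind that thought, assuming the bitrate really were a fixed cap (the 60 Mbps figure is just an illustration, not a measured number):

```python
# Average bit budget per frame at a fixed overall bitrate.
# 60 Mbps is an illustrative cap, not a measured value.
BITRATE_BPS = 60_000_000

for fps in (24, 30):
    bits_per_frame = BITRATE_BPS / fps
    print(f"{fps} fps: {bits_per_frame / 1e6:.2f} Mbit (~{bits_per_frame / 8 / 1024:.0f} KiB) per frame on average")
```

That works out to roughly 25% more budget per frame at 24p, if the encoder really spread its bits evenly per frame.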
 
It doesn't change.
If you capture frames less often, there is more change to encode between two consecutive frames, so you likely wouldn't gain anything. Most cameras that use temporal (inter-frame) codecs work that way, and I've never seen a direct relationship between frame rate and quality at a given bitrate documented anywhere.
 
Yup, that's the way temporal codecs work. Very little change between frames or a slow-moving subject means less data to encode and store.
Also, don't forget the X5 uses VBR, not CBR, so the data rate varies between I-frames in any case.
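If you want to see that variation for yourself, ffprobe can dump the picture type and encoded size of every frame. A minimal sketch, assuming ffprobe is on your PATH, "clip.mp4" is a placeholder filename, and your ffmpeg build exposes the pkt_size field:

```python
import json
import subprocess

# Dump the picture type (I/P/B) and encoded size of every video frame
# with ffprobe, then summarise the average size per frame type.
cmd = [
    "ffprobe", "-v", "error", "-select_streams", "v:0",
    "-show_entries", "frame=pict_type,pkt_size",
    "-of", "json", "clip.mp4",  # placeholder filename
]
out = subprocess.run(cmd, capture_output=True, text=True, check=True).stdout
frames = json.loads(out)["frames"]

by_type = {}
for f in frames:
    by_type.setdefault(f.get("pict_type", "?"), []).append(int(f.get("pkt_size", 0)))

for pict_type, sizes in sorted(by_type.items()):
    print(f"{pict_type}-frames: {len(sizes)}, average {sum(sizes) / len(sizes) / 1024:.1f} KiB")
```

On inter-frame footage you'd expect the I-frames to be much larger on average than the P/B frames, with the per-frame sizes swinging with how much is moving in the scene.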
 
One consideration that might drive the decision between 24 and 30 fps is whether you prefer DCI 4K (4096x2160, a 17:9 aspect ratio) or UHD 4K (3840x2160, the 16:9 aspect ratio used for television broadcast). DCI 4K uses the full width of the sensor, which is more useful if you want a more cinema-like format (or even want to mask the top and bottom of the frame for a 2.35:1 aspect ratio).
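For reference, a quick sketch of the numbers behind the two modes (standard DCI and UHD frame sizes, nothing camera-specific):

```python
# Aspect-ratio arithmetic for the two 4K modes, plus how much frame
# height is left if DCI 4K is masked to a 2.35:1 "scope" frame.
modes = {"DCI 4K": (4096, 2160), "UHD 4K": (3840, 2160)}

for name, (w, h) in modes.items():
    print(f"{name}: {w}x{h}, aspect {w / h:.3f}:1")

w, h = modes["DCI 4K"]
scope_rows = round(w / 2.35)
print(f"DCI 4K masked to 2.35:1 keeps {scope_rows} of {h} rows ({scope_rows / h:.0%} of the frame height)")
```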

This doesn't change the fact that, as mentioned above, there is no change in bit-rate.
 
I ask because I filmed a scene yesterday at 4k 30fps and the trees look horrible. Complete fuzzy messes. It looks super low bitrate. Focus has been calibrated. Everything is in order. But it still looks like garbage.

I checked a similar shot I filmed when I first got the Inspire Pro. It's almost the exact same shot from the exact same location and the trees look great. Plenty of detail. The only thing I can tell is different is the frame rate. The good shot was filmed at 24fps.

So I'm wondering is it a frame rate and/or bitrate issue? Or is the newer firmware jacking up the image quality somehow?
 

What were your mode settings (D-Log, Cinelike, None) for both shots? Same or different? Were both shots exposed properly (same histogram curve)? Without knowing that, just the frame rate difference alone won't isolate why one looks better than another.
 

D-LOG for both and they were exposed properly.
I'm gonna shoot another test today at 24fps and 30fps and see if there's a difference. I'll report back.
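A quick way to check what average bitrate the camera actually wrote for each test clip, so the comparison isn't just eyeballing stills (ffprobe assumed on PATH; the filenames are placeholders):

```python
import subprocess

# Report the container-level average bitrate of each test clip via ffprobe.
# Filenames are placeholders for the 24 fps and 30 fps clips.
for clip in ("test_24p.mp4", "test_30p.mp4"):
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-show_entries", "format=bit_rate",
         "-of", "default=noprint_wrappers=1:nokey=1", clip],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    print(f"{clip}: {int(out) / 1e6:.1f} Mbps average")
```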
 
Don't forget the bitrate maxes out at a pathetic 60mbps so it is totally inadequate for decent 4K. It only takes the trees you were shooting to have a fraction more movement in them (breeze/wind) and the codec will have to work that much harder to capture/encode the movement within the I-frames. This is where the codec will break down.
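To put that 60 Mbps cap in perspective, here's the average bit budget per pixel it leaves, compared with 1080p at the same cap (a rough sketch; a real encoder spends its bits very unevenly, so these are only averages):

```python
# Average bits per pixel per frame at a 60 Mbps cap.
BITRATE_BPS = 60_000_000

for label, w, h, fps in (("UHD 4K @ 30 fps", 3840, 2160, 30),
                         ("1080p  @ 30 fps", 1920, 1080, 30)):
    bpp = BITRATE_BPS / (w * h * fps)
    print(f"{label}: {bpp:.2f} bits per pixel per frame")
```

Four times the pixels at the same cap means a quarter of the per-pixel budget, which is why detailed, constantly moving foliage is exactly where the encoder runs out of headroom first.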
 

That might be what I'm experiencing, because some of these shots look like garbage and I can't figure out why.
I ran some tests this afternoon and it looks like frame rate and frame size don't play that large a role.
Some of the shots I got would look good and others would look horrible.
Here are two still frames from some video. The first one shows a ton of macroblocking. The trees look incredibly pixelated. I circled some of the areas where you can really see it.
The second video from a similar vantage point is much clearer and retains much more detail.


[Attached stills: 1.jpg, 2.jpg]
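If anyone wants to pull their own comparison stills, ffmpeg can grab a single frame at a given timestamp; a minimal sketch (filenames and timestamp are placeholders):

```python
import subprocess

# Grab one PNG still from a clip at a given timestamp with ffmpeg,
# handy for A/B comparisons like the two frames above.
def grab_still(clip: str, timestamp: str, out_png: str) -> None:
    subprocess.run(
        ["ffmpeg", "-y", "-ss", timestamp, "-i", clip,
         "-frames:v", "1", out_png],
        check=True,
    )

grab_still("bad_30p.mp4", "00:00:05", "1.png")    # placeholder clip names
grab_still("good_24p.mp4", "00:00:05", "2.png")
```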
 
