
Stuart Keasley

Member Since 23 Aug 2008

#373627 Red filters for GH4

Posted by Stuart Keasley on 15 May 2016 - 01:04 AM

As interceptor says, you're not comparing like for like. HD100 uses interframe compression whereas HD200 uses intraframe compression. Both are variable bit rate, so the actual end bit rate returned will depend on the subject and the type of compression applied. A low-lit area with lots of dead space and no movement isn't really going to put the compression to the test.

Shift your test to a well-lit subject with no dead space and lots of movement, and you'll see something different.

In terms of your test, you're pixel-peeping on a single image, so it's not really a conclusive or comprehensive test when comparing the quality of a moving image across different compression methods and resolutions ;)

#372852 Red filters for GH4

Posted by Stuart Keasley on 21 April 2016 - 03:47 AM

I wrote to Alex and he replied that he isn't sure any of his rear lens filters would fit the Pana 7-14mm. Let's see if someone comes up with a solution...



How about this?



#372551 Shutter Speed thoughts

Posted by Stuart Keasley on 13 April 2016 - 02:09 AM




Yes! That was exactly what I was talking about... Why should we stick to 180º as a general rule, bearing in mind our light problems shooting underwater? I'm sure below -20 m you will never use ND filters at all, and you'll be happy if you chose a fast lens you can open up as much as possible.
Why should we consider 180º our way to achieve a "normal" look when things down there behave in different ways?



"Normal" means image playback on a screen will look the same to us as it would if we were sat there underwater with the scene in front of us, ie our eyes are seeing things in a way that our brain expects. It doesn't matter whether you're looking at an underwater scene, your kids playing in the park or a mountain vista with clouds scatting across the sky, your brain still processes that image in the same way. The more you deviate away from 180 degrees, the more the image will have the potential to look odd to your brain, you will add either a smeary effect or a staccato effect, depending on which way you push the shutter... which is why I tend to be stuck to 180 underwater, because I tend to be filming natural environments and behaviour, I don't want to add an effect to the look.

As for trying to gain that extra bit of light by increasing the shutter, I think you need to put that into perspective. If you're shooting for PAL land, you'll be running at 25p, so at 180 degrees your shutter is going to be 1/50th. By definition, the maximum you can increase that to is a 360 degree shutter, i.e. 1/25th of a second, which is an increase of just one stop... if you're struggling for light that much, then you're already in trouble. Get some video lights, and/or shoot that scene when conditions are more favourable...
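To put numbers on that, here's a quick sketch of the standard shutter-angle arithmetic (my own illustration, nothing camera-specific):

```python
# Sketch of the shutter-angle arithmetic: exposure time = (angle / 360) / frame rate.
def exposure_time(frame_rate, shutter_angle):
    """Return exposure time in seconds for a given frame rate and shutter angle."""
    return (shutter_angle / 360.0) / frame_rate

print(exposure_time(25, 180))  # 0.02 s, i.e. 1/50th at 25p with a 180 degree shutter
print(exposure_time(25, 360))  # 0.04 s, i.e. 1/25th - double the exposure, just one stop more light
```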
But, with all of that said, the 180 degree shutter is no different from all the other rules in photography: they're there to guide you, give you the benefit of the wealth of experience that's out there, and give you an idea of what you may expect if you abide by them or ignore them. But at the end of the day, it's your choice. You're the creative that's pointing the camera, so play around and see what works for you, and see what parameters you are personally happy to work with. That way, if you are ever faced with a low light scene that you know you're unlikely to get another chance at, and you need to get some more light from somewhere, you'll know whether to flick the gain on or open up the shutter.

#371023 The ultimate GH4 video rig! What is it...?

Posted by Stuart Keasley on 02 March 2016 - 06:15 AM

Not sure Stuart! LOL


From my (mis)understanding I thought the BBC et al (via the EBU standards) have certain minimum requirements for source footage depending on the project, and within the finalised project no more than 25% of the footage could be captured with cameras which do not meet the required standards due to the nature of the shot (environmental, drone, on-car etc.). The final project would then have to be delivered in the required format.




Yep, that's about the size of it... however the document you're referring to here:



covers the technical standards required for delivering the footage from the edit suite as a completed product to the BBC (and others) when ready for transmission/broadcast, i.e. this is what the edit suite needs to render to once they've completed all their clever grading etc.


If you look at the end of Section 1.1 on page 6, the document directs you to standard EBU R118 "to assess the suitability of cameras for HD use"; this is the document that outlines the standards required for cameras.


(You can download a copy of EBU R118 from here : https://tech.ebu.ch/docs/r/r118.pdf)


If you have a look through there, you can see that the GH4 spec complies with HD Tier SP (for specialist cameras), and would also comply with other HD tiers in terms of image quality, although it does fall down in other areas (e.g. timecode/genlock, audio etc.).

In terms of your original comment, a 4:2:0 codec is acceptable, depending on the bit rate and compression method used (have a look at Table 1).



So coming back to my original point, GH4 internally recorded footage doesn't meet their standards, but could be used for broadcast - but only within the 25% allowance in the final project?


Happy to be corrected, as I don't want to be wrong myself nor pass on duff info!


So anything shot on a GH4 would be considered proper, broadcast quality HD, and would not impact on the 25% allowance for other stuff... which is just as well, because some of the stuff we've done on GH4s would have blown that allowance in more than a few shows already :)

#347501 A Circular Experiment

Posted by Stuart Keasley on 09 May 2014 - 11:15 AM

If you use a lens whose image circle is too small for the sensor, you'll get a complete circle...

#347405 Video Recording Formats AVCHD vs AVCHD Progressive

Posted by Stuart Keasley on 07 May 2014 - 08:25 AM

Your subjects don't have a great deal of moving detail, so the bit rate isn't going to be pushed. Try throwing a fairly complex image in front of the camera, e.g. backlit water droplets hitting a matt black surface, where you're capturing loads of movement and the camera's trying to cope with a full range of light and exposure; you may see 50p struggle a bit when you look at each individual frame.


If you take a 50p clip and render it out as 25p, your video file will have half the number of frames. This will be done either through interpolation of adjacent frames (which can result in a fairly messy output) or by dropping frames; it depends on how you've set your NLE up.
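As a trivial illustration of the frame-dropping route (a sketch only, your NLE will have its own settings for how it handles this):

```python
# Sketch of the simpler of the two routes: dropping every other frame to turn 50p into 25p.
# Interpolation/blending is what an NLE does instead when it can't simply drop frames.
frames_50p = [f"frame_{i}" for i in range(100)]  # stand-in for 100 frames of 50p footage
frames_25p = frames_50p[::2]                     # keep every second frame

print(len(frames_50p), len(frames_25p))          # 100 50 -> half the number of frames
```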


If you render a clip out, the bit rate will be defined by the rendering process, the codec used and the parameters chosen. Throw your AVCHD clip through Premiere and then output as DNxHD, and you'll end up with a 180 Mbps data stream. However, you're not going to increase quality... you can't add data in through the rendering process. As for your intermediate codec, no surprise there at all: looking at each frame, the 50p footage will have twice as many frames as the 25p.


AVCHD is an 8-bit format and highly compressed. It does a great job, which is why it has become so popular, however it does have its limitations. 8-bit means the image will struggle in the lows and the highlights, as priority is given to the mid tones where the majority of the important information is assumed to be. Re the compression, as with JPEG vs raw, if you get it right straight out of the box and don't need to make adjustments, then you'll end up with a pleasing image; however, as soon as you start to try to push the image with any form of grade/colour adjustment, you'll very quickly find yourself hitting problems. But in the general case, for an untouched clip, it's unlikely that you're going to notice the difference between 25p and 50p.


In terms of bit rate, put it into perspective. BBC HD is broadcast at around 10 Mbps, and is viewed on a screen much larger than your PC monitor. You'd be hard pressed to notice; picture quality is still good... but you can be assured that the initial delivery was way higher than that, in order to give them the latitude to pull the image around and get the look they wanted (minimum delivery for us for broadcast is DNxHD at 180 Mbps or ProRes at 185).
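To put those bit rates into storage terms, here's my own back-of-envelope sketch (nothing to do with the delivery specs themselves):

```python
# Rough back-of-envelope: how much space an hour of footage takes at a given bit rate.
def gigabytes_per_hour(bit_rate_mbps):
    """Convert a video bit rate in Mbps to approximate GB per hour of footage."""
    return bit_rate_mbps / 8 * 3600 / 1000  # Mbps -> MB/s -> MB per hour -> GB per hour

print(round(gigabytes_per_hour(10), 1))   # ~4.5 GB/hour at a broadcast-ish 10 Mbps
print(round(gigabytes_per_hour(180), 1))  # ~81 GB/hour at DNxHD 180 Mbps delivery
```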


If you are shooting 50p and then retaining the same frame rate in your end delivery, there will be a minimal increase in quality compared to 25p. However, if you bring the 50p down to 25p, you will be faced with either dropping frames and losing half-ish of the data, or interpolating frames and quite possibly getting a smeary image as a result.

I would suggest that you approach each shot individually and choose the appropriate settings. If it warrants slowing down in post, shoot 50p. If it doesn't, shoot 25p.

#347002 Shooting video & stills on the same dive with a single cam, does it work?

Posted by Stuart Keasley on 29 April 2014 - 12:20 PM

Fisheye would look ugly in video; however, wide angle and macro lens choices would be very similar, so no real issue there...


However, the mindset would be a struggle. With video, you need to be thinking about getting the detail, ins and outs, cutaways, GVs, what you've done and what you need to do to build the sequence and tell the story. Trying to mix stills into the middle, if you're as feather-brained as me, you'll forget where you are with the video, and you won't have sufficient time to concentrate on the stills to get good results there either.


The lighting requirements are also very different, as are the settings on the camera... so the more you swap between the two, the more time you'd waste resetting and settling in.


So, I'd say keep it simple. If you're doing a video dive, focus on that; if stills, then think stills... and even more than that, dedicate each dive to video wide or video macro, stills wide or stills macro.

#346677 Orcalight Seawolf-22000 lumens in the Maldives

Posted by Stuart Keasley on 23 April 2014 - 11:18 PM

Hi Fergus
I agree the Orca SeaWolf light is a fantastic LED light. Great even light and colour rendering. Clever way to transport, and use of high capacity rechargeable batteries. One thing I don't like is the opening and closing of the canister. Two hex bolts, and the only way of opening it is to use a blunt flathead screwdriver next to the bolts. I am going to suggest Orca provide some sort of snap latches or even a thumbscrew. Great light, and output is 22K lumens as specified.

OrcaLight have already redesigned the method of opening and securing the lid; have a read of the following for details:


#346299 Subal S7Q Housing for the Odyssey 7Q

Posted by Stuart Keasley on 15 April 2014 - 10:52 PM

Looks like my prayers have been answered


My interest is in pairing it up with the FS700; however, Convergent Design have also said the 7Q will support the Sony A7S, and presumably therefore the GH4.

#346097 BMPCC underwater filming setting

Posted by Stuart Keasley on 12 April 2014 - 11:51 PM

That device is designed to work on a flat surface. If you go diving on rocks you won't have that. And if you go on sand, you can have 4, 5, 7 or 20 legs, but stability comes out of balance, not from planting the legs in the sand as if they were an anchor. Even sand contains living organisms, and you should make sure your impact on the environment is as little as possible. That's my perspective at least. Results matter, but I'm sure I'm not demanding more than needed out of the seabed.

From the Edelkrone web site:

"Works great on uneven surfaces, adapts to any surface"

A quadpod has no issue with an uneven surface. It has adjustable legs.

I'm with you, in that I'd rather work with a tripod. But it's a preference thing: Edmond has gone a different route, he's happy with it, and he's getting good results.

#346084 BMPCC underwater filming setting

Posted by Stuart Keasley on 12 April 2014 - 03:06 PM

Quadpods are more stable and stronger but can take longer to set up.

Tripods are still stable, and are a lot quicker to work with.

You've got some stunning results Edmond, which is really all that matters ;)

#345812 Physically small intervalometer

Posted by Stuart Keasley on 07 April 2014 - 09:53 PM

I'd go with Magic Lantern. Yes, there are risks, but has anyone ever heard of anything going wrong (when using just the timelapse)?

And yes, I've used it on a 5D Mk II.

#345774 Malta

Posted by Stuart Keasley on 07 April 2014 - 04:52 AM

Thank you Stuart, yes I have been told the same thing (enigmatic) about the Maori, but due to the problems with organisation (50% my fault) I couldn't do it. I hope I can dive the places you recommend one day.

Just for a tease, filmed quite a while ago now, but here's the Southwold


#345578 Vacuum leak detector question

Posted by Stuart Keasley on 03 April 2014 - 07:59 AM

Obviously this needs some extensive testing.  I suggest we send a dedicated and unbiased researcher (that would be me) to the Caribbean for a grueling two-week regimen of test dives.  I will thoroughly document the testing process and advise you of the results upon my return home.  In regard to funding this research, please send a PM and I will send you my paypal address. 
Now about this Scotch Mist; does adding lemon peel to a nice whisky really create an enjoyable beverage?  I have my doubts but could probably experiment with that as well during the above two-week research project.  No additional funding would be required for this.  Plus, I could watch for condensation on my glass as I go from an air-conditioned room to the patio and supplement the fogging data.

Can't help with funding, but I'm very happy to supply a pressure cooker (possibly with a head to put in it to check the results ;)), along with a small tub of water, some ice cubes and a copy of the Nuffield A-level Physics text book (1987 edition), all obviously essential elements for your tests :D

#345562 Vacuum leak detector question

Posted by Stuart Keasley on 03 April 2014 - 12:49 AM

Aluminum conducts heat much better than glass or acrylic, so in general terms, if any condensation occurs it will be on the walls of the housing and not on the port or the LCD.


Obviously the many people who experienced condensation on their glass port in an aluminium housing were just seeing scotch mist.






