Last week, I decided that it would be interesting to check out the NAB Show, an annual tradeshow in Las Vegas put on by the National Association of Broadcasters.
As a NAB Show newbie, I wasn’t sure what to expect; I knew that it would be different from the dive shows I typically attend, but I wasn’t prepared for just how different it ended up being. Everything was shiny, slick, and professional, and there were quite a few people wearing suits. I didn’t see any flip flops or swimsuits, and no one was serving rum from beneath fake tiki huts. At the SCUBA diving industry shows I’ve attended, aisles are often devoid of people, and bored-looking booth workers sit around chatting idly or twiddling their thumbs. NAB was absolutely packed. I could barely make my way through the aisles, and people were jammed right up against each other in the more popular booths.
I won’t bother to talk about specific products here; there are no doubt plenty of NAB show reports on other websites.
Although there were basically no products on display that were designed especially for underwater use, I combed the aisles looking for equipment that might be interesting for underwater videographers and filmmakers. I saw three themes at this year’s NAB:
- Digital SLRs do awesome video. Let’s build stuff to support it.
- SLR sensors are great. We’ll put them into video cameras.
- 3D is here, whether you’re ready or not.
Like many people out there, I’ve been really excited by the video I’ve been getting from my Canon 5D Mark II and 7D SLRs. The show featured dozens of booths with accessories that make shooting video with an SLR bearable and semi-ergonomic. The first few feature-length movies shot with the Canon 5D Mark II are starting to appear, and the final episode of House was shot with one. But the video guys have decided that they’ve had enough, and they’re starting to put big sensors into their own cameras, which means that good video ergonomics will soon be paired with big, clean sensors. I’m happy to have SLRs drive competition and push features into dedicated video cameras. We’ll see features converge from both sides, and prices will plummet; after all, you can already shoot beautiful 1080p HD video on a $700 Canon SLR.
2010 is, beyond a doubt, the year of 3D. There were probably hundreds of booths featuring 3D equipment: cameras, camera accessories, camera support, 3D displays, software, and more. I noted with amusement that the show guide featured a small “3D pavilion.” The label was a joke, really, because the entire show was one huge 3D pavilion.
People looking cool in their Panasonic 3D glasses
Shooting 3D on land seems relatively straightforward: you stick two cameras together and adjust the inter-ocular distance (IO) and convergence. If the required IO is smaller than the diameter of your lenses, you switch to a beam-splitter rig, with the cameras mounted orthogonally: one shoots through a 50% mirror, and the other shoots the reflected image. Post-processing, however, is hard and mysterious. The high-end production folks seem to have it figured out, and there was quite a lot of activity at the large booths, with presentations from the few special-effects guys who have real 3D experience. Prosumer and consumer 3D, however, seems like a big mess. Where are the standards? If I were to wake up tomorrow with 10 hours of fantastic 3D footage, how would I deliver it to a stock house? How would I edit it? How would I display it? What sort of signal would carry it to a 3D display? How would I distribute it?
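For intuition about how IO and convergence interact, here is a back-of-the-envelope sketch of the toe-in angle needed to converge a two-camera rig on a subject. This is not from any product at the show; the symmetric toe-in geometry and the human-like 65 mm IO are my own assumptions:

```python
import math

def convergence_angle_deg(io_mm: float, subject_mm: float) -> float:
    """Total toe-in angle (degrees) for two cameras separated by io_mm,
    both converged on a subject at subject_mm, assuming a symmetric rig."""
    return math.degrees(2 * math.atan((io_mm / 2) / subject_mm))

# Human-like IO of 65 mm, subject 2 m away: just under 2 degrees of toe-in.
angle = convergence_angle_deg(65, 2000)
```

The angle grows quickly as the subject gets closer, which hints at why close-up work becomes the hard case.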
Many of the dual-camera setups on display record two separate video streams. Some magic happens in various software or hardware products designed to merge the two streams into one, and the result seemed to be delivered to what looked like home-brewed 3D monitors via dual HDMI or HD-SDI connections. Even the home-brewed monitors were expensive (thousands of dollars), even though they were really just two LCDs mounted orthogonally with a 50% mirror between them. The whole industry seems so eager to get into 3D that companies are just hacking things together, pricing them high, and hoping someone will buy.
3D has been proven to be a big money-maker in the movie industry, and the momentum of that industry will push it right into the home. It may take time for 3D televisions to become ubiquitous, as there seem to be many comfort and content issues to contend with. Will Joe Diver really come home from work, put on a pair of 3D glasses, and sit in front of the television? And if so, what 3D content will be available? It seems hard to believe that people will be walking around their homes with dorky, uncomfortable 3D glasses on their faces, and the displays that do not require glasses are a long way from being perfected.
Active 3D glasses
3D does, however, seem to be perfectly natural for computing and hand-held applications. I didn’t see any demonstrations of 3D at a smaller scale, but it must be coming. Both computers and hand-held devices are single-viewer platforms that feature generated content (as opposed to filmed content). Knowing that there is only going to be one person in front of a display makes it easy to include lenticular displays that do not require 3D glasses, and generated content is easy to produce in 3D. It’s pretty clear to me that it’s not going to take very long for every handheld, gaming, and computing platform to go completely 3D.
The underwater imaging industry typically moves slowly, and it is frustrating to wait for it to catch up with what is going on elsewhere. I didn’t see many underwater housings at the show, but I’m told that Amphibico was showing a housing for a Panasonic POV (point of view) system (not 3D). I also met up with John Ellerbrock of Gates Underwater Housings, who had a prototype housing for the Panasonic POV available (upon request). John and I had a nice chat about where we thought 3D would go for underwater videographers. He has some interesting ideas, and I’m looking forward to seeing what Gates comes up with in the coming years.
If you are an underwater housing manufacturer, I urge you to start looking at manufacturing 3D housings. We all need to start experimenting with underwater 3D at a consumer level; it is clearly the future.
There are some really hard problems to solve for underwater folks thinking of going 3D. Howard Hall has repeatedly mentioned in interviews that the 3D IMAX system he uses works best on certain kinds of subjects, with clear constraints on size and distance. On land it is easy to adjust convergence, but it is quite hard to do in underwater setups: flat ports need to stay perpendicular to the lens axis, which makes toeing-in two cameras hard unless they are housed separately, and 3D is also sensitive to inter-ocular distance. Separate housings can push the cameras too far apart to shoot subjects that are close. Dome ports are an issue because they distort images in ways that might not make for convincing 3D. Macro is problematic because it requires a tiny IO, and a mirror-based 3D rig would be extremely bulky underwater.
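To see why macro forces a tiny IO, the same toe-in geometry can be inverted: given a convergence-angle budget, the maximum camera separation falls linearly with subject distance. This is a rough sketch under my own assumptions (a symmetric toe-in rig and an arbitrary 1.5-degree comfort budget), not a rig from the show:

```python
import math

def max_io_mm(subject_mm: float, angle_budget_deg: float = 1.5) -> float:
    """Largest inter-ocular distance (mm) that keeps the total toe-in
    angle within angle_budget_deg for a subject at subject_mm."""
    return 2 * subject_mm * math.tan(math.radians(angle_budget_deg) / 2)

# A macro subject 100 mm away allows only a few millimeters of separation,
# far less than the width of two side-by-side housed cameras.
io = max_io_mm(100)
```

At 2 m the same budget allows roughly 52 mm of separation, which is why side-by-side rigs are fine for mid-distance subjects but macro pushes you toward a mirror rig.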
But who knows? It’s going to take exhaustive testing to see what does and doesn’t work underwater. If you like to experiment and tinker, please make your own underwater 3D housing and start shooting. Think of this as a call to arms!
Wetpixel now has a new forum called Underwater 3D where you all can discuss your thoughts about shooting 3D underwater. We’ll see you online!
Special thanks to Mary Lynn Price of DiveFilmHD, who took time out of her workshops to chat and buy me lunch.