Put that 3D TV to good use by making 3D home movies. By John Brandon
Since the early 1900s, filmmakers have tried to take advantage of our two eyes by making 3D movies. Now, thanks to an influx of relatively affordable 3D TVs, you can enjoy the extra dimension outside the cinema – with videos you’ve shot yourself.
For big-budget movies, cinematographers use two cameras linked together and separated with a beam splitter. But unless you have Hollywood-level money to throw around – the cost of a rig runs into the tens of thousands – you’re better off with a cheaper dual-lens camera, which can achieve the same effect. (There’s even a smartphone, the LG Thrill, that shoots 3D with stereoscopic lenses.) On these cameras, the lenses record two videos simultaneously.
I’m a video buff, but until recently I’d never dabbled in the third dimension. So for my first 3D project, I filmed a music video of my friend’s band, Outpouring, performing its song “Love Came Down” onstage. The constraints of a concert-style video suit the 3D format well: musicians onstage can move around plenty, but they’re confined to a discrete space in which viewers can explore visual depth.
To get the variety of shots I wanted, I filmed the band playing the song twice: first from a fixed position, and then roaming around the stage for close-ups. For both takes, I used the Sony Handycam 3D HDR-TD30V (costing around R10 000), which is small and easy to use and can capture both photos and videos in 3D.
Another option is the JVC GY-HMZ1U (also about R10 000), a similar camera. For either, you need a memory card big enough to store high-definition video; a 64 GB card can hold more than 2 hours.
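As a rough sanity check on that capacity figure, recording time is just card size divided by bitrate. The sketch below (a back-of-envelope calculation, assuming AVCHD 3D’s maximum bitrate of 28 Mb/s; real footage usually averages less) bears out the claim:

```python
# Rough recording-time estimate for a memory card.
# Assumption: AVCHD 3D peaks at about 28 Mb/s, so these
# figures are conservative; lower average bitrates fit more.

def recording_hours(card_gb: float, bitrate_mbps: float = 28.0) -> float:
    """Return approximate hours of video a card can hold."""
    card_bits = card_gb * 8e9               # gigabytes -> bits
    seconds = card_bits / (bitrate_mbps * 1e6)
    return seconds / 3600

print(round(recording_hours(64), 1))        # a 64 GB card: about 5 hours even at peak
```

Even at the worst-case bitrate, a 64 GB card comfortably exceeds the two hours quoted above.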
As in 2D filming, it’s best to have good, uniform lighting. But there are differences in how you should compose your shots. Avoid dramatic zoom-ins and quick camera movements, which can nauseate viewers, says Grant Anderson, executive director of Sony’s 3D Technology Centre. “Zoom feels unnatural because our eyes do not zoom.” Roger Guyett, visual effects supervisor for 2009’s 3D Star Trek, among other movies, suggests keeping the camera close to the action. “You get the most dimension out of anything really close to you,” he says. “You can drive by a mountain range and not really see any dimension in the mountains because they are so far away.”
When it comes to video-editing software that can handle 3D, the choices are fairly limited. For 3D recordings, the common AVCHD format combines the left and right feeds into a single side-by-side view. But side-by-side 3D is a relatively new standard, so most editing programs don’t yet support it.
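To make the side-by-side format concrete, here is a minimal illustration (in Python with NumPy, purely as a sketch; the frame dimensions and variable names are my own assumptions, not tied to any particular editor) of how a decoded side-by-side frame divides into its two eye views:

```python
import numpy as np

# A side-by-side 3D frame packs the left-eye and right-eye images
# into the two horizontal halves of a single frame, so each eye
# gets half the horizontal resolution.
h, w = 1080, 1920
frame = np.zeros((h, w, 3), dtype=np.uint8)   # stand-in for a decoded frame

left  = frame[:, : w // 2]    # left-eye view, 1080 x 960
right = frame[:, w // 2 :]    # right-eye view, 1080 x 960
```

An editor that “supports” side-by-side 3D is essentially doing this split (and the matching re-pack on export) for you.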
For the videos I shot with the Sony camera, I first tried Vegas Pro, the company’s video editor (which has a pretty hefty R6 000 price tag). It’s not your only option, though. I also worked with Apple’s Final Cut Pro X (R3 000) on some footage, and edited using Dashwood Stereo3D Toolbox LE, a R1 000 plugin that gives stereoscopic editing capabilities to Final Cut, as well as to other programs such as Adobe Premiere Pro.
For a more streamlined and affordable interface, try CyberLink PowerDirector 11 Ultra (R800). You sacrifice some speed and functionality for the price, but the 3D footage I got from it looked nearly as good as the results from Final Cut Pro and Vegas Pro, and I found the simplified workflow helpful: After making cuts in the editing view, I switched to the production view, which hides all editing and deals only with exporting the video.
In any of the programs, you’ll want to import your two video tracks in side-by-side mode; delete audio from one of the sides, though, because you only need one track of sound. For my music video, I combined clips from the two takes I filmed, structuring the narrative based on the wide static shot, and then cut any material that didn’t look convincing in 3D – the effect can be broken when someone steps out of frame or moves too quickly. As you might expect, it helps to preview a rough cut on a 3D TV and then re-edit as needed.
While editing, I realised I should have relied more on my tripod to avoid jerky handheld movements. Those look cool in a street-cam way, but they tend to diminish the 3D effect. I should also have directed my performers to move around more, especially forwards and backwards, to create more dynamic visuals. I had the band play the song a third time so I could get more footage, including a shot of a guitar neck jutting towards the camera.
To get your movie from the computer to a 3D TV, export it as an AVCHD MPEG-4 file, maintaining the side-by-side format. The processes differ from program to program, but they all share one thing: it takes a while – at least twice as long as the video itself. Once the file is complete, you can watch it directly on a 3D computer monitor such as the HP Pavilion 2311gt (R3 000) or the Asus VG278H (R6 000). You can also play the file directly from your computer by connecting it to a 3D TV with an HDMI cable, or on a Blu-ray player by burning a disc.
If you don’t have a 3D TV, you’re not out of luck. Try generating an anaglyph 3D file, which uses a red-cyan filter to create the mild 3D effect familiar to 1950s moviegoers. Then upload it to YouTube and tag it with the site’s 3D tag, yt3d:enable=true, so that anyone with a pair of low-cost red-cyan glasses will see 3D, even on a 2D monitor.
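The anaglyph trick itself is simple enough to sketch in a few lines. The following is an illustration of the principle only (in Python with NumPy; the function name and inputs are hypothetical, and real converters apply extra colour correction to reduce ghosting): the red channel comes from the left-eye image, and green and blue come from the right-eye image.

```python
import numpy as np

def make_anaglyph(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    """Combine two aligned RGB frames into a red-cyan anaglyph.

    The red filter shows each eye only 'its' image: red from the
    left-eye frame, green and blue (cyan) from the right-eye frame.
    """
    anaglyph = right.copy()
    anaglyph[..., 0] = left[..., 0]   # red channel from the left eye
    return anaglyph                   # green/blue stay from the right eye
```

Viewed through red-cyan glasses, the brain fuses the two tinted images back into a single scene with depth.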
To record in 3D, you’ll need a camera, such as the Sony HDR-TD30V (top) or the JVC GY-HMZ1U (bottom), that reproduces binocular vision with two lenses, and editing software, such as Apple’s Final Cut Pro X or CyberLink PowerDirector 11 Ultra (centre), that can handle side-by-side video. See the pictures above.
Active 3D glasses have rapidly alternating shutters that are synchronised with a TV via infrared; one frame is shown to the left eye, then the next is shown to the right.
Passive 3D televisions are similar to the systems in cinemas, with polarised glasses and a polarised screen that work together to keep the left- and right-eye images separate.
For a 3D camera to work without the help of digital adjustments, the distance between the camera lenses should be the same as the average distance between human eyes: about 6,3 cm.
Coloured lenses admit a different colour to each eye – typically red to one and cyan to the other. Through these glasses, an anaglyph image – made of two slightly offset versions of the same picture – appears 3D.