Mars Curiosity

Exciting stuff from the team at NASA. Sorry to say, though… Buzz beat them there years ago.


Uncharted Territory

Well, here we go… It started a while ago, and to say the least, I’m quietly excited. Our creative team at church has been asked to film and edit stories of people who have gone through struggles, and to share where God is in it all.

We have a larger than usual team for this job, using various lighting setups, multiple cameras, steadicams, sliders, tripods and other gear. So far, the biggest challenge has been working with footage from a RED camera. Now, 4K footage is brilliant and beautiful in its raw form. It was great to have the use of it, but it has brought issues we wouldn’t usually face on a simple one or two camera shoot.

The biggest problem with the RED footage is that we have to down-sample it to match the other cameras’ format of 1080p at 25 fps. Why, you ask? Well, the Canon 1D, 5D and 7D all shoot H.264 footage. The reason we aren’t leaving it in 4K or using the raw files in the timelines is simply editing-software compatibility – and the footage won’t be shown on a 4K device anyway.
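
For the curious, the down-sample step looks roughly like this as a command-line sketch. This is an assumption on my part – the post doesn’t name a conversion tool, and ffmpeg generally can’t read R3D files directly – so it assumes the RED footage has already been exported to QuickTime masters (e.g. via REDCINE-X). Folder and file names are placeholders.

```shell
# Illustrative only: batch down-sample 4K masters to 1080p at 25 fps.
# Assumes ffmpeg is installed and the R3D files have already been
# exported to QuickTime masters (ffmpeg can't read R3D directly).
mkdir -p proxies
for src in masters/*.mov; do
  [ -e "$src" ] || continue                       # no masters yet: nothing to do
  out="proxies/$(basename "${src%.mov}")_1080p.mov"
  ffmpeg -i "$src" -vf scale=1920:1080 -r 25 \
    -c:v prores_ks -profile:v 1 -c:a copy "$out"  # profile 1 = ProRes 422 LT
done
```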

The audio was recorded two ways: wireless lapel mics directly into a Mac Pro, and a small boom mic, initially plugged into a Zoom H4n with phantom power on the XLR input, and then directly into the RED camera – after we overcame some technical problems.

When you shoot with four cameras, have various audio sources and the event runs over several hours, organisation of the footage has to be paramount. Folders were set up for each session, for the video footage by shooter and camera, and for the audio files. Project files and folders were also arranged for later conversion and for shifting between programs during the colouring and sound edits.
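
As a sketch of what that organisation might look like – the folder names here are made up, not the ones we actually used – a few lines of shell can lay out the whole tree before ingest:

```shell
# Hypothetical layout: one folder per session, split by camera/shooter,
# with separate audio paths and a shared project area.
for session in session01 session02; do
  for cam in cam1_RED cam2_7D cam3_5D cam4_1D; do
    mkdir -p "shoot/$session/video/$cam"
  done
  mkdir -p "shoot/$session/audio/lapel" "shoot/$session/audio/boom"
done
mkdir -p shoot/projects/premiere shoot/projects/fcp7 shoot/projects/xml
```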

The other challenge, which would otherwise have taken hours, was synchronising all the footage. Before the shoot started, we synchronised all of our cameras’ clocks so we’d at least have a rough time stamp to refer to. Lining everything up was going to be the biggest job, but a program I’d heard about many times and dismissed as unnecessary proved essential: PluralEyes. You just load the footage and audio of each session into a timeline, export it as FCP XML and load that file into PluralEyes. It does the calculations and produces a new XML file ready to import back in, with everything synced. AMAZING! The new XML imported beautifully into Premiere Pro CS5.5 and FCP7 without any import dramas.

So now what is left? We are still awaiting the conversion of the RED files – which is still running with all 8 cores maxed out. At this rate it could take a few days… (update: 18 hours)

Because of compatibility issues between editing-software versions, we will be duplicating the project structures on portable disks and then sharing the FCP XML files for the main edits. It sounds crazy, but when you are working across Premiere Pro CS5, Premiere Pro CS5.5 and FCP7, something has to give and you do what works… or will in theory.

When this has all happened and the initial story edit is complete, the round trip will be to add other footage, clean things up, colour it, and then add and fix the audio and other music effects.

I’ll keep you posted on how it goes… It’s exciting.

My latest video creation

A new love that I’ve found since moving interstate has been that of filming and editing. I’ll admit I’m not the best in the world, and that there are probably much faster and more efficient ways of doing things, but at the moment as I grasp new projects, ideas, constructive criticism, and a creative feel, I’m enjoying learning about new and exciting techniques.

The most recent creation is a carols promotional video. We planned to render (some elements), shoot, edit and colour all of it within a week. Surprisingly, that’s exactly what happened.

We filmed late on a Sunday afternoon, with time set aside after the shoot to do a time lapse of the park getting dark, so we could incorporate it into the video.
This worked really well, considering I had only attempted one time lapse before – of something else at night, just three days beforehand! That was, of course, after chatting to a camera nut who knows how to do it properly.

The video was shot on a Canon 7D with a Sigma 24-70mm f/2.8, using the CineStyle picture profile. This records incredibly flat, dull-looking footage. It was something I had never done before, so it was a bit of a gamble, considering I only had a one-week turnaround and that Sunday afternoon was the only time we had to film. I did know, however, that this was catching on in the DSLR world, so the thought of “how hard can it be?” came to mind. The LUT was loaded using Red Giant’s free LUT software.

There were problems using this kind of colour correction. Firstly, I like seeing my footage in full colour while editing (not good workflow yet, I know), but that meant I couldn’t do so without applying the Red Giant filter and then rendering. Even leaving the filter off while editing, it still had to be added at the end, adding to the render time – just like many of the other effects applied, such as the vignette and, at times, a quick brightness/contrast filter.

Many of the effects that were used came from one of two places; After Effects, or Motion.

The scribble effect used for the face painting, jumping castle and pony ride was done in AE, as I couldn’t find anything like it in Motion. There were several problems with this:

  • I’d never used AE up until this point, and so all I had to go by to get the effect was tutorials. Thanks to those who do them! They are a big help.
  • I had to find the most efficient way to do the edit. I saw two options: 1. make the edit in FCP, then export the entire project via an XML plugin and send it to AE – which I didn’t like the idea of for several reasons – or 2. create the elements in AE and export them back to FCP, which is what I ended up doing.
  • Exporting the “scribbled” elements was also an interesting exercise, but once I figured out to export them as uncompressed QuickTime with an alpha channel, it worked out.
  • Play back in AE also confused me, especially with RAM previews as well. (Different story, and a good gotcha for new users).
  • Renders sometimes took a long time to finish, such as the imported Motion projects, and the stars and cloud “footage”, which were just images with slow-moving tweens.

Once these new elements were imported, they were then arranged and shrunk/edited as required.

Motion was also used for some of the other effects, such as the rain, and the smoke on the BBQ – both of which are built in effects. Thanks Apple. 🙂

My editing codec is ProRes 422 LT at 720p, 25 fps, with Compressor as the main conversion tool. Why not MPEG Streamclip? With QMaster/QAdministrator set up, Compressor lets me use more cores and spread the load across a couple of Macs, speeding up the conversion. It’s not often you see a Mac Pro with all 8 cores at 100%! 🙂
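
For anyone without Compressor, the same conversion can be sketched with ffmpeg. This is my assumption rather than what I actually use, and the filenames are placeholders; ffmpeg’s `prores_ks` profile 1 corresponds to ProRes 422 LT.

```shell
# Hypothetical ffmpeg equivalent of the Compressor setting above:
# ProRes 422 LT, 1280x720, 25 fps.
to_prores_lt() {
  # $1 = source clip, $2 = output file
  ffmpeg -i "$1" -vf scale=1280:720 -r 25 \
    -c:v prores_ks -profile:v 1 \
    -c:a pcm_s16le "$2"
}
# usage: to_prores_lt raw/clip01.mov edit/clip01_LT.mov
```

Unlike QMaster, of course, ffmpeg won’t distribute the work across several machines for you.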

All of the audio was tweaked in Audition. I’m a massive fan of Audition, as I think it beats the pants off Apple’s Soundtrack (sorry, FCP fans). Even though the sound was originally recorded on a Zoom H4n, there were quality issues because of the environment it was recorded in, so we re-recorded the sound in a studio and imported it as uncompressed audio, since FCP doesn’t like 44.1 kHz MP3 files in a 48 kHz sequence.
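
The resampling itself is simple to script. This is a sketch, not the exact tool I used; it assumes ffmpeg is installed, and the filename is a placeholder.

```shell
# Convert a 44.1 kHz MP3 to 48 kHz uncompressed WAV so it drops
# cleanly into a 48 kHz FCP sequence without resampling on import.
convert_for_fcp() {
  ffmpeg -i "$1" -ar 48000 -c:a pcm_s16le "${1%.mp3}_48k.wav"
}
# usage: convert_for_fcp studio_take.mp3   # writes studio_take_48k.wav
```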

I also did a brief grade on the video using the three-way colour corrector. I still need to learn how to use Color.

So, that’s the technical information about the video, now here it is!

Hope you can come to the carols on Bradbury!

My solution to the jquery mobile back button problem

As I’m new to jQuery Mobile, I’m bound to run into trouble and find my own ways around it.
The first problem I’ve encountered has been trying to implement a back button. After playing with the code and a few options, I got frustrated and decided to do it my own way.

By default the back button shows, which is great if your entire site is on one page: it stays self-contained and works really well. But if, like me, you need to use a CMS or spread a site across multiple pages, you’ll find it a little harder. I could be wrong, and welcome comments on this, but it seems that when you move to an external page, jQuery Mobile loses its history and can’t take you back to the previous page.

So, I just did this:


<div data-role="page" id="MainPage" data-theme="a">
<div data-role="header">
<span style="font-size: 9px; text-align: center;">Header</span>
<!-- Copy this 'a' code -->
<a class="ui-btn-left ui-btn ui-btn-icon-left ui-btn-corner-all ui-shadow ui-btn-up-a" data-icon="arrow-l" data-rel="back" href="javascript:history.back()" data-theme="a">
<span class="ui-btn-inner ui-btn-corner-all">
<span class="ui-btn-text">Back</span>
<span class="ui-icon ui-icon-arrow-l ui-icon-shadow"></span>
</span>
</a>
</div>

<div data-role="content">
<p>Please select a job type</p>
<ul data-role="listview">
<li><a href="#page1">Page 1</a></li>
<li><a href="#page2">Page 2</a></li>
</ul>
</div>

<div data-role="footer">
<h4>Page Footer</h4>
</div>
</div>

<div data-role="page" id="page1">
<div data-role="header">
<h1>Page 1</h1>
</div>

<div data-role="content" data-add-back-btn="true">
<ul data-role="listview">
<li><a href="Page1.html" rel="external">External Link 1</a></li>
<li><a href="Page2.html" rel="external">External Link 3</a></li>
</ul>
</div>

<div data-role="footer">
<h4>Page Footer</h4>
</div>
</div>

I started from the Dreamweaver CS5.5 jQuery Mobile starter code, which only needed these slight mods. Let me know how you go.

Dear Apple Australia. Please adjust your iTunes store

Dear Apple Australia,

I’m writing to ask you to update the genres used to categorise music in the Australian iTunes Store, and to change the way the account system works so we can buy from “international” stores.

Your big brother US store has a lot more content and genres, so why aren’t they matched?

My first concern is that music styles are being mixed in with the wrong types; the main category of concern is “Inspirational”.
The US store doesn’t have this genre under music – it simply doesn’t exist. Instead, songs deemed “Inspirational” are placed into a better-fitting category such as “Christian & Gospel”. Why doesn’t that category exist in the Australian store?

If you have a look at the songs in the Australian Inspirational list, you’ll see “I Still Call Australia Home” mixed in with worship songs about God. These songs couldn’t be further from each other. How these two types of song came to be listed together has me baffled.

So why doesn’t Apple Australia reorganise this? Is it because no one there knows Christian music? Is it not cost-effective as far as sales go? Why not just duplicate the US genres and adjust the prices to match Australian ones?

There is so much good new Christian music that I think we as Aussies are missing out on.

How or why are we missing out? I’ll try to explain.

If you browse the US version of the store, there are plenty of genres to explore. We can look at and listen to all the artists and songs we want.
The problem comes when you want to buy a song, which is my second point.
If we want to purchase a song or video we like, iTunes suddenly displays a message along the lines of “Oops, you need to change store because your account only works in Australia”. We’ve now lost our place in the charts and are back on the Australian home page with the browsing history cleared, saying to ourselves, “Where was that song I just wanted to buy?”

The iTunes store works great. Except for these two points.

Working in the radio industry, I’d find this kind of change very beneficial.

I ask you sincerely, please make some changes.

Transition from Adobe Audition/Cool Edit to Ardour

Now, I’ve always used Cool Edit/Adobe Audition and would consider myself a bit of a power user, making use of the multi-track features and plugins. AA is a fairly simple application to get your head around when seeing it for the first time. It was a long time ago when I first used it, but I haven’t really tried anything else since.

Given that I like trying out new software, and that I compiled kernels in the early days from version 0.9, I’d sort of think I’m a little nerdy at times. These days, though, I try things on a needs basis. When looking for free multi-track audio software, the options were definitely limited. As I used Windows XP as my primary operating system there were even fewer, and of those available, most were fairly unusable.

Along came Audacity, and it was great for what it was. I jumped on board early, in version .82, to try it out, and for something free it’s pretty good. But there are limits. Generally speaking you get what you pay for, and as with a lot of freeware, that’s the case here. But there are exceptions.

Years ago, when I was using Linux in a ‘look what this can do’ testing phase and never for any real purpose, I discovered Ardour and realised it had a lot of potential. After installing it and importing a few MP3 files, I had a quick play and thought, yeah, that works – but never took it any further.

Well, since getting a little Mac laptop and preparing to retire my PC laptop, I’ve been in need of something that will do the job of multi-track audio recording and editing properly. This is where Ardour comes into play.

Since downloading it, I’ve had only a few dramas. I’ve now got a little Behringer UFO202 for stereo recording; the unit itself hasn’t caused me any dramas, but for a new user coming from AA, getting these devices and Ardour working together is rather confusing and at times frustrating. This is partly because I was originally sold a hardware device (by an Apple Genius) that couldn’t do stereo recording. I returned it after a week and a half of trying everything, but that learning curve has since helped me get Ardour working properly.

The biggest challenge for any new user of Ardour is the whole JACK audio thing. Having read a little, I discovered that Ardour is one of a handful of JACK-native programs, meaning support for JACK is built right in. Why does it matter? Well, JACK takes care of the audio traffic: internally, audio is transported and routed around the computer, in and out of each specific program. This is very different to Windows, where you have two audio level panels – playback and recording – and simply select wave, line in or microphone as your recording device. It’s a different way of thinking, but the basics are the same.
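
To make that routing idea concrete, JACK ships with little command-line tools for inspecting and patching connections. The port names below are examples only – Ardour’s actual port names depend on your session – so treat this as a sketch:

```shell
# jack_lsp lists every port JACK currently knows about, e.g.
#   system:capture_1, ardour:Audio 1/audio_in 1, ...
# jack_connect then patches one port into another, like a virtual cable.
patch_stereo_input() {
  # $1 = destination port prefix (hypothetical example: "ardour:Audio1/audio_in")
  jack_connect "system:capture_1" "${1}1"
  jack_connect "system:capture_2" "${1}2"
}
# usage (with jackd running): patch_stereo_input "ardour:Audio1/audio_in"
```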

To get it working on Mac OS X, go to Applications > Utilities and run Audio MIDI Setup. I got Ardour working without JACK, using the Core Audio driver, but the easier and recommended method is to use Jack OSX. Follow the instructions there and you should be right.

Having just posted on the Ardour website, I got a couple of replies straight away, which was great. But I think I may be going about using Ardour the wrong way… We’ll see. If I can use it the way I expect, or find a way, I’ll keep trying…

The blog is back

It’s been a long while, but now that I’ve found my blog again, it will be making a comeback.

Thank you to all who followed in the past.