07 April 2013

Feed 86% - Natimuk Workshops

Recently we ran another Feed workshop in the Natimuk Hall, inviting a group of local actors to interact with the animated scenes I'd been developing over the past few months.

The main plan was to see how performers actually meshed with the projections I'd been developing, both in terms of how they could engage with the projections and how well the projections would respond to them. You can imagine how things might work when you're looking at them on a computer screen, but until you actually see them on a stage with real people it's hard to be sure. Live, the effect is often very different from the screen effect, particularly the speed things move at: things that seem to drift around the monitor in a leisurely manner rip across the scrim at lightning speed, confusing the performers and inducing seizures in the audience. Luckily a lot of the worst of this became obvious during the Horsham workshops late last year, so I was a lot more mindful of it going in this time.
One of the key elements of the play is the feed itself: a cloud of data that tracks each performer, full of contextually relevant info, helpful pointers and targeted advertising. Characters can review new images from the feed, send images to one another, share them with a group or communicate via text, all while moving around on the stage.
The script requires as many as seven performers on stage doing this at any one time, and keeping track of it all, and keeping it even vaguely reliable, has been the biggest challenge of the project so far. I've been using a range of techniques: video tracking, the Kinect, and manual tracking on the iPad via touchOSC. Each method has its strengths and weaknesses, and the ultimate solution will likely combine several of them. Manual tracking works really well up to about four objects, at which point everything goes to pot and the operator's head explodes. The Kinect is awesome (for the depth tracking and for not requiring visible light) but has some annoying limitations: it's limited to roughly a 6-metre range, and it can't see through the scrim we need hanging across the stage, so it can only track people in front of that. Visible-light tracking allows for a greater range and sees right through the scrim, but it lacks the depth perception of the Kinect and, being sensitive to visible light, gets confused by the projections. The biggest problem with all the automated tracking is that the computer tends to get confused when performers cross paths or bump into each other (that's where human operators really come into their own). Something I haven't tried yet but would like to is planting some kind of wi-fi tracker on each performer, allowing them to be tracked and uniquely identified anywhere in the room. From what I've read there can be issues with accuracy and lag, but if it could be made to work it ought to be perfect. Anyone with any experience of this, PLEASE get in touch.
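To give a sense of why crossed paths confuse the automated tracking: most simple blob trackers just match each new detection to the nearest previously tracked position. Here's a minimal sketch of that idea (invented function names, not the actual Feed patch); once two performers get closer to each other than to their own previous positions, this greedy matching can silently swap their identities.

```python
def assign_tracks(tracked, detections):
    """Greedily match each tracked performer to the nearest detection.

    tracked:    dict of performer id -> (x, y) from the previous frame
    detections: list of (x, y) blob centres from the current frame
    Returns a dict of performer id -> (x, y) for the current frame.
    """
    remaining = list(detections)
    updated = {}
    for pid, (px, py) in tracked.items():
        if not remaining:
            break  # more performers than detections this frame
        # Claim the closest unclaimed detection (squared distance).
        best = min(remaining, key=lambda d: (d[0] - px) ** 2 + (d[1] - py) ** 2)
        remaining.remove(best)
        updated[pid] = best
    return updated
```

With performers well apart this works fine; the failure mode is exactly the crossing/bumping case, where the "nearest" detection for performer A is actually performer B, which is why a human operator still beats the computer there.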

And thanks again to Regional Arts Victoria, whose support allowed this recent stage of the development to happen. It feels like we've made massive headway over the last few months and the whole project is creeping steadily closer to being a reality.

These are all stills from the video we shot (thanks, Gareth Llewellin). I'll be uploading that once I've had a chance to edit something together, but likely there'll be a bunch more stills first, focusing on some of the scenes we worked through.

01 March 2013

Feed-Horsham Workshops


Work on Feed with Anna Loewendahl and Greg Pritchard has continued, slowly but surely. We managed to book the Horsham town hall for a week of workshops. It was great to have such a whopping big space to spread ourselves around in. Not so great were the whopping big windows letting in all the ambient light and making it impossible to do much projection work during the day like I'd been hoping. Still... after a week of late nights we managed to get a fair bit worked out. We've done a fair bit of experimenting with using the Kinect to track performers on stage and using that to drive the projections. It's such a step forward from just triggering cue points and having to make sure everyone was in the right place at the right time. Hopefully now the actors can just act and have everything else fall into place around them.



The Up-Car scenes... kind of an updating of the old classic film trick of having the actors jiggle about in a stationary car while the pre-recorded countryside whizzes by on a screen just out the back window. Only this time we're projecting the car as well, and it all gets generated on the fly. The car can be controlled in real time, either manually or by tracking the performer's position, and the city streams by in the background depending on the position and orientation of the car.
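The core of the trick is simple: the car stays put on stage while the background scrolls past it, with the scroll direction and speed derived from the car's simulated heading and velocity. A toy sketch of that maths (assumed names, not the actual Unity scene):

```python
import math

def background_offset(offset, heading_deg, speed, dt):
    """Advance the background scroll by the car's motion over dt seconds.

    offset:      current (x, y) scroll offset of the city background
    heading_deg: direction the car is "driving" in, in degrees
    speed:       car speed in scene units per second

    The background moves opposite to the car's heading, so driving
    "forward" makes the streets stream backwards past the window.
    """
    heading = math.radians(heading_deg)
    dx = math.cos(heading) * speed * dt
    dy = math.sin(heading) * speed * dt
    return (offset[0] - dx, offset[1] - dy)
```

Feed this the car's tracked or manually driven heading every frame and the city streams by accordingly; steer the car and the scroll direction swings with it.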


The up-car, and a lot of the animations, we're now doing with Unity3D. I've only been using it for the last six months or so, but it seems really flexible in the sort of input it can take (keyboard, microphone, Kinect, OSC, etc.) and I'm really impressed with what it can churn out in real time. It feels like it's going to be a big part of my projection arsenal from here forth. I've also been using Unity for a project with Chamber Made Opera down in Melbourne, but I'll save that for a separate post.



And speaking of inputs and OSC... touchOSC has been another amazing discovery. I think I'm a bit late to the party here, but wow! It has literally turned my iPad into the actually useful tool I hoped it would be when I bought it all those years ago, letting you easily set up a custom multi-touch panel of knobs and dials and send those controls off remotely to anywhere within wi-fi range. I really wish I'd known about some of this back when I did Highly Strung (I was pretty happy with my iPod-gaffer-taped-to-pico-projector rig at the time, but this just seems so much more flexible and reliable). If I ever remount Highly Strung I will be revisiting the tech.
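Under the hood, what touchOSC sends over wi-fi is plain OSC messages over UDP: an address like /1/fader1 plus (usually float) arguments, all big-endian and padded to 4-byte boundaries. A minimal decoder sketch, just to show the wire format (in practice you'd use an OSC library rather than hand-rolling this):

```python
import struct

def _read_padded_string(data, pos):
    # OSC strings are null-terminated and padded to a 4-byte boundary.
    end = data.index(b"\x00", pos)
    s = data[pos:end].decode("ascii")
    pos = end + 1
    pos += (-pos) % 4
    return s, pos

def decode_osc(data):
    """Decode a single OSC message into (address, [args])."""
    address, pos = _read_padded_string(data, 0)
    typetags, pos = _read_padded_string(data, pos)  # e.g. ",f" or ",ii"
    args = []
    for tag in typetags.lstrip(","):
        if tag == "f":                    # big-endian 32-bit float
            args.append(struct.unpack_from(">f", data, pos)[0])
            pos += 4
        elif tag == "i":                  # big-endian 32-bit int
            args.append(struct.unpack_from(">i", data, pos)[0])
            pos += 4
    return address, args
```

So a fader at half travel arrives as something like ("/1/fader1", [0.5]), which you can then route to whatever the knob is meant to drive.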


We've got another workshop this weekend in the Natimuk Hall (which can be blacked out) and I shall endeavor to get another post together about that as soon as I can, hopefully with some video snippets this time (and ideally before the end of 2013).

13 March 2012

a Feed of Spaghetti

Another new project slowly ticking away in the background.

FEED is a stage-play adaptation of a novel by M.T. Anderson, set in a disturbingly not-too-distant future where everyone has the internet in their heads. Everybody has the knowledge of the world at their fingertips and, as the world slides into oblivion, nobody is interested in anything beyond accessorising (sound familiar?).

I'm pretty excited about the idea at the moment. I think you could make this as a Hollywood blockbuster with shed-loads of special effects and the jaded audience wouldn't even blink (though the story is probably better than a lot of the films I've seen in recent times). But no, the film buffs amongst them would make comparisons to something like Tron, and off they'd all trot to get a burger. I think that trying to do something like this live on stage, though, will make the audience sit up in their seats and take notice (and hopefully engage a little more with the content of the story).

The two biggest challenges in all of this for me are firstly to try and visualise what it might look like for everyone to have the internet in their heads, and secondly how to pull it off in a theatre setting. At the moment I imagine the Feed as a swirling vortex of data (text, images, video, etc.) and then, within that, a little sub-vortex orbiting the head of each of the characters. The items circling each character are specific to them and indicate what they're thinking about. The way they pass images and data to one another will also be a big part of the whole thing. And then there's "texting" as well. Maybe a third of the script is done with text messages, so there needs to be some kind of representation of that too.
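The sub-vortex part is really just circle maths: each feed item orbits its character's head, spaced evenly around the circle and spinning at a steady rate. A quick sketch (parameter names invented for illustration):

```python
import math

def orbit_positions(head_x, head_y, n_items, radius, t, rev_per_sec=0.25):
    """Positions of n_items orbiting (head_x, head_y) at time t seconds.

    Items are spread evenly around the circle and the whole ring
    rotates at rev_per_sec revolutions per second.
    """
    positions = []
    for i in range(n_items):
        angle = 2 * math.pi * (i / n_items + rev_per_sec * t)
        positions.append((head_x + radius * math.cos(angle),
                          head_y + radius * math.sin(angle)))
    return positions
```

Drive head_x/head_y from the performer tracking and the little vortex follows each character around the stage; vary radius or rev_per_sec per character and each vortex gets its own personality.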

To try and do this it's going to be necessary to track the performers as they move about on stage and then use that info to drive the visuals, so that the performers can just act and not have to worry about being in the right place at exactly the right time. I've managed to do this so far using a Kinect and TUIO, talking to Quartz Composer, building on some of the work I did (but ultimately abandoned) for Highly Strung. I may stick with this, or I may find something better. At the moment I like the way the Kinect works on infra-red light, so it doesn't get confused by projections or the lighting on the actors (or complete darkness, for that matter). I've got zero interest in Xbox games but, for $150, the Kinect is a great bit of kit.
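For anyone curious about the plumbing: TUIO reports cursor positions normalised to the 0..1 range, so somewhere in the chain those get mapped into projector pixel space, and a little smoothing stops the visuals jittering with the sensor noise. A sketch of both steps (assumed names and resolution, not the actual Quartz Composer patch):

```python
def tuio_to_pixels(nx, ny, width=1920, height=1080):
    """Map a normalised TUIO coordinate (0..1) into projector pixels."""
    return nx * width, ny * height

def smooth(prev, new, alpha=0.2):
    """Exponentially smooth an (x, y) position.

    Each frame the position moves a fraction alpha of the way toward
    the new reading; lower alpha = steadier but laggier tracking.
    """
    return (prev[0] + alpha * (new[0] - prev[0]),
            prev[1] + alpha * (new[1] - prev[1]))
```

The alpha value is the usual trade-off between responsiveness and calm: too low and the visuals trail behind the actor, too high and every tracking wobble shows up on the scrim.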

 And the code to do that looks like this.

And again... in more detail... just so you can get a better idea of what's going on.


This has been my first foray into Quartz Composer and what they call spaghetti code. Fortunately there are a lot of great resources out there on the internet (kineme.net in particular; I've learned loads from deconstructing/butchering some of the things on that site). Basically though, it's lots of little "bits" that do "stuff" and you link them up in a big ugly tangle...

...And it makes something that looks a bit like this.


And then you can project all that onto a scrim in front of the performers.


I managed to get a bit of time in with Greg and Anna recently, in between them stressing out over their performance of Oliver's Tale on the salt lake. The shot above is of Anna doing a much better job than me of looking like a youth as we test out the texting thing.

For the Feed itself the plan is to run the show with a live connection, doing live image searches on the internet based on things that are relevant to the context of the show, then have the results thrown into the mix with everything else. Below are the results of a couple of Google image searches on a few different strings.
"Disco"
"Ecological Disaster"
"Boobs, Kissing & Ulcers" (quite a heady mix)

It does quite a nice transition from one set to the other too (if I do say so myself), taking about 10 seconds to change from one to the next as the whole thing continues swirling around. And what's great about this is that it opens up the possibility of having the audience text in stuff in real time as well, and have that influence the feed too. The idea of having the visuals be this omnipresent thing on stage that connects everyone together, actors and onlookers, could end up making for a pretty powerful experience.
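The ten-second changeover can be sketched as a crossfade between the two image sets (a hypothetical helper, not the actual Quartz Composer patch): the outgoing set's opacity ramps down as the incoming set's ramps up, with a smoothstep ease so it drifts rather than snaps.

```python
def transition_opacity(elapsed, duration=10.0):
    """Return (old_set_opacity, new_set_opacity) at `elapsed` seconds."""
    t = min(max(elapsed / duration, 0.0), 1.0)   # clamp to 0..1
    eased = t * t * (3.0 - 2.0 * t)              # smoothstep: slow in, slow out
    return 1.0 - eased, eased
```

Stagger the start time per item and the new search results appear to bleed gradually into the swirl rather than all swapping over at once.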

There's still lots to figure out and lots more to learn. But right now I'm excited about the possibilities... and the fact that the performance will be all INDOORS and immune to the whims of the weather gods AND there will be absolutely NO giant puppets in the show. All this makes me feel positively relaxed about the whole thing. How hard can it be?

06 March 2012

the Rhyme of the Ancient Merino - trailer

I finally got around to scraping together a Merino trailer for the web. I have been meaning to do this for such a long time.

There is plenty of making-of stuff on the blog here if you want to go digging, and hopefully I'll get a chance to upload more over the next few weeks.

But for now...

Enjoy

25 February 2012

Decommissioned

Slowly ticking along with Decommissioned (the graphic novel).


It's on a temporary hiatus at the moment while I'm having chats with a potential publisher. Mostly we're talking about print size and page count and stuff like that, but it doesn't feel worth doing too much more until that has all been sorted out. That's fine. I've got plenty else to keep me busy right now, but I'm just itching to dive in and start churning out the pages. Meanwhile here are a few little snippets.





04 February 2012

Highly Strung - In retrospect

Well, it's taken a few months for me to get around to editing this together (not that I spent a few months on the edit, I just had a bit of other stuff to catch up on) but here it is: a five-minute collection of shots from Highly Strung, the silo show.



Don't expect it to make a whole lot of sense; it's all out of sequence and taken from both the rehearsals and the final show. It's just a collection of shots to demonstrate the kind of stuff that was going on in the show.

Putting it together, I really wished I had more footage of the puppeteers doing their thing. Looking back, that's possibly the most interesting aspect of the show for me: the things that people (who were nearly invisible during the performance) had to do to put the giant puppet through its paces.

An early rehearsal where we had 3 operators up on ropes (Wendy, Callum, Kate)
Damo A and Callum on the legs as the puppet is caught in one of the all too common freak gusts of wind
Kete, Damo J, Michelle Anthony, Gareth, Jac, Callum
Michelle McFarlane (Head Turn Technician)
Gareth Llewellin (Chief Up/Down Guy)
Anthony (Left Elbow Operator) & Kate (the Bird in the Hand)

The coordination required to get the puppet into a pose sometimes involved up to 10 people all doing their part, with no way to see how the puppet actually looked. They just had to have faith that, when each of them did X at time Y, it was a good thing.


(photos Michelle McFarlane)

If I did the show again I'd definitely make more of a feature out of the puppeteers, lighting them up rather than hiding them away.

01 November 2011

We did it We did it we Did it we did It!

...A bit windy mind you, but it was great to finally get it up there after all the work. Apparently there were about 2000 people in the audience. Not a bad turnout for a show that seemed destined not to happen on a ferociously windy night.

A big thanks to everybody involved who worked their guts out to make it happen, especially the puppeteers who were seriously battling the elements on Saturday. Anthony, Jillian, Michelle, Callum, Kate, Gareth, Damo (1 & 2), Kete, Jacq, Wendy & Scott. You guys are awesome. Also thanks to Helen for coping with the Herculean sewing task and Paul for taking the giant-sized, person-shaped tent and turning it into a puppet that did stuff. Sound and light were a big part of the show as well, so thanks Stephen for composing the tracks and Outlook for making the show seen and heard.

Thanks also to the funding bodies: Arts Victoria, Festivals Australia and the Australia Council. We couldn't have done it without you.