27 December 2013

Art Pumping Action - Bike Tech

As I brace myself for the onslaught of the Falls Festival in Lorne, here's a hastily slapped down breakdown of the bike-powered computer game concept we launched at the Nati Frinj. There were three games made, but they all hinged on the use of an Arduino to forward a bunch of information captured off the bike via a range of sensors into Unity, which generated the visuals in response. The Oculus Rift was a late but excellent addition to the setup. I had originally intended to build a little head-sized projection booth so that only the cyclist could see.
Compared to this though, a pair of goggles that responds to the orientation of the rider's head and provides the visuals in 3D just takes it a step beyond the original concept. When we were building this rig the plan was that the entire power for the game be generated by the cyclist, and the power consumption of the goggles beat out any comparable projection system (without even factoring in the nightmare of me trying to sculpt up a fibreglass projection dome). I must be the only person ever to have bought an Oculus Rift for its relatively low power consumption.
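For anyone curious about the plumbing: the Arduino's job is really just to read each sensor and squirt the values up the USB serial line as one comma-separated message for Unity to parse at the other end. Here's a rough sketch of that idea (the baud rate, update rate, message format and helper names are mine for illustration, not the code from the actual rig); the individual sensor reads are sketched out further down.

```cpp
// Illustrative only: bundle the bike's sensor readings into one comma-separated
// line and send it over USB serial for Unity to parse, e.g. "speed,steering,bell".
float readWheelSpeed() { return 0.0; }   // stand-ins for the real sensor reads,
int   readSteering()   { return 512; }   // fleshed out in the snippets below
bool  readBell()       { return false; }

void setup() {
  Serial.begin(115200);                  // assumed baud rate
}

void loop() {
  Serial.print(readWheelSpeed()); Serial.print(',');
  Serial.print(readSteering());   Serial.print(',');
  Serial.println(readBell() ? 1 : 0);
  delay(50);                             // ~20 updates a second is plenty for the game
}
```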
  
There is a video of the whole Art Pumping Action thing (with the actual virtual reality part starting at about one minute in).

Art Pumping Action Compilation from dave jones on Vimeo.

The three bikes were placed radiating out from the central table so that we could get away with one of the riders throwing themselves off a bike without setting off a domino effect.

You may snicker, but I was genuinely surprised at how strong the compulsion was to throw oneself off a stationary bike whilst trying to lean into the corners. The more experienced the cyclist was, the bigger the leaning issue. During the early test run Callum (who actually rigged these bikes) threw himself off and kicked in my favourite laptop screen (which would have been a complete disaster if I hadn't got some pretty good footage of him doing it).

So all the goodies, Arduino etc., are packaged up in the blue water bottle container there (genius idea courtesy of Callum).


The light mounted on the front of the bike and powered by the generator on the back wheel provided the load. Turning the brightness of the light up or down makes more or less work for the rider.
A hall effect sensor on the back wheel measures how fast the wheel is turning and sets the speed of the bike in the game.
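The hall sensor just pulses once per revolution as a magnet on the wheel passes it, so speed is a matter of counting pulses against the clock. A minimal sketch of that idea, with the pin number and wheel circumference as assumptions rather than values from the actual bikes:

```cpp
// Count hall sensor pulses in an interrupt and turn them into a wheel speed.
// Pin number and wheel circumference are placeholders for illustration.
const int HALL_PIN = 2;                    // needs to be an interrupt-capable pin
const float WHEEL_CIRCUMFERENCE_M = 2.1;   // roughly a 700c road wheel

volatile unsigned long pulseCount = 0;

void onPulse() { pulseCount++; }           // one pulse per wheel revolution

void setup() {
  Serial.begin(115200);
  pinMode(HALL_PIN, INPUT_PULLUP);
  attachInterrupt(digitalPinToInterrupt(HALL_PIN), onPulse, FALLING);
}

void loop() {
  static unsigned long lastCount = 0, lastMillis = 0;
  unsigned long now = millis();
  if (now - lastMillis >= 500) {           // update twice a second
    noInterrupts();                        // grab the count without it changing mid-read
    unsigned long count = pulseCount;
    interrupts();

    float revs = count - lastCount;
    float metresPerSec = revs * WHEEL_CIRCUMFERENCE_M * 1000.0 / (now - lastMillis);
    Serial.println(metresPerSec * 3.6);    // km/h, fed to the game as the bike's speed

    lastCount = count;
    lastMillis = now;
  }
}
```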


The dial rubbing against the front steering fork (scavenged off an old oven) is stuck to a potentiometer and feeds back the steering info.
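That side of things is about as simple as Arduino gets: read the voltage off the pot and turn it into a steering angle for the game. Something along these lines, with the pin, centre value and angle range purely illustrative:

```cpp
// Read the steering potentiometer and map it to a steering angle.
// Pin, centre value and angle range are illustrative, not from the actual rig.
const int STEERING_PIN = A0;

void setup() {
  Serial.begin(115200);
}

void loop() {
  int raw = analogRead(STEERING_PIN);      // 0-1023 across the pot's travel
  // map to roughly -45..+45 degrees, with 512 as straight ahead
  float angle = (raw - 512) * (45.0 / 512.0);
  Serial.println(angle);                   // Unity uses this to turn the bike
  delay(20);
}
```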
In the background, the screen is displaying the rider's eye view. I hadn't intended to leave this here for the duration but it was such a crowd pleaser that I left it there the whole festival. It was great, actually, when the people queuing up to have a go were cheering on the rider.
In this particular game the goal was to round up sheep and flick them into the trailer you are towing (by running them down). The game was conceived by one of the students during a workshop at the local primary school.
In the background you can see the regulator and battery setup courtesy of Greer Allen of Magnificent Revolution. It does a great job of capturing and storing all the spare power generated by the bike. The bike is more than capable of generating the power to run the laptop (35 watts) and Oculus (5 watts + maybe some coming out of the computer USB???)

The generator sits up against the back wheel replacing the standard roller in one of those bike training stands. Having the adjustable height feature of the training stand was pretty convenient. In the background you can see the regulator (in the old ammo box) that could take the surplus power from all the bikes and feed it into the (yellow) deep cycle battery.

The bike, when it was going hard, was able to put out about 100 watts.
The more electricity the bikes had to produce, the harder it was to spin the generator.
This was hooked up to the gradient of the hills the rider had to climb in the game. The steeper the hill, the brighter the light, the harder it was to pedal. The result was surprisingly convincing.
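In practice that means the resistance loop runs in the opposite direction to the sensors: the game sends the current hill gradient back down the serial line and the Arduino turns it into light brightness, i.e. generator load. A minimal sketch, assuming the light is switched through a MOSFET on a PWM pin (the pin number and scaling are guesses for illustration, not the real values):

```cpp
// Take a hill gradient sent from the game (0-100 over serial) and turn it into
// light brightness, i.e. generator load. Pin and scaling are assumptions.
const int LIGHT_PIN = 9;   // PWM pin driving a MOSFET that switches the headlight

void setup() {
  Serial.begin(115200);
  pinMode(LIGHT_PIN, OUTPUT);
}

void loop() {
  if (Serial.available() > 0) {
    int gradient = Serial.parseInt();                // 0 = flat, 100 = steepest climb
    gradient = constrain(gradient, 0, 100);
    int brightness = map(gradient, 0, 100, 30, 255); // keep a little base load on the flat
    analogWrite(LIGHT_PIN, brightness);              // brighter light = harder pedalling
  }
}
```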
Just for good measure I wired up a bunch of bells and horns at the front that could be used to jump, fire or whatever. I didn't end up taking as much advantage of these as I thought I might have... the main reason being the oversight that when you've got the Oculus goggles on it's actually pretty hard to find the bells.
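Electrically the bells and horns are just momentary switches on digital pins, read with the internal pull-ups and a bit of debouncing. Roughly like this (pin and debounce time are placeholders):

```cpp
// Read a bell wired as a momentary switch and report a press (jump/fire/etc.).
// Pin number and debounce time are placeholders, not from the actual rig.
const int BELL_PIN = 4;
const unsigned long DEBOUNCE_MS = 30;

void setup() {
  Serial.begin(115200);
  pinMode(BELL_PIN, INPUT_PULLUP);   // switch pulls the pin to ground when pressed
}

void loop() {
  static int lastState = HIGH;
  static unsigned long lastChange = 0;
  int state = digitalRead(BELL_PIN);
  if (state != lastState && millis() - lastChange > DEBOUNCE_MS) {
    lastChange = millis();
    lastState = state;
    if (state == LOW) Serial.println("BELL");  // the game treats this as jump/fire
  }
}
```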

Heading off in the morning to go down and install these beauties at the Falls Festival in Lorne and, quite frankly, I'm a little daunted about how they might hold up to the onslaught of 16,000 adrenaline-fuelled youths over New Year's.

Perhaps this shall be my last post....

16 November 2013

Nati Frinj 2013 - part 1



The dust has settled on the 2013 Frinj Festival and, after a week of being gut-wrenchingly, head-poundingly, bed-riddenly ill, I'm starting to feel vaguely recovered (both from the festival and the subsequent sickness). Almost human.

Lots of good bike-powered things this year (if I do say so myself). No single big show as in years gone by, but rather a whole suite of smaller things. This was part of a deliberate attempt on my part to avoid another high-stress, all-or-nothing moment like the final showing of the Highly-Strung show in 2011...

Basically just spreading myself so thin I couldn't possibly fail at everything.

Genius, or idiocy?

Only time will tell.

Right now my gut response is that it was more of a juggling-weasels experience than a pushing-a-rhinoceros-into-a-box type experience, but not really less stressful in any meaningful way.

Still, a few good things came out of it. A few things I'm really quite pleased with.

Such as the duelling unicycle puppets, conceived by me and realised by Anthony Schellens. It's a race, and these larger-than-life-sized puppets get propelled around their course by two punters pounding away on their respective stair-masters.


And then there were the games, a series of bike-powered virtual reality machines, my first experiment with the Oculus Rift and still very much a work in progress. With the help of an Arduino (and another big learning curve), I managed to take a bike covered in sensors and hook it up to the Rift to give festival goers their first taste of virtual reality (in exchange for a pint of sweat). It really feels like there's a lot of potential for this and it's definitely something I plan to flesh out in the near future.



And last but by no means least... "Something's afoot in the dead of the night", a performance event and LED lighting spectacular with 8 Foot Felix where the entire energy needs of the show were created by just four bikes, with the audience being invited to step up, take a turn on the pedals and help to make the show happen. It was really great to have the band throw themselves into this so completely, embracing the spirit of experimentation and risk taking that has always been the strength of the Frinj. I'm really happy with what we managed to achieve with a whole bagful of unknowns and not a whole lot of time to bring it together.



I'll come back and revisit each of these three bits with loads more photos, videos and a bit of behind-the-scenes type stuff, but I just wanted to get something out there for now, to prove to myself that it all happened.

Thanks so much to the Frinj folk, the committee and the volunteers for all pulling together and making this the most action packed festival yet. And thanks especially to Callum and Anthony for stepping in to pick up all the pieces I'd scattered so widely and helping to turn them into something better than I could ever have done on my own.  Thanks also to the funding bodies VicHealth and Festivals Australia for having the vision to back us in the development of these ideas.

11 September 2013

The Rhyme of The Ancient Merino

The Rhyme of the Ancient Merino...

...In full...


 
The Rhyme of the Ancient Merino from dave jones on Vimeo.

The story is based upon the 50 year history of the ACT (Arapiles Community Theatre) and the recent influx of new arrivals and the changes that has brought about. The puppets were constructed life-sized from local farm machinery and filmed in situ where possible: in the wheat fields surrounding the town, at the local grain silos and on the nearby Mount Arapiles.

The film was made in memory of Tim Beohm who, amongst many other things, was the president of the ACT for most of its existence.

With its life-sized puppets made of old farm machinery, sometimes filmed hanging off the side of the nearby Arapiles or the grain silos, bringing this film together is probably still the hardest thing I've ever tried to do.

Hats off to Ian Van Gemeren too for a fantastic score. And to everyone else who helped with the animation; there were a lot of long days and late nights that went into this one.

Enjoy

05 August 2013

Feed Development Video

Here is a 5 minute edit from some of the Feed development workshops (mostly from the Natimuk workshops).


Feed: Development from dave jones on Vimeo.


Currently working towards the next Nati Frinj Festival. We've been extremely lucky with our funding applications this year and it looks like a lot, lot, lot of good things are starting to come together. Worth another blog post in its own right.

soon...




....no really!

24 April 2013

Disembodied heads - Feed Development Part 5


Nina.

The voice of the Feed.

She is Violet's personal Feedtech assistant. Benign at first, then increasingly less so as the story goes on and Violet's situation worsens.


We still haven't decided if she's the manifestation of someone in a remote call centre or computer generated entirely.
The mouth responds to an audio Feed, generated by one of the performers on stage with a microphone.
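The mapping itself is conceptually dead simple: measure how loud the incoming microphone signal is over a short window and open the mouth in proportion, with a bit of smoothing so the jaw doesn't flap. A rough sketch of the idea (the gain and smoothing values are illustrative, nothing from the actual show):

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>

// Map a short window of microphone samples (-1..1) to a mouth-open amount (0..1).
// The gain and smoothing factor are illustrative guesses, not values from the show.
float mouthOpenness(const float* samples, std::size_t count, float& smoothed) {
    float sumSquares = 0.0f;
    for (std::size_t i = 0; i < count; ++i)
        sumSquares += samples[i] * samples[i];
    float rms = std::sqrt(sumSquares / std::max<std::size_t>(count, 1));

    float target = std::min(rms * 8.0f, 1.0f);  // crude gain to push speech up toward 1
    smoothed += (target - smoothed) * 0.3f;     // smooth it so the jaw doesn't flap
    return smoothed;
}
```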
Controlling Nina's face, with full control of the head, eyes, mouth, eyebrows, blinking etc., was the most complicated of all the tasks I'd set up.
Sometimes two hands just weren't enough.
TouchOSC is capable, I think, of sending up to 10 controls simultaneously, but my brain starts to melt once I try and think about more than three things at the same time. Ultimately I'll probably set up a bunch of buttons that just trigger pre-composed sequences, but for now it's all hands on deck.
 




18 April 2013

Ricochet - Feed Development Part 4



The characters play a game called Ricochet near the start of the story that basically involves slamming into one another and trying to rack up points by hitting the bouncing glowing bits. Again, not really a scene that was pivotal to the story but an opportunity to really try and get the tracking sorted out.
 
We did two tests, one with the players being tracked with a Kinect and the other just with me tracking them manually on the iPad, and all this was hooked up to an actual working game so that the actors could just try and play the game rather than trying to act like they were playing a game. The still images don't really do justice to it, but the difference in the quality of the actors' movement when their bodies were actually being tracked and when they were just pretending to play the game was startling. When they were actually being tracked you could see them crouching, ducking and moving with much more of a sense of purpose than when they were just pretending, which mostly involved spectacular leaps. The problem for now is that when they were jumping around and crossing over each other's paths, the tracker would get confused about who was who and the players would find that they were suddenly scoring own goals. This was a problem that I could avoid when I was manually tracking but, when I did that, the whole cause-and-effect sensation was broken. It might be possible to mount the Kinect in the roof looking straight down on the players so they never cross over, but my feeling is still that some kind of RF/WiFi tracking where each player has a unique id is the way forward. The game visuals could do with a bit more tarting up too really; I imagine something like plasma pong that you jumped around inside of would be absolutely mind blowing for actors and audience alike.


13 April 2013

UpCar™ Blues - Feed Natimuk Development Part 3


I think the UpCar™ scenes were one of my favourite experiments thus far. Not all that pivotal to the plot, more of a means of getting Violet and Titus from one scene to the next, but I was really keen to have a go at updating this...
(image source movietom.wordpress.com)
The old rear-projected screen behind the stationary vehicle was a classic staple of 50s and 60s cinema and I was itching to try it in a theatrical setting and put my own twist on it, not just projecting the background, but the car as well. Unlike the old cinema version, the car in Feed is free to turn in 3D and move about the stage, with the background streaming past depending on which way it's facing. The two actors playing the passengers are seated on the moving table just behind the sharks-tooth scrim, and the bright colours in the projection obscure them except for the windscreen, which has been left black to reveal their faces.

Theoretically it would be possible to track the passengers as they moved around and update the projections accordingly, but for the time being I was just tracking them manually with the iPad.
I could control the rotation (left-right, up-down and tilt) as well as the X, Y and Z position of the car in space, and it was pretty easy to keep up with the performers who were moving the passengers around on the stage. It would be possible, if you wanted, to have the car do a complete 360 with the passengers inside it. The background would just recede into the distance based on the direction the car was facing, and the whole scene is rendered in real time, so there is no need for the actors to have to try and move the car to exactly the right spot at the right time, which would have been almost impossible.
It was one of those things that seemed like it ought to work but really, until we set it up and actually tried it, it was impossible to know if it was going to be convincing enough for the audience to be worth doing. It was a lot of work to get this scene all set up, and when we first tried it at the Horsham workshop the scrim we were using was too transparent and the effect was totally lost (below is the extremely lame looking Horsham first take on the whole car thing).
I almost abandoned the idea at this point, so it was quite a relief to see it looking just so much more convincing with the sharks-tooth. Very pleasing after all that to finally see it working as well as it did.


12 April 2013

The Stagecraft of Feed - Natimuk Development Part 2


Violet's father (Craig) delivers his speech
One of the things we have been experimenting with was how we could assemble these few elements and combine them with projections to build a range of settings. The idea was to work with a set of basic but versatile elements and reconfigure them to construct everything we need for the show. As well as the semi-transparent white sharkstooth scrim across the front (for the feed and text messaging), we had three other elements, a table and two screens (roughly 3 x 4 metres), all on wheels. The two moveable screens actually had projectors mounted behind them so that they could be continuously projected upon even as they were being moved around the stage, regardless of what angle they were facing (originally the plan had been to track and project onto these surfaces but it just made so much more sense to mount the projectors on them and be done with it).

For simplicity I kept the controls for the two mobile screens on a separate controller so that someone else could operate them if the backgrounds needed to change while I was concentrating on the foreground animation.
At various points the two screens form the internal corner of a seedy motel, the external street corner of a building, a flat wall with an open doorway and a range of other combinations besides. To complement this, the mobile bench served variously as bed, UpCar, kitchen table etc. (and will likely also be used as an additional projection surface in the final production).


Titus and Violet (James and Libby) at the hotel.

One of the goals of this development phase was to experiment with the way the transition from one scene to the next could work as a theatrical moment in its own right, taking the audience on a mental journey as the elements of one scene transform into the next. There's a moment of confusion as people's brains struggle to comprehend what they're actually seeing, still trying to hold onto pieces of the old scene as it falls apart, and then the payoff and a wave of comprehension as the new scene drops into place. When it's well done, it's something I've always enjoyed in animation (both watching and striving to create). It's even more exciting trying to make it happen live on the stage.

The scene below is a good example: the kitchen walls drop away into darkness, the table transforms into an UpCar™ and suddenly we are on a joy flight through the city.

Domestic Bliss™ with James, Craig and Naja

Titus and Violet (James and Libby) in the UpCar
The UpCar™ scene was pretty much my favourite thing we attempted this time round, so I'm going to save it for a whole post of its own...

coming soonish...




07 April 2013

Feed 86% - Natimuk Workshops

Recently we ran another Feed workshop in the Natimuk Hall inviting a group of local actors to interact with the animated scenes I'd been developing over the past few months.  

The main plan was to see how performers actually meshed with the projections I'd been developing, both in terms of how they could engage with the projection and how well the projections would respond to them. You can imagine how a lot of things might work when you're looking at them on the computer screen, but until you actually see them on a stage with real people it's hard to be sure. Often the live effect is very different from the screen effect, particularly the speed things move at: things that seem to drift around the monitor in a leisurely manner rip across the scrim at lightning speed, confusing the performers and inducing seizures in the audience. Luckily a lot of the worst of this became obvious during the Horsham workshops late last year, so I was a lot more mindful of it going into it this time.
One of the key elements of the play is the feed itself, a cloud of data that tracks each performer, full of contextually relevant info, helpful pointers and targeted advertising. Characters can review new images from the feed, send images to one another, share them with a group or communicate via text. All while moving around on the stage.
The script requires that there be as many as seven performers on stage doing this at any one time, and keeping track of it all, and keeping it even vaguely reliable, has been the biggest challenge of the project so far. I've been using a range of techniques: video tracking, the Kinect, and manual tracking with the iPad via TouchOSC. Each method has its strengths and weaknesses, and likely the ultimate solution will feature some kind of combination of the techniques. Manual tracking works really well up to about four objects, at which point everything goes to pot and the operator's head explodes. The Kinect is awesome (the depth tracking, and for not requiring visible light) but has some annoying limitations. It is limited to roughly a 6-metre area and it can't see through the scrim we need to have hanging across the stage, so it can only track people in front of that. Visible light tracking allows for a greater range and sees right through that scrim, but it lacks the depth perception of the Kinect and is sensitive to visible light, and therefore gets confused by the projections. The biggest problem with all the automated tracking is that the computer tends to get confused when performers cross paths or bump into each other (that's where human operators really come into their own). Something I haven't tried yet but would like to is planting some kind of wi-fi tracker on each performer, allowing them to be tracked and uniquely identified anywhere in the room. From what I've read, there can be issues with accuracy and lag, but if it could be got working it ought to be perfect. Anyone with any experience of this, PLEASE get in touch.

And thanks again to Regional Arts Victoria, whose support allowed this recent stage of the development to happen. It feels like we've made massive headway over the last few months and the whole project is creeping steadily closer to being a reality.

These are all stills from the video we shot (thanks Gareth Llewellin). I'll be uploading that once I've had a chance to edit something together, but likely there'll be a bunch more stills first focusing on some of the scenes we worked through.