Showing posts with label making of.

23 March 2017

Balloons Pt 2

Testing the balloons on the water for height, size, etc.

Somehow, after the stillest day ever, we were struck by a freak windstorm mid-test (hence the wobbliness of some of the balloons).

Still a successful test, all in all.





Photos by Greg Pritchard

08 March 2017

Red V Blue - Tyrendarra Style

Took the unicycle puppets down the coast to the Tyrendarra Show the other weekend (the centenary edition of the Tyrendarra Show, no less! That's pretty impressive).
 
Thanks to Adam Demmert for the photos (and all the help keeping it running over the course of the day).

And of course the inevitable hip, shoulder and buttock replacement. This is like the smash-up derby of puppet shows.



Even had a bit of time to streamline the setup, swapping out the ramps for some hyper-active suspension in the unicycles.





Balloons Pt 1

Getting ready for a project up in Wagga at Easter.

Early testing




19 March 2015

Seeds of change (Part 2) - Self portrait with happy face

In the deep freeze at the Seedbank, installing some more light extrusions. In spite of the nice warm oranges and pinks in this shot, it's a nose-blistering -22°C in here.

17 February 2014

Captives of the City

A short compilation from the workshop I did with Chamber Made Opera last year.


Captives of the City - Demo from dave jones on Vimeo.

Whether it makes the final show or not, I think my favourite thing was the little pico projector getting towed along as Jacob manipulates the projected puppet. I think there's some great potential here for hand-held, battery-powered projectors to get animation into some pretty interesting places.

There's another workshop coming up in March, so hopefully another chance to keep experimenting.

Firstly though, I'm about to head up to APAM 2014 to help unleash the MadeInNatimuk brand upon the world and see if we can't get any of the shows we've made remounted. Some of those projects were so much work to bring together that it seems a shame not to give them a chance at another run.

So fingers crossed....and more in a bit.

27 December 2013

Art Pumping Action - Bike Tech

As I brace myself for the onslaught of the Falls Festival in Lorne, here's a hastily slapped-down breakdown of the bike-powered computer game concept we launched at the Nati Frinj. There were three games made, but they all hinged on the use of an Arduino to forward a bunch of information captured off the bike via a range of sensors into Unity, which generated the visuals in response. The Oculus Rift was a late but excellent addition to the setup. I had originally intended to build a little head-sized projection booth so that only the cyclist could see.
Compared to this, though, a pair of goggles that responds to the orientation of the rider's head and provides the visuals in 3D just takes it a step beyond the original concept. When we were building this rig the plan was that all the power for the game be generated by the cyclist, and the power consumption of the goggles beat out any comparable projection system (without even factoring in the nightmare of me trying to sculpt up a fibreglass projection dome). I must be the only person ever to have bought an Oculus Rift for its relatively low power consumption.
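For the curious: the Arduino-to-Unity link in rigs like this is usually just a line-based serial stream. Here's a rough Python sketch of what one sensor frame might look like; the field names and layout are invented for illustration, not pulled from the actual build:

```python
# Hypothetical serial frame: "S:<wheel_rpm>,T:<steer_raw>,B:<button_bits>\n".
# The Arduino side would emit one of these per tick; Unity's reader parses it.

def encode_frame(wheel_rpm: float, steer_raw: int, button_bits: int) -> bytes:
    """Pack one sensor snapshot into a newline-terminated ASCII frame."""
    return f"S:{wheel_rpm:.1f},T:{steer_raw},B:{button_bits}\n".encode("ascii")

def decode_frame(frame: bytes) -> dict:
    """Parse a frame back into named fields (as the game side might)."""
    fields = dict(part.split(":") for part in frame.decode("ascii").strip().split(","))
    return {"wheel_rpm": float(fields["S"]),
            "steer_raw": int(fields["T"]),
            "button_bits": int(fields["B"])}
```

Plain ASCII with a newline terminator is slow but trivially debuggable with a serial monitor, which matters a lot more than bandwidth at festival-rig scale.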
  
There is a video of the whole Art Pumping Action thing (with the actual virtual reality part starting at about one minute in).

Art Pumping Action Compilation from dave jones on Vimeo.

The three bikes were placed radiating out from the central table so that we could get away with one of the riders throwing themselves off the bike without setting off a domino effect.

You may snicker, but I was genuinely surprised at how strong the compulsion was to throw oneself off a stationary bike whilst trying to lean into the corners. The more experienced the cyclist, the bigger the leaning issue. During the early test run Callum (who actually rigged these bikes) threw himself off and kicked in my favourite laptop screen (which would have been a complete disaster if I hadn't got some pretty good footage of him doing it).

So all the goodies, Arduino etc., are packaged up in the blue water bottle container there (genius idea courtesy of Callum).


The light mounted on the front of the bike and powered by the generator on the back wheel provided the load. Turning the brightness of the light up and down makes more or less work for the rider.
A hall effect sensor on the back wheel measures how fast the wheel is turning and sets the speed of the bike in the game.
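The pulse-to-speed maths is short enough to sketch. This assumes one magnet on the wheel and a guessed ~2.1 m circumference; neither figure is from the actual build:

```python
def speed_kmh(pulse_interval_s: float, wheel_circumference_m: float = 2.1,
              magnets_per_rev: int = 1) -> float:
    """Convert the time between hall-sensor pulses into road speed.

    One pulse per magnet pass; the default circumference is a rough
    guess at a road-bike wheel, not a measured value.
    """
    if pulse_interval_s <= 0:
        return 0.0  # no pulse yet (or garbage) -> stationary
    revs_per_s = 1.0 / (pulse_interval_s * magnets_per_rev)
    return revs_per_s * wheel_circumference_m * 3.6  # m/s -> km/h
```

One gotcha worth noting: with a single magnet, a stopped wheel produces no pulses at all, so the game also needs a timeout that forces the speed to zero when pulses stop arriving.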


The dial rubbing against the front steering fork (scavenged off an old oven) is stuck to a potentiometer and feeds back the steering info.
In the background, the screen is displaying the rider's-eye view. I hadn't intended to leave this here for the duration, but it was such a crowd pleaser that I left it there for the whole festival. It was great when the people queuing up to have a go were cheering on the rider.
In this particular game the goal was to round up sheep and flick them into the trailer you are towing (by running them down). The game was conceived by one of the students during a workshop at the local primary school.
In the background you can see the regulator and battery setup, courtesy of Greer Allen of Magnificent Revolution. It does a great job of capturing and storing all the spare power generated by the bike. The bike is more than capable of generating the power to run the laptop (35 watts) and Oculus (5 watts, plus maybe some coming out of the computer's USB???).
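The oven-dial pot above is just a voltage divider, so turning its reading into a steering angle is simple scaling. A sketch, assuming a 10-bit ADC (0-1023, as on an Arduino) and a made-up 45° lock; the deadband value is invented too:

```python
def steering_angle(adc_raw: int, adc_max: int = 1023,
                   max_angle_deg: float = 45.0,
                   deadband_deg: float = 2.0) -> float:
    """Map a raw pot reading to a signed steering angle in degrees.

    Centre of travel is straight ahead; a small deadband stops the
    in-game bike twitching while the rider holds roughly straight.
    """
    centred = (adc_raw / adc_max) * 2.0 - 1.0   # 0..adc_max -> -1.0..1.0
    angle = centred * max_angle_deg
    return 0.0 if abs(angle) < deadband_deg else angle
```

In practice you'd also calibrate the actual centre reading at startup, since a fork-mounted dial rarely sits at exactly half its travel.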

The generator sits up against the back wheel replacing the standard roller in one of those bike training stands. Having the adjustable height feature of the training stand was pretty convenient. In the background you can see the regulator (in the old ammo box) that could take the surplus power from all the bikes and feed it into the (yellow) deep cycle battery.

The bike, when it was going hard, was able to put out about 100 watts.
The more electricity the bikes had to produce, the harder it was to spin the generator.
This was hooked up to the gradient of the hills the rider had to climb in the game: the steeper the hill, the brighter the light, and the harder it was to pedal. The result was surprisingly convincing.
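The gradient-to-brightness hookup could be as simple as the mapping below. The PWM range, floor value and gradient cap here are illustrative guesses, not the numbers from the rig:

```python
def light_pwm_for_gradient(gradient_pct: float, base_pwm: int = 40,
                           max_pwm: int = 255,
                           max_gradient_pct: float = 15.0) -> int:
    """Map the in-game hill gradient to the headlight's 8-bit PWM duty.

    Brighter light = bigger electrical load = harder pedalling.
    base_pwm keeps a minimum load so flat ground still feels like riding.
    """
    g = max(0.0, min(gradient_pct, max_gradient_pct)) / max_gradient_pct
    return round(base_pwm + g * (max_pwm - base_pwm))
```

Keeping a non-zero floor matters: with zero load the generator freewheels and the pedals feel weirdly dead, which breaks the illusion as badly as too much resistance.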
Just for good measure I wired up a bunch of bells and horns at the front that could be used to jump, fire or whatever. I didn't end up taking as much advantage of these as I thought I might... the main reason being the oversight that when you've got the Oculus goggles on, it's actually pretty hard to find the bells.

Heading off in the morning to go down and install these beauties at the Falls Festival in Lorne and, quite frankly, I'm a little daunted about how they might hold up to the onslaught of 16,000 adrenaline-fuelled youths over New Year's.

Perhaps this shall be my last post....

05 August 2013

Feed Development Video

Here is a five-minute edit from some of the Feed development workshops (mostly from the Natimuk workshops).


Feed: Development from dave jones on Vimeo.


Currently working towards the next Nati Frinj Festival. We've been extremely lucky with our funding applications this year and it looks like a lot, lot, lot of good things are starting to come together. Worth another blog post in its own right.

soon...




....no really!

24 April 2013

Disembodied heads - Feed Development Part 5


Nina.

The voice of the Feed.

She is Violet's personal Feedtech assistant. Benign at first, then increasingly less so as the story goes on and Violet's situation worsens.


We still haven't decided if she's the manifestation of someone in a remote call centre or entirely computer generated.
The mouth responds to an audio Feed, generated by one of the performers on stage with a microphone.
With full control of the head, eyes, mouth, eyebrows, blinking, etc., Nina's face was the most complicated of all the tasks I'd set up.
Sometimes two hands just weren't enough.
TouchOSC is capable, I think, of sending up to 10 controls simultaneously, but my brain starts to melt once I try to think about more than three things at the same time. Ultimately I'll probably set up a bunch of buttons that just trigger pre-composed sequences, but for now it's all hands on deck.
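The audio-driven mouth is essentially an envelope follower: take the loudness of the incoming audio and smooth it into a mouth-open value. A minimal sketch of the idea; the attack/release numbers here are invented, not what the show used:

```python
def mouth_openness(samples, prev: float = 0.0,
                   attack: float = 0.6, release: float = 0.15) -> float:
    """Turn a buffer of audio samples (-1..1) into a mouth-open value (0..1).

    Fast attack / slow release: the mouth snaps open when speech starts
    and closes gently, rather than fluttering with every waveform wiggle.
    `prev` is the value returned for the previous buffer.
    """
    peak = min(max((abs(s) for s in samples), default=0.0), 1.0)
    rate = attack if peak > prev else release
    return prev + (peak - prev) * rate
```

Call it once per audio buffer and feed the result straight to the jaw control; tuning the release rate is what sells the "talking" illusion.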
 




18 April 2013

Ricochet - Feed Development Part 4



The characters play a game called Ricochet near the start of the story that basically involves slamming into one another and trying to rack up points by hitting the bouncing glowing bits. Again, not really a scene that was pivotal to the story but an opportunity to really try and get the tracking sorted out.
 
We did two tests: one with the players being tracked with a Kinect, and the other with me tracking them manually on the iPad. All of this was hooked to an actual working game, so the actors could just try to play the game rather than trying to act like they were playing one.

The still images don't really do it justice, but the difference in the quality of the actors' movement when their bodies were actually being tracked, versus when they were just pretending to play, was startling. When they were actually being tracked you could see them crouching, ducking and moving with much more of a sense of purpose; when they were just pretending, it mostly involved spectacular leaps.

The problem for now is that when they were jumping around and crossing over each other's paths, the tracker would get confused about who was who, and the players would find they were suddenly scoring own goals. This was a problem I could avoid when I was manually tracking, but when I did that, the whole cause-and-effect sensation was broken. It might be possible to mount the Kinect in the roof looking straight down on the players so they never cross over, but my feeling is still that some kind of RF/WiFi tracking, where each player has a unique ID, is the way forward.

The game visuals could do with a bit more tarting up too, really. I imagine something like Plasma Pong that you jumped around inside of would be absolutely mind-blowing for actors and audience alike.
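For what it's worth, one standard trick for the who-is-who problem is greedy nearest-neighbour matching between last frame's known positions and this frame's anonymous detections. This is a sketch of the idea, not code from the show; the max_jump threshold is an invented number:

```python
import math

def assign_ids(tracked, detections, max_jump=1.5):
    """Match known players onto this frame's anonymous detections.

    tracked:    {player_id: (x, y)} from the previous frame.
    detections: [(x, y), ...] from the tracker, in no particular order.
    Returns {player_id: (x, y)}. A detection further than max_jump from
    every known player stays unmatched rather than being guessed at --
    guessing is exactly what produces the own-goal identity swaps.
    """
    pairs = sorted(
        (math.dist(pos, det), pid, i)
        for pid, pos in tracked.items()
        for i, det in enumerate(detections))
    matched, used_ids, used_dets = {}, set(), set()
    for d, pid, i in pairs:
        if d > max_jump:
            break                      # sorted, so everything after is worse
        if pid in used_ids or i in used_dets:
            continue
        matched[pid] = detections[i]
        used_ids.add(pid)
        used_dets.add(i)
    return matched
```

Greedy matching still fails when two players genuinely swap places between frames, which is why the unique-ID-per-player hardware route is appealing.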


07 April 2013

Feed 86% - Natimuk Workshops

Recently we ran another Feed workshop in the Natimuk Hall, inviting a group of local actors to interact with the animated scenes I'd been developing over the past few months.

The main plan was to see how performers actually meshed with the projections I'd been developing, both in terms of how they could engage with the projection and how well the projections would respond to them. You can imagine how a lot of things might work when you're looking at them on the computer screen, but until you actually see them on a stage with real people it's hard to be sure. Often the live effect is very different from the screen effect, particularly the speed things move at: things that seem to drift around the monitor in a leisurely manner rip across the scrim at lightning speed, confusing the performers and inducing seizures in the audience. Luckily a lot of the worst of this became obvious during the Horsham workshops late last year, so I was a lot more mindful of it going in this time.
One of the key elements of the play is the Feed itself: a cloud of data that tracks each performer, full of contextually relevant info, helpful pointers and targeted advertising. Characters can review new images from the Feed, send images to one another, share them with a group, or communicate via text, all while moving around the stage.
The script requires that there be as many as seven performers on stage doing this at any one time, and keeping track of it all, and keeping it even vaguely reliable, has been the biggest challenge of the project so far. I've been using a range of techniques: video tracking, the Kinect, and manual tracking on the iPad via TouchOSC. Each method has its strengths and weaknesses, and likely the ultimate solution will feature some combination of them.

Manual tracking works really well up to about four objects, at which point everything goes to pot and the operator's head explodes. The Kinect is awesome (the depth tracking, and not requiring visible light) but has some annoying limitations: it is limited to roughly a 6-metre range, and it can't see through the scrim we need to have hanging across the stage, so it can only track people in front of that. Visible light tracking allows for a greater range and sees right through the scrim, but it lacks the depth perception of the Kinect, and being sensitive to visible light it gets confused by the projections.

The biggest problem with all the automated tracking is that the computer tends to get confused when performers cross paths or bump into each other (that's where human operators really come into their own). Something I haven't tried yet, but would like to, is planting some kind of Wi-Fi tracker on each performer, allowing them to be tracked and uniquely identified anywhere in the room. From what I've read there can be issues with accuracy and lag, but if it could be got working it ought to be perfect. Anyone with any experience of this, PLEASE get in touch.
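On the manual-tracking side, the TouchOSC setup boils down to routing incoming OSC messages into a shared performer-position table that the projection code reads. A tiny sketch of that routing, with an address layout (`/tracker/<id>/xy`) I've invented for illustration:

```python
def handle_osc(address, args, positions):
    """Route one TouchOSC-style message into a performer-position table.

    e.g. handle_osc('/tracker/3/xy', [0.42, 0.77], positions) records
    performer 3 at normalised stage coordinates (0.42, 0.77).
    The /tracker/<id>/xy address scheme is a made-up example, not the
    layout from the actual show.
    """
    parts = address.strip("/").split("/")
    if len(parts) == 3 and parts[0] == "tracker" and parts[2] == "xy":
        positions[int(parts[1])] = (float(args[0]), float(args[1]))
    # Unrecognised addresses are simply ignored.
```

A nice property of this scheme is that manual and automated trackers can write into the same table, so one operator can "rescue" whichever performers the Kinect has lost.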

And thanks again to Regional Arts Victoria, whose support allowed this recent stage of the development to happen. It feels like we've made massive headway over the last few months, and the whole project is creeping steadily closer to being a reality.

These are all stills from the video we shot (thanks Gareth Llewellin). I'll be uploading that once I've had a chance to edit something together, but likely there'll be a bunch more stills first, focusing on some of the scenes we worked through.