Moved!

I am moving my blog over to……

http://alexjohncampbell.com/fdadmp

So go there from now on!

Tim Exile and Reaktor

After looking at Aphex Twin and what he uses to create his music, I found out about a piece of software called Reaktor. It is used to create music and has the potential to build a track from scratch. I saw a video of an artist called Tim Exile performing with a massive setup based around this piece of software.

http://www.native-instruments.com/en/embed/1035

Here is a video of him explaining his setup. Very interesting, but it must take a lot of time and practice to know your way around a setup like that as well as he does.

http://www.native-instruments.com/en/embed/1034

Performance Video

For this project I will be working by myself, as I have a few ideas I would like to pursue. I play the drums, or at least I used to when I had a drum kit available, so most of my ideas so far are based heavily on percussion and beats, although I will try to think up some completely unrelated ideas as well.

Up until Tuesday just gone, I didn’t have any strong ideas, but after Bob’s session I felt much more inspired. He showed us many artists; the first that I found really interesting was John Cage. He showed us a video of him performing ‘Water Walk’.

I thought that he was really ahead of his time, pushing the boundaries of even today’s audiences, let alone the audience of 1960. I liked his attitude: when the presenter says some of the audience will laugh, John Cage replies, ‘I consider laughter preferable to tears’.

Although I don’t want to produce such an experimental track, I do like the way he used a strange mix of everyday objects as instruments.

On a side note, here is a very interesting interview that shows John Cage’s thoughts and ideas on music and sound.

The second artist he brought up that I really liked was Chris Cunningham. He showed us his ‘Monkey Drummer’. I thought it was excellent how it illustrated the incredibly fast beats and their complexity. You find yourself watching the complicated actions of each limb and listening out for their sound.

Chris Cunningham often works with Aphex Twin, aka Richard David James, an electronic music artist. I find Aphex Twin’s music interesting, and some of the elements and techniques he uses are ones I would be interested in using myself.

The initial idea that sprang to mind was to create some kind of performance of a track, including a storyline and a performance by a distorted, human-like figure.

I then thought it would be cool to let the audience make some kind of music themselves, involving them in the performance. I thought about making a user interface that allows the user to compose a whole track from pre-recorded beats and rhythms, and to add different effects, adjust the tempo, completely change the layout, add breaks and more.
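
This is only a rough idea so far, but to make it more concrete, here is a minimal sketch (in Python, purely as an illustration of how the data behind such an interface might be arranged; the Clip and Track names and fields are my own placeholders, not anything I have actually built yet):

from dataclasses import dataclass, field

@dataclass
class Clip:
    name: str                                       # e.g. a pre-recorded drum loop or fill
    beats: int                                      # length of the loop in beats
    effects: list = field(default_factory=list)     # e.g. ["reverb", "delay"]

@dataclass
class Track:
    tempo_bpm: float = 120.0
    sequence: list = field(default_factory=list)    # ordered clips, including breaks

    def add_clip(self, clip: Clip) -> None:
        self.sequence.append(clip)

    def add_break(self, beats: int) -> None:
        # a 'break' is just a silent clip of a given length
        self.sequence.append(Clip(name="break", beats=beats))

    def duration_seconds(self) -> float:
        total_beats = sum(c.beats for c in self.sequence)
        return total_beats * 60.0 / self.tempo_bpm

# Example: the user builds a short arrangement and tweaks the tempo.
track = Track(tempo_bpm=100)
track.add_clip(Clip("drum_loop", beats=8, effects=["reverb"]))
track.add_break(beats=4)
track.add_clip(Clip("drum_loop", beats=8))
print(f"Arrangement lasts {track.duration_seconds():.1f} seconds")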

Work Placement

Work Placement: Tuesday, Week One.

This was my first day. I turned up at 10:30 with Luke and met everyone at SiSo. I settled in pretty quickly and got to work. My job was to Photoshop an image, basically to remove some things and add some bits in. All in all, it was an OK day!

Post Production

To cut a long story short, after starting a few ideas and not following them through, I decided to go with the idea about the in-brain sat nav.

I decided to have a man lose his keys in the garden and have the sat nav HUD guide him to them.

Here is the storyboard I used to film with.

I did have a few problems. Firstly, I filmed on a standard-definition DV camera. The footage looked OK, but motion tracking was very difficult as there was no real detail for it to lock onto. I was advised to re-film in HD, so I borrowed the Sony HD DV camera and filmed again. The footage was very clear and motion tracking was not so difficult.

Also, the first time I filmed, I didn’t put anything specific in shot to track to. The second time I shot, I used an orange squash bottle to track to, and in the garden I used the washing line.

The HUD consisted of a battery indicator, signal bars and the arrow that pointed to the lost item. I made all of these icons in Maya and exported them as PSDs to retain their alpha layer.

Here is the final video.

I think the final product was acceptable, and after reading my description it is clear what is going on. Some of the HUD looks a little basic and could have done with more detail. The motion tracking worked pretty well considering it was very wobbly, hand-held footage, although in places the arrow didn’t move when it needed to.

If I were to do it again, I would make an introduction screen describing what this was. I would add some distinct tracking points and edit them out afterwards for smoother motion tracking. Finally, I would add some more features to the HUD, such as distance traveled, time, speed etc., like a normal sat nav.

Post Production

I have started work on my project, and the one I chose was the one involving compositing a Maya-designed character into a real-life scene.

I have decided on a design for the character:

I have also drawn up a story board.

Firstly the person walks into the room.

Then he spots the character in the kitchen.

Then camera shots of their eyes show the character’s confusion/simpleness and the person’s fear, done in a comical way.

The person then pauses, and sprints off, slipping up a bit.

Finally, the character says something along the lines of ‘I wonder what’s up with him?’

I have now started work on the character in Maya, and have found a high quality camera that I can borrow.

Post Production

I know that I should be set on my idea and getting on with it… but I’m a little undecided still.

The first idea, compositing the characters made in Maya into real footage, would be excellent but technically difficult, though that’s not to say it can’t be done. Here are a few drawings I have done for the monster ideas.

My second idea, the real-life GPS finder thing, is still in development.

I will make a final decision in the next day or so; at the moment I am leaning towards the characters.