VFX – Week 4 of digital media – level 2

In the second week of Jon’s VFX workshops we learnt how to use 3D tracking in a shot and exported some of our own attempts at 3D tracking within After Effects (you can find these at the bottom of this post). To get a three-dimensional track of the video we needed to use the Track Camera option in the tracking window. This tracks all of the potential tracking points within the shot and then uses triangulation to create a map of points, producing a pseudo-3D track of a 2D video. These tracking points can then be applied to a null object by highlighting certain tracking points, right clicking and choosing one of the options depending on the job you are doing. For instance, in the first video I used ‘Create Text and Camera’ as I was adding text into the video, whereas in the second video I used ‘Set Ground Plane and Origin’ so that the software knew that the triangulated area was the ground. With any of the options an in-software camera is created so that the added graphics can all be applied correctly.
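To give a feel for the triangulation idea, here is a minimal sketch in Python of the textbook linear method for recovering a 3D point from the same feature seen from two camera positions. This is not After Effects’ actual solver, just an illustration of the maths; the camera matrices and the point are made-up toy values:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two views.
    P1, P2: 3x4 camera projection matrices; x1, x2: 2D image points."""
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The solution is the right singular vector with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

def project(P, X):
    """Project a 3D point into a camera's image plane."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Two toy cameras: one at the origin, one shifted 1 unit along X.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

X_true = np.array([0.5, 0.2, 4.0])  # a point in front of both cameras
X_est = triangulate(P1, P2, project(P1, X_true), project(P2, X_true))
print(np.round(X_est, 3))  # recovers the original point
```

The camera tracker does something like this for hundreds of points at once, which is how a flat 2D video yields a usable 3D map.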

I think that in the first 3D tracking shot I used a good light angle to match the light diffusion in the room, and the softness of the shadow was about right to match the real-life elements in the shot. I do, however, think the tracking could have been executed slightly better: when the camera pans and moves into the third dimension the text doesn’t quite stick to the box. This may be because I was messing around with some co-ordinates and possibly changed the Z axis, so the text wouldn’t sync with the target I had previously made.

On the editing of the second video, I don’t think the effect sold that well, mainly due to my placement of the hole and the angle it was set at, which just didn’t sit right, though I didn’t understand why. However, I do think the effect of the inner sides of the hole moving separately due to parallax worked very well, and I really like the idea that it’s the little touches that really sell the effect.

Finally, I learnt that VFX is different to SFX: SFX is done practically in real life, whereas VFX is done in the editing suite in post-production. I didn’t know this before, but I’m sure it will be useful to know so that I don’t sound like an idiot when talking to industry professionals!

 

VFX – Week 3 of digital media – level 2

This was the first week of VFX with Jon, in which we learnt about the applications of 2D tracking, which situations suit it best and how to use it in Adobe After Effects. The premise of 2D tracking is to follow tracking points using X and Y co-ordinates in order to create visual effects within the video. A tracking point in After Effects follows a small cluster of pixels from one frame to the next. So, for instance, if you set a tracking point on an orange sticker on the end of a finger, it would track the cluster of orange pixels you selected when using the Track Motion option in the tracking window of After Effects. This process takes a little while due to the amount of work going on. Although we mainly tracked using contrast, Jon did tell us that you can also track using luminance to find a key cluster of pixels to fix upon.
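As a rough illustration of what “following a cluster of pixels” means, here is a toy tracker in Python that finds where a small patch moved between two frames by minimising the sum of squared differences over a search window. This is a simplification, not what After Effects actually does internally, and the frames and “sticker” are made-up values:

```python
import numpy as np

def track_patch(prev, curr, top_left, size, search=5):
    """Find where a small pixel patch from `prev` moved to in `curr`
    by minimising sum-of-squared-differences over a search window."""
    y, x = top_left
    h, w = size
    patch = prev[y:y+h, x:x+w].astype(float)
    best, best_pos = np.inf, top_left
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            ny, nx = y + dy, x + dx
            # Skip candidate positions that fall outside the frame.
            if ny < 0 or nx < 0 or ny + h > curr.shape[0] or nx + w > curr.shape[1]:
                continue
            cand = curr[ny:ny+h, nx:nx+w].astype(float)
            ssd = np.sum((patch - cand) ** 2)
            if ssd < best:
                best, best_pos = ssd, (ny, nx)
    return best_pos

# Toy frames: a bright 3x3 "sticker" that shifts 2px right and 1px down.
prev = np.zeros((20, 20)); prev[5:8, 5:8] = 255
curr = np.zeros((20, 20)); curr[6:9, 7:10] = 255
print(track_patch(prev, curr, (5, 5), (3, 3)))  # → (6, 7)
```

You can also see from this why motion blur is a problem: if the patch is smeared across frames, no candidate position matches it cleanly and the minimum becomes ambiguous.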

I found out from Jon that it is best to shoot with a high shutter speed to reduce motion blur, because the tracker finds it hard to lock on to specific pixels when the whole shot is blurred by motion. Jon also told us that you should always shoot with your VFX in mind: if you have high contrast between the foreground and the background and you want to do a sky replacement, you should get the foreground correctly focussed and leave the sky overexposed so that it will be easier to key out in post-production.

I understood the overall idea of 2D motion tracking and what it can and cannot do in terms of adding elements to a shot. For example, there are several things that can hinder an editor’s tracking of a shot: objects moving in front of the tracking point, the tracking point moving off screen, too much motion blur, and shots that use depth on the Z axis, adding a third dimension which 2D tracking will not work for.

Augmented reality – Week 2 of digital media – level 2

In the second week of the digital interactions segment with Clive we learnt about augmented reality, a technology that overlays a computer-generated picture or video onto a person’s view of the real world, creating a view that is an augmentation of their reality.

Clive taught us that with any augmented reality software the key point is that there needs to be a fixed point for the animation; this allows the software to track the video and anchor the animation to a specific tracking point. We were shown an app called Augment, a basic version that can be used on phones. As soon as I saw it I had to get it and play around with the features (my favourite being the dancing skeleton).

In the session we learnt about the history of AR and the applications it can have in many different areas, such as art, gaming, architecture, psychology and medical treatment. I could really see the vision that Clive and others within the augmented reality industry have, and the prospect of how versatile AR can be in any sector to make the lives of modern humans much easier.

I have a feeling that I will want to use AR in my convergence task, as I see great potential in the applications I could use it for.

Projection mapping – Week 1 of digital media – level 2

Clive showed us an array of projection mapping (PM) tools such as the computer software HeavyM and the iPhone/Android app DynaMapper. It was interesting to see how easy it was to do simple projection mapping with different content such as JPGs and MP4s on software as basic as a phone app!

I also enjoyed learning a brief history of PM, with companies like Disney using a lot of projection, which I did not know before. I am a big fan of co-ordinates and used to love making graphs in maths at school, so when I heard that PM uses X and Y to map out areas I got a little excited! I also learned that although most demonstrations of PM are for art purposes, there is also a market for practical applications in the real world, especially in architecture and other design-led jobs.
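Since the X and Y mapping idea got me excited, here is a small Python sketch of the standard maths behind it (not how HeavyM works internally): fitting a homography that maps the corners of a flat piece of content onto the four corners of a surface as seen by the projector. All the corner co-ordinates here are made-up values:

```python
import numpy as np

def fit_homography(src, dst):
    """Solve for the 3x3 homography H mapping src corners to dst corners
    via the direct linear transform (DLT)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The solution is the right singular vector with the smallest singular value.
    _, _, Vt = np.linalg.svd(np.array(A))
    return Vt[-1].reshape(3, 3)

def apply_h(H, pt):
    """Map one (x, y) point through the homography."""
    p = H @ np.array([pt[0], pt[1], 1.0])
    return p[:2] / p[2]

# Map a unit square of content onto a skewed quad (e.g. one face of a building).
src = [(0, 0), (1, 0), (1, 1), (0, 1)]
dst = [(10, 12), (95, 18), (90, 80), (8, 70)]
H = fit_homography(src, dst)
print(np.round(apply_h(H, (0.5, 0.5)), 1))  # the content's centre lands inside the quad
```

Each mapped surface in a PM tool is essentially one of these warps, which is why dragging four corner handles is all it takes to fit content to a wall.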

During the session I took my chance and downloaded HeavyM, and to my surprise I found it so easy to use. I had no idea that projection mapping could be edited natively within software without the use of coding, and that it can integrate with other mediums such as music natively as well! When I got home I created a few small PM ‘canvases’ that I could have used to project onto different objects within my house if I had a projector. Find attached some screenshots of some really cool PM’d buildings.

[Image: Projection Mapping Using Christie Projectors (Mosaika – Parliament Hill Building)]

All in all a very good lesson on an interesting subject.