Tuesday 14 June 2011

The VISUALISATION

The fun part came when we were able to actually get outside and experiment with the device itself for the first time. People were curious as to what exactly we were doing out in the city with this strange object filming them. This was perfect because it meant that our performance was working. We tried to use a range of textures, from brick and cobblestone to grass, to convey our theme. It led to some very interesting footage. I felt as though the visuals were very artistic, and because of the video's abstract nature, we found ourselves accidentally getting some awesome visuals. Even something as simple as someone looking at their phone really stood out in my eyes. Whenever the robot was off the ground the visuals 'fuzzed' out and went black, which really helped with editing: it always gave smooth transitions between clips and a very smooth flow to the video as we changed from each person's individual use.

The PRESENTATION


All our hours of hard work have finally paid off. Not only that, but our presentation was also a success. We incorporated the hexagon shape into the overall design: we glued still images onto the surfaces of black hexagons, which were then glued onto 3D foam hexagons on the wall to tie back to our project, which revolved around 3D. We also had a light shining from the roof which cast shadows on the walls to further reinforce this. The best part about the formal presentation was that we gained vital feedback on what the tutors really thought about our design and ideas. I had anxiously been waiting for this, and it gave me a lot more understanding of the project itself and how it could have been improved, not only physically but also conceptually.

I felt as though the production was a bit too rushed, as we only finished the project a day before we were due to present. I knew that Cory and Tom were doing a lot of the work for the presentation, and I offered to help continuously, but since a lot of the work was waiting on the 3D workshop to finish cutting out pieces of foam, there wasn't really a need for all four of us in the 3D lab. I felt unwanted, as I had no experience with the lab. I refuse to say that I didn't take part, because I did: every time they went up to the lab I would come with them, and even though they didn't want any of my help, I still came to show I wanted to be involved. I edited the visualization and put it all together, helped paint, shopped around for the glue and string to hold it together, and programmed the servo to run the physical visualization, among many other chores. So to claim that the group was split in half, with one side doing work and the other slacking off, would not be true just because two members did a lot of the fabrication work. We also only just finished the project before the weekend, and during the weekend I was working all through the day. I offered to come in and help after 5 pm, but no one wanted to stay after that time. I stayed at uni late at night many times, but my last bus leaves at midnight and I literally have to go home after that.

I believe that I did make a large contribution to this group and tried to work as hard as I possibly could, 100% of the time. Being the programmer, I felt distanced from the group, as they were always in the 3D lab fabricating the design, which is why I think the group felt so much tension near the end of the project. Nonetheless, up until the presentation we all worked very smoothly. I enjoyed this project; it was a lot of fun and I enjoyed working with the group.

Sunday 12 June 2011

Project 4 - Three Experimental Works

ARTIFICIAL NATURE:
While researching ideas for our project I stumbled upon a very interesting concept currently in development. Artificial Nature is a transdisciplinary research project into creativity in complex systems, which investigates environment generation through computational generative ecosystems. Its whole purpose is to explore creativity as a form of art, study and play, taking inspiration from nature's creativity while recognizing the potential of natural creation beyond just its physical aspects. Spectators can witness, control and manufacture amazing, complex and intuitive patterns evolving from the behaviors of the species, as the organisms in turn interact with their dynamic environment. It takes unfamiliar data and turns it into art in a way that has never been done before, all inspired by the combination of art and animation. Furthermore, it is simply a tool that provides the means to create art using the imagination. I thought this was very relevant to my project because it absorbs data to recreate an ever-changing piece of artwork, and my project will do exactly the same thing. We are hoping to use flex sensors to record the texture of our surroundings and use that information to generate a new perspective on the environment. Artificial Nature allows viewers to see nature at the micro-organism level and gives insight into how complex nature truly is. It goes far beyond just 'animals'; people fail to consider where it all originated from on a cellular level. To me this is a great way of letting people view the wilderness from a different perspective in an evolving 3D environment. This relates to my project because we too are showing the city from a different perspective, in the sense that most people view it as a single static object. People fail to see its true fragility and how it came to be.

3D PROJECTION:
Another example I found of experimental technology that relates to my project is 3D projection. I thought it was a great source of information and it explores the use of texture very well. It dissolves and breaks through the restraints that architecture has set in our modern world. The structure of buildings restricts their artistic value; with 3D projection, however, the boundaries are limitless. It allows a structure to adopt any texture it wishes. Perhaps in the future we will see buildings, pristine and smooth, void of architectural detail, with their design based solely on projectors. It relates to my project because we ourselves are exploring a similar concept of texture. As above, I believe our texture-recording concept is showing us the outdated surfaces that surround us every day within the CBD. We see cracks and flaws everywhere we look, and these features give the city a 'grungy' feel. Buildings are simply a solid façade. 3D projectors allow us to uncover different interpretations of conception, geometry and aesthetics, expressed through graphics and movement.

PETER CAMPUS:
While researching I also came across a contemporary artist called Peter Campus. His work was unlike any other art I had seen before. He was able to explore the world beyond its face value in 3D video like never before. His works are old, but I feel they need to be mentioned because they are still relevant to this day and have great importance to the project we are currently developing. The most striking video I watched was one called 'Three Transitions.' In it, Peter explores surfaces beyond their face value: he cuts through a wall of paper, but as he enters the hole he has cut he is also emerging from it at the same time (weird, I know). He showed that surfaces can be more than their 2D perception, just like in the city, where the walls themselves are an illusion. From a distance the walls seem 2D and flat, whereas when you get closer you become aware that nothing is as pristine as it first seems. Even the smoothest surface has blemishes, and this is the point I wanted to highlight with our project.


Thursday 9 June 2011

Project 4 - Further Construction



After a short drive to Sylvia Park, we managed to buy ourselves some rather expensive styrofoam ($60). I have no idea why it was that expensive, but we ended up forking over the money for it. That day we went straight to the 3D lab to sculpt some components for the roller. These components were meant to hide the Arduino board and the webcam from sight, to make the device a little easier to look at aesthetically. To get the right shape we cut out a rough approximation and then sanded it down. This foam would also be used for our time-based visualization.

I can't help but feel we are falling behind. It seems as though we are focusing on minute details without actually looking at the project as a whole. We have the ideas, but it seems we don't know how to go about executing them. After experimenting with the foam for our visualization, I found that our original idea of threading string through it to lift it up in layers wasn't quite going as planned. The string was actually cutting into the foam, which means it could quite possibly rip right through it once driven by a servo; that is a risk I am not willing to take after spending $60 on it already. I think that using this type of foam was a very bad idea.

I have completed all the programming successfully and it is ready to be used; it's simply waiting for the device to be fabricated. It's getting frustrating waiting around for things to get done. Despite going to the 3D lab to help, I can't help but feel as though I am not needed; there doesn't seem to be a need for four people in the 3D lab. Nevertheless, I am determined to get this project completed, even with the deadline looming so close...

Wednesday 1 June 2011

Project 4 - The last Day. So close yet so far away...

Today I wanted to get some testing done, but implementing an external webcam in the program, instead of the laptop's built-in webcam, proved to be much harder than I first thought. My computer wouldn't support the webcam for some reason and I ran into continuous errors. I consulted forums and found that I was not the only person to encounter this. I followed all the instructions, such as re-downloading QuickTime, Qtcap, and QuickTime for Java, but nothing worked, so we tried it on Tom's computer, which surprisingly enough worked (damn Macs). However, his computer had MORE problems than mine. For some reason OpenGL would not work and we had to find patchwork code to fill in some gaps so that it would run on a Mac. The Arduino code also did not work, as the numbers being printed out were random WTF values such as 0 and even 44 million. Not surprisingly, this caused major problems. We worked for more than four hours trying to figure it out, and eventually it turned out that the structure of the program meant the Arduino serial code had to be placed below the OpenGL code... And this is the exact reason why I hate programming: tedious problems take hours to crack. Another problem we currently have is that the video flickers if we use the flex sensor values to change the intensity of the 3D effect. Smaller increments of X and Y mean there is less flickering, which is strange because it uses much more GPU and CPU power. However, if we play with the ratios of the Arduino flex sensor values I think we could fix this.
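
To give an idea of the kind of smoothing I have in mind (the names are placeholders and the sensor is faked with random noise here, so this is not our actual code), the trick would be to ease towards each new flex reading instead of jumping straight to it, so one noisy value can't make the whole video jump:

    float rawFlex = 0;        // latest value parsed from the Arduino serial stream (0-1023)
    float smoothedFlex = 0;   // value actually used to drive the 3D effect

    void setup() {
      size(400, 200);
    }

    void draw() {
      background(0);

      // Stand-in for the flex sensor while testing without the hardware: a jittery signal.
      rawFlex = 512 + random(-200, 200);

      // Move 10% of the way towards the newest reading each frame; raise the factor for a
      // snappier response, lower it for a steadier image.
      smoothedFlex = lerp(smoothedFlex, rawFlex, 0.1);

      // Scale the 0-1023 analog range down to whatever increment the effect can handle
      // without flickering; this ratio is exactly the thing we still need to tune.
      float effectStrength = map(smoothedFlex, 0, 1023, 0, 40);

      // Draw the raw and smoothed signals side by side so the difference is easy to see.
      fill(255, 0, 0);
      rect(80, 180 - rawFlex * 0.15, 80, rawFlex * 0.15);
      fill(0, 255, 0);
      rect(240, 180 - smoothedFlex * 0.15, 80, smoothedFlex * 0.15);

      fill(255);
      text("effect strength: " + nf(effectStrength, 0, 1), 20, 20);
    }

The 0.1 easing factor and the 0-40 output range are exactly the kind of ratios I mean; they would need tuning against the real sensor values.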

We also decided it would be a good idea to take the program out into the city with the webcam to see what kind of results we would get in the real world. We found that it worked very well, although in some places the sun was too bright and made it hard to see anything. However, since the program works on brightness intensity, we can use this to our advantage: when objects are silhouetted against the brightness it gives a greater 3D effect. At the end of the day we managed to get one test in using the device on the fabrication concept. To get variation in texture we simply created our own course for it to run over using our books :) We used books because we assumed this would be about the maximum variation in texture we would encounter out in the city, such as uneven bricks.
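
Roughly, the brightness-driven effect works something like this (a stripped-down sketch rather than our actual altered code; it assumes the Processing video library and whatever webcam the default Capture constructor finds): each block of camera pixels becomes a small quad whose depth comes from its brightness, which is why dark silhouettes against a bright sky stand out so strongly.

    import processing.video.*;

    Capture cam;
    int step = 10;          // size of each pixel block; smaller = finer mesh but more GPU work
    float maxDepth = 60;    // how far the brightest blocks are pushed towards the viewer

    void setup() {
      size(640, 480, P3D);
      cam = new Capture(this, 640, 480);
      cam.start();
      noStroke();
    }

    void draw() {
      if (cam.available()) cam.read();
      background(0);
      cam.loadPixels();
      for (int y = 0; y < cam.height; y += step) {
        for (int x = 0; x < cam.width; x += step) {
          color c = cam.pixels[y * cam.width + x];
          // Bright pixels get pushed out further, so a dark figure against a bright
          // background ends up with a big depth difference around its outline.
          float z = map(brightness(c), 0, 255, 0, maxDepth);
          fill(c);
          pushMatrix();
          translate(x, y, z);
          rect(0, 0, step, step);
          popMatrix();
        }
      }
    }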

Project 4 - Programming ALMOST Done!





After hours of tedious programming trying to get Arduino to communicate with Processing, I have finally succeeded in my mission. This aspect was the main feature of our project because, without it, the flex sensors would be absolutely useless. I found out that there is a huge difference in Arduino between the ports A0 and 0 (and likewise A5 and 5). When I was using analogRead to record the values, it was generating a string that printed vertically in the Arduino serial console. This did not work with my Processing code, because it was looking for a string with a horizontal structure similar to this: xxx,yyy. This is why I could not convert the string into two int values within Processing. I feel as though I now have a much broader understanding of how programs talk to each other; before this project I didn't even think something like this was possible. Using these values from the flex sensor, we can now feed them into a 3D pixel program that I have altered. By doing this I hope I can change the intensity of the 3D illusion. The code is based on color intensity and vertexes. If I can find a good use for the variables, I can map the texture of surfaces onto the video, which will give us a point of view of the city built from the materials that are put together to make it. I think this is a great idea and I think it will be easily done from this point onwards. All we need to worry about now is the physical visual presentation rather than the video side of the project.
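
To show what the working link looks like, here is a minimal sketch of the handshake (the port index and variable names are placeholders, not our actual code): the Arduino prints each pair of flex readings as a single comma-separated line, and Processing waits for a complete line before splitting it back into two ints.

    // The Arduino side prints one "xxx,yyy" line per pair of readings, e.g.
    //   Serial.print(analogRead(A0)); Serial.print(','); Serial.println(analogRead(A5));

    import processing.serial.*;

    Serial arduinoPort;
    int flexA = 0;   // first flex sensor reading (0-1023)
    int flexB = 0;   // second flex sensor reading (0-1023)

    void setup() {
      size(640, 480);
      // Assumes the Arduino is the first serial device listed; change the index if it isn't.
      arduinoPort = new Serial(this, Serial.list()[0], 9600);
      arduinoPort.bufferUntil('\n');   // only fire serialEvent once a whole line has arrived
    }

    void serialEvent(Serial port) {
      String line = port.readStringUntil('\n');
      if (line == null) return;
      // trim() strips the trailing \r\n that Serial.println adds before we split on the comma.
      int[] values = int(split(trim(line), ','));   // "512,498" -> {512, 498}
      if (values.length == 2) {
        flexA = values[0];
        flexB = values[1];
      }
    }

    void draw() {
      background(0);
      // Placeholder for the altered 3D pixel program: map one flex reading to the
      // intensity of the illusion and display the raw numbers for debugging.
      float depth = map(flexA, 0, 1023, 0, 100);
      text("flexA: " + flexA + "  flexB: " + flexB + "  depth: " + nf(depth, 0, 1), 20, 20);
    }

Printing both readings with Serial.print and a comma, and only calling Serial.println at the end of the pair, is what gives the horizontal xxx,yyy structure the Processing side is looking for.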