Sketches

This archive is a storage space for my projects, experiments and ideas...

Search Archive by Month

Tuesday, January 4, 2011

Augmented Reality and Physical Computing....some thoughts

“Augmented Reality (AR) is a technology that blurs the line between what’s real and what’s computer-generated by overlaying digital images, video and other information onto real world content.” (http://www.augmentedreality.co.uk/)

Augmented Reality in Reality

Although Augmented Reality is not a new phenomenon (its roots can be traced back as far as Ivan Sutherland's work in 1968), the last few years have seen a dramatic rise in its application and use in everyday life.

With the ever faster and more complex processing capabilities of computers, phones and other handheld devices, augmented reality applications are becoming commonplace as tools that help users understand and interpret the data around them in new ways: from applications in engineering and medicine that allow professionals to visualise potential solutions more readily, to personal uses such as finding the nearest geo-tagged restaurant.

The “overlay” nature of AR separates it significantly from the immersive nature of Virtual Reality (VR), and it is this “mixed” reality that appeals to both professional and personal users: data no longer has to be a string of numbers interpreted by experts; it can be designed and represented in a form better suited to human consumption.


For example, a “layers” application, now readily available for the vast majority of GPS- and internet-enabled devices, sends and receives data based on the user's position and correlates it with a multitude of social media data (from Twitter, Facebook, Flickr etc.). This allows the user to view the latest, most local content, for example photographs on Flickr, from within their phone's camera interface without any need for complicated data management; the photographs are “overlaid” as icons on the phone's screen.


 (Screenshot of “layers” application Wikitude)

It is this “intelligent” data visualisation that can help AR applications convey more meaning and personal relevance to the user, and create new means and forms of communication and interaction beyond the “data layer”. In this scenario the data handling is managed mostly through an Application Programming Interface (API), a protocol that allows programs and applications (e.g. Twitter, Facebook etc.) to communicate with each other and with the user with a minimum of programming intervention. This gives designers more opportunity to concentrate on the nature of the visualisation rather than on how the data is captured and handled.
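To make the division of labour concrete, here is a minimal Python sketch of the designer's side of that arrangement. The JSON payload, field names and point-of-interest data are all invented for illustration (no real AR service's API is being described); the point is that once the API has delivered structured data, the designer's job reduces to deciding how to filter and label it as overlays.

```python
import json

# Hypothetical JSON payload, of the kind an AR "layer" API might return
# for points of interest near the user. All names and values are invented.
sample_response = json.dumps({
    "pois": [
        {"title": "Cafe Uno", "lat": 51.5007, "lon": -0.1246, "distance_m": 120},
        {"title": "Gallery 9", "lat": 51.5014, "lon": -0.1419, "distance_m": 450},
    ]
})

def pois_to_overlays(raw_json, max_distance_m=500):
    """Turn an API response into simple overlay records, filtered by distance.

    This is the "design" step: the API has already handled capture and
    transport, so all that remains is choosing what to show and how.
    """
    data = json.loads(raw_json)
    overlays = []
    for poi in data["pois"]:
        if poi["distance_m"] <= max_distance_m:
            overlays.append({
                "label": f"{poi['title']} ({poi['distance_m']} m)",
                "position": (poi["lat"], poi["lon"]),
            })
    return overlays

for overlay in pois_to_overlays(sample_response):
    print(overlay["label"])
```

In a real AR browser the `position` would be projected onto the camera view; here it is simply carried through to show how little plumbing the designer needs to write.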

AR, the Giant Finger and Design

While current developments around the use and application of AR might encourage us to rethink our relationship with technology, they also invite a much more interactive and exploratory investigation of that relationship and of how it might develop.

Since the inception of the computer as we know it, the means of interaction have remained largely focused on productivity and efficiency through the mouse, keyboard and monitor.

It is this focus that emergent thinking around the nature and application of physical computing is currently challenging.

The “traditional”, productivity-based model has little in common with natural human interaction. While it has prompted cultural shifts in the way we interact with and understand technology (keyboards and mice are almost ubiquitous), as far as the technology itself is concerned the user is no more than a giant finger, which inevitably leaves the user with a limited number of ways to interact.

This shift from “How do we react to technology?” to “How does technology react to us?” has opened an intriguing new line of thought among designers and artists across the web, and the question of how technology can interact more “humanly” with users has become a rapidly evolving field.

From a physical design point of view, sensory data such as motion and gesture detection and face recognition, as well as API-based data from open online sources, can now be captured, analysed and interpreted by artists and designers using open-source Integrated Development Environments (IDEs) with a focus on design and visualisation. Dialogue and experimentation may thus begin to challenge productivity and efficiency.
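The core move in that kind of sketching is simply remapping a sensor reading into a visual parameter, the sort of one-liner design-focused environments make routine. Here is a minimal Python illustration; the motion samples and the output ranges are invented for the example, not taken from any particular device.

```python
def map_range(value, in_min, in_max, out_min, out_max):
    """Linearly remap a sensor value into a visual parameter range."""
    ratio = (value - in_min) / (in_max - in_min)
    return out_min + ratio * (out_max - out_min)

# Invented, normalised motion-intensity readings (0 = still, 1 = vigorous).
motion_samples = [0.1, 0.45, 0.9]

for sample in motion_samples:
    radius = map_range(sample, 0.0, 1.0, 5, 50)       # shape size follows motion
    brightness = map_range(sample, 0.0, 1.0, 0, 255)  # colour follows motion
    print(f"motion={sample:.2f} -> radius={radius:.1f}px, brightness={brightness:.0f}")
```

Swap the print for a drawing call and the same three lines become an interactive visual: the mapping, not the plumbing, is where the design decisions live.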

This experimental design can manifest itself in intriguing new ways: varied data sources can act as triggers within a physical space, just as physical interaction can create data visualisation.

The combination of both of these approaches creates its own data set that in turn can be visualised or act as secondary triggers, creating an organic and evolving interactive space that can allow us to investigate the nature of interaction, design and our environment.

 
(Data flow of organic physical computing setup)
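The data flow sketched above can be modelled as a short feedback loop: external data and physical events both act as triggers, each visualisation is recorded as a new data point, and that derived data set can feed back in as a secondary trigger. The following toy Python model is purely illustrative; the trigger names and the "intensity" measure are invented stand-ins.

```python
def visualise(trigger):
    """Stand-in for a visualisation step: records what was shown.

    The "intensity" here is a made-up proxy (string length) so the
    loop has something measurable to feed back on.
    """
    return {"shown": trigger, "intensity": len(str(trigger))}

external_data = ["tweet", "photo"]       # e.g. API-sourced triggers
physical_events = ["gesture", "motion"]  # e.g. sensor-sourced triggers

history = []  # the derived data set the installation generates about itself

# Primary pass: every trigger, whatever its source, produces a visual frame.
for trigger in external_data + physical_events:
    history.append(visualise(trigger))

# Secondary pass: the derived data set itself becomes a trigger source.
secondary = [frame for frame in history if frame["intensity"] > 5]
print(f"{len(history)} frames shown, {len(secondary)} secondary triggers")
```

Because each pass through the loop enlarges the data set it is driven by, even this toy version shows how such a space can evolve organically rather than simply replaying its inputs.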

Just as the advent of Web 2.0 technologies encouraged the use and production of new and experimental forms of media, intelligently designed social applications, physical computing and data visualisation can be seen as essential in helping us to understand and produce new media that can, in turn, help us develop new ways and forms of communication and interaction.

These new media, connected, interactive and social, have the potential to allow us to reflect on our relationship with technology, with the environment in which we exist and with the people around us.

Sunday, January 2, 2011

Experiments with Blender and FCP


A bit of an experiment using FCP, Blender and a bit of Motion. I know FCP isn't open source but I've yet to come across anything open that does the job as nicely...yet!