Portal to another world

The Little Gallery, 5 Ellis Street, London, SW1
Copyright managed by the Crafts Study Centre

The image above has been used as the pattern for the Crafts Study Centre’s first proof-of-concept augmented reality (AR) app. The idea was born from conversations between Jean Vacher, Curator, Crafts Study Centre (CSC), and Adrian Bland, Contextual Studies Co-ordinator, School of Media and Culture, University for the Creative Arts (UCA), and centred on the concept of the door to The Little Gallery being a portal to another world. My role as Educational Technologist has been to bring this to reality… Augmented Reality!

SCARLET+ Technical workshop, 1st October 2012
Having started on the project in September, I began by familiarising myself with AR. The technical workshop was the first opportunity to create a working example of AR using Junaio and test it in practice.
See also: great blog posts from Rose and Matt about the technical workshop http://teamscarlet.wordpress.com/2012/10/05/scarlet-coding-for-the-terrified-or-how-i-learned-to-stop-worrying-and-love-php-3/ and http://teamscarlet.wordpress.com/2012/10/05/scarlet-technical-workshop/.

Technical support from MIMAS
Jean tasked me with providing a proof of concept, and in my enthusiasm I thought it wouldn’t take long; however, the longer-than-expected process taught me the following lessons:

  1. Unlike building a website in HTML, the work can only be tested properly on the live server i.e. no live site = no demo.
  2. Creating a Junaio channel with a callback URL is actually relatively straightforward, but explaining your technical requirements can be difficult as a newbie to AR. Matt answered detailed questions from our IT department in order to set up what was essentially ‘a folder’.
  3. IT resourcing issues need to be addressed before the new academic year; despite completing a detailed work request, it was turned down by our IT department. A silver lining was my connection with the Visual Arts Data Service (VADS), which led to them kindly hosting the folder for us.
  4. FTP access is required in order to build an AR application. It is essential to have FTP access to your folder as tweaks with the code need to be made throughout testing on the live server.
  5. Equipment – this raises interesting issues around inclusivity for students using AR in education – both Jean and I have had to borrow devices from family in order to test the AR content. Devices can be the iPad 2 (and above), iPhone 3 (and above), and Android devices (untested by us so far).
  6. Internet – the Junaio application requires a good connection to work on the device, and the eduroam network has not always obliged.
  7. Why AR? AR is undoubtedly exciting, with a bit of the thrill of the new in education; however, that makes it even more important to put pedagogy before technology. By working closely with an academic to tailor something to his students’ needs, we hope to keep pedagogy at the forefront of AR.
  8. Technical support from MIMAS through Matt’s role in the SCARLET+ project was essential.
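To give a sense of what lesson 2’s ‘callback URL’ actually does: when a user opens your channel, Junaio’s servers call that URL and expect back a structured (XML) description of the content to overlay. The sketch below only illustrates that request/response shape; the element names, channel name and URLs are invented for this example, and the real response format is defined by metaio’s AREL documentation.

```python
# Illustrative sketch only: build the kind of XML payload a Junaio-style
# callback might return. Element names and URLs here are invented; the
# real schema is defined by metaio's AREL documentation.
import xml.etree.ElementTree as ET

def build_callback_response(channel_name, overlay_image_url, popup_text):
    """Build a minimal XML payload describing one overlay for a tracking image."""
    results = ET.Element("results", attrib={"channel": channel_name})
    obj = ET.SubElement(results, "object", attrib={"id": "overlay-1"})
    ET.SubElement(obj, "image").text = overlay_image_url  # asset to draw on top of the pattern
    ET.SubElement(obj, "popup").text = popup_text         # contextual text shown on tap
    return ET.tostring(results, encoding="unicode")

xml_payload = build_callback_response(
    "portal-demo",  # hypothetical channel name
    "http://www.website.ac.uk/dev/scarlet/html/door.png",  # placeholder asset URL
    "The door to The Little Gallery as a portal to another world",
)
```

Because the response is generated server-side on each request, this is also why lesson 1 holds: until the folder is live and reachable by Junaio’s servers, there is nothing to demo.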

Glossary

  • callback URL – this is the weblink to your AR content and specifically to the ‘html’ folder, an example URL would be http://www.website.ac.uk/dev/scarlet/html
  • channel – this is a location where Junaio will make your content available to others – users can use a QR code or a URL to access your channel and therefore access the AR content
  • FTP access – File Transfer Protocol – permissions on the folder need to be set up so AR developers can transfer files from a local machine to the live folder
  • Junaio – an augmented reality browser that you can download for free on a range of devices in order to access both location-based and static (GLUE) AR
  • pattern – the pattern is also called your ‘tracking image’ by Junaio; after a user has scanned your AR channel they need to find the ‘pattern’ or ‘tracking image’, which is the trigger for your AR content – the content will appear on top of it
  • portal to another world – this concept is mentioned briefly above, a forthcoming blog post will consider how we might enable this through AR

SCARLET+ ‘Swings and Roundabouts’ – the ups and downs of creating an AR application.

This work is licensed with permission of the University of Sussex Special Collections under http://creativecommons.org/licenses/by-nc-sa/3.0/. For permissions beyond the scope of this license contact http://sussex.ac.uk/library/specialcollections

Working on Scarlet+ has so far been a roller coaster of triumphs and tribulations, but I am delighted to be able to declare that I am now working on the app itself.

I had a huge breakthrough last week when Matt Ramirez suggested I try creating my trigger image using Firefox instead of Chrome, as he had had problems with Internet Explorer in the past. It worked, so since then I have been able to really get into the nitty-gritty of how changing code changes the app, learning about linking, and coming up with the first rough version of the Mass Observers’ branch of the ‘Observing the 1980s’ app for Lucy Robinson’s ‘Thatcher’s Britain’ lecture.

The image on the right, caught using the lovely ‘screenshot-straight-to-twitter’ function in junaio, shows how one trigger image now pulls up the three Mass Observers’ numbers, which then leads to some biographical information, which then leads to their writings on being a Mass Observer. So far, so good. Except that you are probably wondering why this is all triggered by an image of the front of a book about Humphrey Jennings. This was the trigger image for my test app, and the new channel does not want to let it go! I have created a new trigger file using a Mass Observation image and replaced the old one. I have gone through the files in the webdrive and removed any trace of the old Humphrey trigger. I have removed any PNG or JPEG files with this image even though they are not the trigger image, just pictures. I have deleted junaio from my phone and re-installed it. I have re-started my computer. None of this has had any effect. The old trigger still works quite happily whilst the new one does nothing. Why? I don’t know, but I am sure I know someone who does…I’m off to email Mimas!

As I finally have something to play with, I also had a short session today showing my efforts so far to Special Collections staff to keep them up to date with my progress, give them a window into the processes involved in creating an AR app, and to give me some feedback. As the whole idea of Scarlet+ is to embed the skills into Special Collections, these sessions are an essential part of the project as they help the rest of the department to see how an app is made. Squirreling myself away for six months then appearing suddenly with the finished result would not be half as useful as asking my colleagues to play with half-built, slightly shoddy-looking first ‘drafts’.

I was delighted with how little direction it took to get everyone up and running. Downloading junaio to their phones took about a minute and was done whilst I was telling them what we were going to be up to during the session. The QR codes worked well, with most people understanding intuitively that ‘when it beeps, it’s done’. When extra buttons appeared on screens, few needed any encouragement to press them and find out what happened next. There were interesting questions and concerns from our staff: What could we use as the trigger image? How many trigger images would give the best result? Would a single image result in too much information on the screen at one time, and if we use more, where would they be placed? There were also copyright concerns about downloads of Mass Observation material. We also made the discovery that junaio does not work on a Samsung Galaxy Ace, which has led to the idea that we gather a list of other similar incompatible devices.


This project is not a steady road to a neat, simple end result; Scarlet+ is like a ride in an amusement park…it goes up and down, around and around, it has thrills and chills, and it’s very shiny and new, and it’s fun for a while…

…then your trigger doesn’t work and you want to scream. But it’s just a ride and we’re here to learn.

With fond apologies to Bill Hicks.

Embedding the SCARLET methodology in other subject areas

As part of my work researching AR I have been collaborating with other faculties in the University of Manchester to put together some sample content showing how it could be used to enhance the student experience. The first example uses a picture of a patient’s mouth as a tracking image, then overlays buttons over areas of decay, adding contextual information related to the clinical condition. I thought it was important to create a meaningful link between the printed marker and the AR element to increase user engagement and impact. The user experience is intuitive and requires little guidance, making the learning more immersive than a simple online demonstration. It also places the user at the heart of the activity, particularly appealing to learners who respond to kinesthetic (movement) and tactile (touch) interactions.
In other work, I have been collaborating with Dr. Kurt Wilson (Clinical Teaching Fellow) at the Medical School to develop an iBook on prescribing. Certain chapters will include linked activities using AR, alongside other e-learning components such as videos, 3D models and interactive HTML5 widgets. There is real scope in medicine to implement AR experiences to help students understand pattern recognition, a crucial part of decision-making processes. I can see the potential use of this in examining x-rays and identifying important elements or abnormalities.
I created the example below illustrating a simple concept for recognising different anatomical elements of a forearm. The user would be able to strip away the skin and look at different parts of the anatomical makeup by clicking on labelled buttons. Although very basic information is displayed, it is extremely visual and with further development could provide a good catalyst for deeper research and study.
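To make the dental example above concrete, here is a sketch of how such hotspots might be modelled in data: each button sits at a position on the tracking image and carries its contextual note, and a tap is resolved to the nearest hotspot. All coordinates, labels and notes are invented for illustration; they are not taken from the actual app.

```python
# Hypothetical hotspot table for a tracking image: each entry places a
# button at a normalised (x, y) position and carries its contextual note.
# All coordinates, labels and notes are invented for this sketch.
HOTSPOTS = [
    {"pos": (0.32, 0.41), "label": "Occlusal caries", "note": "Decay on the biting surface."},
    {"pos": (0.58, 0.47), "label": "Interproximal caries", "note": "Decay between adjacent teeth."},
]

def hotspot_near(x, y, hotspots=HOTSPOTS, radius=0.05):
    """Return the first hotspot within `radius` of a tapped point, else None."""
    for h in hotspots:
        hx, hy = h["pos"]
        if (hx - x) ** 2 + (hy - y) ** 2 <= radius ** 2:
            return h
    return None
```

The same structure would carry over to the forearm example: the labelled buttons are just hotspots whose notes describe anatomical elements rather than clinical conditions.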


As I have stated on countless occasions, development of meaningful and pedagogically sound AR can only be achieved by working with an academic, aligning the e-learning to course aims and objectives. Working with Kurt, I have found that his enthusiasm is obvious and central to working towards a common goal of improving the student experience. As an advocate of innovative learning methods, he can see the potential of technologies such as AR to engage students and build on existing support materials.

In my experience AR is most successful when you are delivering learning in a unique way. Over the next few months I hope to make available further demonstrations showing the ways in which it can be delivered to support otherwise abstract learning concepts in a visual and engaging manner.

St John Fragment AR content

AR material relating to the St John Fragment has now been released as part of the SCARLET project in collaboration with Dr. Roberta Mazza. It provides the user with an opportunity to surface contextual information about the most famous piece of papyrus at The University of Manchester Library, view an expert academic commentary and see how the reconstructed pages would have originally appeared, both in ancient Greek and translated into English.

Simply download the attachment below, open Junaio and scan the QR code to activate the channel content and experience the wealth of contextual resources that bring the fragment to life.

Dante final AR outputs released

I am pleased to announce the release of the complete suite of AR outputs related to Dante’s Divine Comedy, which have been developed through the SCARLET project. These can be accessed by downloading the outputs PDF and scanning the QR code to enter the SCARLET Dante Junaio channel.

Working with Guyda Armstrong (Department of Italian, School of Arts, Languages, and Cultures), 10 editions of the poem published between 1472 and 1555 were selected, all of which are particularly important in terms of its publishing and intellectual history. Image-recognition-based AR was used as a focal point, so that students could access mobile-optimised webpages related to individual editions of the manuscripts. They could also consult expert academic video commentaries on each edition, identifying elements for further investigation and research.
It is our hope that the learning resources can appeal to a wider community beyond HE, bringing these fascinating books held at the John Rylands University Library to life and encouraging people to visit the Special Collections in the future.

SCARLET+ ‘Coding for the terrified’ or how I learned to stop worrying and love php.

One of the greatest worries I had when taking on my role as the project co-ordinator for University of Sussex’s augmented reality (AR) project was the actual building of it. The AR applications I first looked at had such an impressive wow factor that I wondered how I was ever going to learn to construct something that looked to be so complicated and techy. I had experimented with a range of different programmes designed to create your own AR application and had found them either very simple to use but extremely limited in their capabilities, or wonderfully flexible but containing a technological brick wall against which it felt like I was beating my head with very little result.

On Monday 1st October I took a trip up to Manchester along with Marie-Therese Gramstadt from the University for the Creative Arts, for a technical workshop with Matt Ramirez from Mimas, as mentioned in Matt’s post, below.

What a difference a day makes!

The importance of face-to-face contact for anyone venturing for the first time into the world of code, and webspace, and php files cannot be overstated. Having a real person, an expert with practical experience, to ask questions of is a powerful learning tool. This is particularly true of such new technology, where how-to guides and the like are few and far between.

 

By providing us with carefully explained lines of code and showing us what happens when different sections are changed, Matt has given me the framework and confidence to create my own AR applications. My understanding of the way AR works has increased exponentially, and I now feel far more able not only to build the application but also to explain to others how it works and show them how to create their own, embedding the skills into our department as we have wanted to from the beginning.

SCARLET+ Technical Workshop

On Monday 1st October, the two technical leads from the University for the Creative Arts (Marie-Therese Gramstadt) and the University of Sussex (Rose Lock) and I took part in a technical workshop in Manchester. The main aim was to gain a basic knowledge of how to create Natural Feature Tracking (NFT) channels in AR, linking printed images to surrounding electronic resources.

It was a very informal affair, using a multitude of mobile devices to look at the test training channel content and imagine how it could be adapted for use in their institutions. Using a pre-built template we were able to look at overlaying different types of content (images, 3D models), adding pop-up information and finally linking them to web resources. We also looked at the important planning stages, mapping out a user journey for each piece of AR using storyboarding.

It was great to see how much enthusiasm there was from Rose and Marie-Therese, plenty of questions and interesting ideas as to how they would present their materials. I hope that after the day had finished they were more confident in being able to create their own channels and realise the potential of using AR with their academics in teaching.

SCARLET team are joint second in learning technology awards

Staff from the SCARLET team have been awarded joint second prize in the Association for Learning Technology’s “Learning Technology of the Year” team award.

The award’s overall purpose is to celebrate and reward excellent practice and outstanding achievement in the learning technology field.

The SCARLET team, led by Mimas and involving academics from the School of Arts, Languages and Cultures and Special Collections staff from The University of Manchester Library, received the award at the 2012 ALT Conference Gala Dinner on 12 September in Manchester.

The Special Collections using Augmented Reality to Enhance Learning and Teaching (SCARLET) project, funded by JISC, has been instrumental in addressing one of the principal obstacles to the use of Special Collections in teaching and learning – the fact that students must consult rare books, manuscripts and archives within the controlled conditions of library study rooms, which are isolated from the secondary, supporting materials and the growing mass of related digital assets.

The project will provide a model that other Special Collections libraries can follow, making these resources accessible for research, teaching and learning.

Project Director Dr Jackie Carter, Mimas, said:

“It’s a tremendous honour to receive this award in recognition of the passion and commitment of Team SCARLET.”

“The SCARLET project has demonstrated perfectly the strength of working in a mixed team of project managers, academics and content experts, and learning technologists. This small project has punched well above its weight in demonstrating how an innovative technology, Augmented Reality, can be used imaginatively to enhance the student learning experience with special collections materials at Manchester.”

“We hope to use this to further develop opportunities in the pursuit of outstanding learning and teaching at Manchester and beyond.”

SCARLET+ ‘Voices in your pocket’ – challenging the meanings of Mass Observation

Blog post by Rose Lock

 “History is an art; history is a political act; history is a job”. Dr. Lucy Robinson, lecturer in contemporary British history and University of Sussex academic lead on Scarlet+ project.

After a very productive workshop during which Laura Skilton (formerly Shaw), Matt Ramirez and Dr. Guyda Armstrong introduced us to the wonderful world of AR teaching, the Scarlet+ team at University of Sussex met to decide what we are going to do with AR at Sussex.

We were lucky to be joined by Sussex staff members: Stuart Lamour, an E-Learning Developer with experience in AR; John Davies, an Educational Developer; Dr. Lucy Robinson, history lecturer and our academic lead; and a variety of Special Collections staff.

We are working with Lucy to develop an AR application using material from the Mass Observation Project (MOP), to be used as part of her course on Thatcher’s Britain.

As both the Mass Observation Project and the Observing the 1980s digital resource are vast, we decided to take a section that can be seen as a discrete unit. Lucy suggested the directive from Autumn 1990 – Retrospective on the 1980s. This covers a huge number of subjects and attitudes across the decade and some responses to the questions have already been digitised as part of Observing the 1980s.

The team was eager to make sure that AR added something unique to Lucy’s teaching and the idea of using the application to present additional voices and their attitudes towards MOP seemed to fit the bill perfectly.

The three voices that will be presented when the question sheet from this directive is scanned will be:

  • A member of Special Collections.
  • The Mass Observers themselves, through their writings mixed with biographical details.
  • A student historian who has used the Mass Observation Project in their research.

We are also looking into the idea that students using the application will be able to add their own comments, thus creating a living, growing resource that is somewhat akin to Mass Observation itself.

A structure like this that concentrates on the interpretation of archival material could be applied to any discipline and any collection; for example an application could be created that uses part of our Bloomsbury collections to give voices to an archivist, Virginia Woolf herself, and a fan of her books.

All we need now is a name.

SCARLET+, adding Augmented Reality to the University of Sussex’s Toolkit

SCARLET+ was funded by JISC for Mimas to embed Augmented Reality knowledge and skills at the University of Sussex (UoS) and the University for the Creative Arts. The UoS wanted to investigate how they could use AR with their Observing the 80s project: http://blogs.sussex.ac.uk/observingthe80s/

This was our first workshop at the UoS; these workshops, along with one-to-one support from Mimas and internal working groups, will lead to the development of an AR application, and more importantly one that will be used by academics and students. By the end of this one-year project these institutions will have the skills they need to continue developing and embedding AR into learning and teaching.

Rather than telling you what a great day we had, I caught a few people on camera and asked others to tell me in their own words what they got out of the day:

[youtube http://www.youtube.com/watch?v=hHy3_xWJ-iQ&w=560&h=315]

Video: Rose Lock, Senior Archive Assistant and SCARLET+ Project Officer

“I am always looking for opportunities to pull together people in the University from different areas – the SCARLET+ workshop was ideal with e-learning, the library and a historian attending. What was delightful was that everyone present got involved and everyone was very excited and enthusiastic about exploring new ways of working with augmented reality to add value to teaching”. Jane Harvell, Head of Library Academic Services (Library), SCARLET+ Institutional Lead

[youtube http://www.youtube.com/watch?v=xPHzSFR6yi8&w=420&h=315]

Video: Dr Lucy Robinson

“Thank you to Laura, Guyda, and Matt for coming down and inspiring us. I’ve been thinking about how to get students to engage with augmented realities for a while, and particularly thinking about how history students can engage with them as part of a historical development – be it around public history and collective memory, historicising concepts of space, or thinking about the impact of new information technology. The workshop gave me the opportunity to think about how to harness AR in a more active way, and to think about ways to move beyond ‘making sense’ of ARs and thinking about how to wield them to overcome pedagogical challenges. A couple of things have sprung to mind, but I’m sure more will arise as I digest further. AR as ‘playful’ learning – at a time when skill acquisition is being privileged over learning for learning’s own sake, the idea of playful approaches to learning seems particularly important. I don’t think we have quite pinned down the relationship between the two yet, but the way in which ARs offer a way to BOTH invite new activity into the archive AND take archives out into wider communities seems very fruitful. At a general level, using ARs to help underline the extent to which the meaning of a source is much wider than its narrative content alone seems invaluable, being able to visually and interactively represent meaning as constructed not just by content, but by the context of production and processes of reception”. Dr Lucy Robinson, Lecturer in History (History, Centre for War and Society, Centre for the Study of Sexual Dissidence), SCARLET+ Academic Expert/ content developer

“I think I am still musing on what I got out of the day.  I found the whole AR idea really intriguing and could see huge value in it in relation to things like exhibitions or campus orientation.  Where I’m still uncertain is how we use it in teaching in a way which gets maximum value out of the technology – so really adding something to reality, rather than simply using it as an alternative media for delivering information.  In the workshop, Stuart made that point, and in a way his example of the models with monster heads made the most sense to me in terms of augmenting reality! So the brief mention of using photographs from the MO Worktown collection and being able to overlay information on them seems like a good idea, but they’re not part of the Observing the 80s project! However, in practical terms I can also see how creating something such as a skills guide to using MO material would be a good way into the technology and a simple example to test it with”.  Jill Kirby, Project Manager, Observing the 1980s

Since the workshop the UoS have decided the route they are taking; Rose Lock will be sharing this soon in a blog post. Next we’ll be delivering a similar workshop at the University for the Creative Arts. As well as AR development, this project aims to produce two case studies that will inform other institutions about how to get involved with Augmented Reality.

You can also follow our progress on Twitter:  https://twitter.com/#!/team_scarlet