AR for Science Education Webinar References – 11th May 2015

For all attending the AR-Sci – AR for Science Education webinar, references and media can be found below.

About the Delphi method:

Hsu, C., & Sandford, B. A. (2007). The Delphi technique: Making sense of consensus. Practical Assessment, Research & Evaluation, 12(10), 1-7.

McDermott, J. H., et al. (1995). A Delphi survey to identify the components of a community pharmacy clerkship. American Journal of Pharmaceutical Education, 59(4), 334-341.

Osborne, J., Ratcliffe, M., Collins, S., Millar, R. & Duschl, R. (2001). What should we teach about science? A Delphi study. Report from the EPSE Research Network.

https://www.york.ac.uk/media/educationalstudies/documents/research/epse/DelphiReport.pdf

Osborne, J., Collins, S., Ratcliffe, M., Millar, R. & Duschl, R. (2003). What “ideas-about-science” should be taught in school science? A Delphi study of the expert community. Journal of Research in Science Teaching, 40(7), 692-720.

Two reviews about learning technology/AR in science:

Krajcik, J. S. & Mun, K. (2014). Promises and challenges of using learning technology to promote student learning of science. In N. Lederman & S. Abell (Eds.), Handbook of Research on Science Education, Vol. II, pp. 337-360.

Wu, H., Lee, S.W., Chang, H. & Liang, J. (2013). Current status, opportunities and challenges of augmented reality in education. Computers & Education, 62, 41-49

 

Matt Ramirez – Jisc

 

Augmented Reality technology

Media:

“Hud on the cat” by Rama, licensed under Public Domain via Wikimedia Commons –

http://commons.wikimedia.org/wiki/File:Hud_on_the_cat.jpg#/media/File:Hud_on_the_cat.jpg

Metaio

http://www.metaio.com/customers/case-studies/augmented-reality-automotive-prototyping/index.html

Jisc

http://teamscarlet.wordpress.com 

Håkon Swensen – HiOA

Benefits of Augmented Reality in Science Education

Media:

BBC Frozen Planet Augmented Reality
https://www.youtube.com/watch?v=fv71Pe9kTU0

Augmented Reality Sandbox
http://idav.ucdavis.edu/~okreylos/ResDev/SARndbox/
https://youtu.be/Ki8UXSJmrJE

Anatomy 4D
http://daqri.com/project/anatomy-4d/#.VU9WQpM03So
https://youtu.be/ITEsxjnmvow

Elements 4D
http://elements4d.daqri.com/
https://youtu.be/oOdMfgloUD0

Literature:

Cheng, K. H., & Tsai, C. C. (2013). Affordances of augmented reality in science learning: suggestions for future research. Journal of Science Education and Technology, 22(4), 449-462.

FitzGerald, E., Ferguson, R., Adams, A., Gaved, M., Mor, Y., & Thomas, R. (2013). Augmented reality and mobile learning: the state of the art. International Journal of Mobile and Blended Learning, 5(4), 43-58.

Radu, I. (2014). Augmented reality in education: a meta-review and cross-media analysis. Personal and Ubiquitous Computing, 18(6), 1533-1543.

Wu, H. K., Lee, S. W. Y., Chang, H. Y., & Liang, J. C. (2013). Current status, opportunities and challenges of augmented reality in education. Computers & Education, 62, 41-49.

AR Digest

Leeds College of Music AR screenshot

A lot has happened in the world of AR at Jisc over the past six months, so I thought it would be useful to share some links to news stories and videos that may be of interest.

Recently, I was lucky enough to film for Computerphile (a YouTube channel specialising in computing and other highly technical subjects) about AR in education and the recent Leeds College of Music project I have been working on. The first of these videos is below; it gives a broad definition of AR, demonstrates content developed specifically for education and looks at future directions with other technologies such as wearables.

If you want to learn more about the Leeds College of Music project, another video will be released on the Computerphile channel in the next few weeks. Until then, there is a great live web stream recording describing the workflow and pedagogy and demonstrating the content, available at http://lcmpanopto1.lcm.ac.uk/Panopto/Pages/Viewer.aspx?id=7e4a28ea-59f3-465c-87a5-7bb7d2725672

For those thinking of developing AR content, there are a few things you might want to consider, discussed in detail in the following Times Higher Education article.

Reflection on Digifest2015

Digifest2015, where we connect, explore, and learn.

Two months after joining Jisc as an Augmented Reality intern, I was fortunate to attend the Jisc Digital Festival conference (Digifest2015) in Birmingham and take part in one of the sessions, presenting on Future Applications of AR and the Evolution of Wearable Technology. I was really looking forward to our session, in addition to attending some of the interesting presentations on mobile learning and practices, open access, 3D technologies, e-assessment and adaptive learning.

Needless to say, the event was a great opportunity for me to connect with people from Jisc based in different offices, as well as with visitors and exhibitors. The Jisc event app had a great impact on this experience; being able to share my excitement and preparation with everybody before and throughout the event was a source of motivation.

What’s more, our session attracted a lot of interest, with people popping in and staying until the very end. This was very encouraging and rewarding for me.

During the two-day event, some buzzwords were evident: open access, mobile learning, wearables, 3D, digitisation, student engagement, enhancement, augmented reality and so on. There were a good number of sessions showcasing best practice for technology in education and research, with a focus on pedagogy and enhancement. This wasn’t a surprise, as Jisc always tries to ensure that pedagogy, not technology, is the driving force.

Although sharing best practice can be extremely useful for giving audiences shared frameworks, what I found more valuable were sessions that focused on the problems and challenges encountered in projects and discussed them with the audience. For me this type of session, such as the one I attended on “Electronic Management of Assessment”, was very inspiring: rather than simply disseminating outputs, it stimulated dialogue between presenters and audiences from different backgrounds. That discussion led to more realistic and applicable solutions for any initiative to integrate technology into education.

Throughout the event two words stuck with me, as they came up in most of the sessions: “enhancement” and “transformation”. Both are used by a lot of us, yet there is a lot of variation in how we relate to them and what they mean. While it was very exciting to see all the new innovations and technologies around us at the event, I kept asking myself what these technologies can bring to education and to learning. I think the final keynote was a brilliant way to close Digifest2015, as it answered some of these questions in my mind!

B_v5D6CWIAAENe1.jpg-large

Digital vs Human

Richard Watson brought us back to the role that technology should have in our lives: “enhancement”, for him, means supporting rather than replacing our human attributes.

“Digital technologies need to enhance human communication, not replace them”.

Richard Watson

Hearing so many examples of how digital technologies are affecting our day-to-day lives and our interactions with each other can make technology sound like a negative thing. For me it is not. However, I realise it is essential to be aware of how technology is changing our lives, as we are all involved in some capacity in enhancing learning experiences for learners. When students are also aware of how technology influences how they think and interact, we will be able to minimise its downsides and maximise its potential.

“Enhancement” in that sense can be achieved when technologies are being used to engage learners as active partners in the process of their continuous improvement and development.

Students like to see the benefit of technology, i.e. that it is not being used for the sake of it.

This is exactly our goal when planning any of our augmented reality development for FE and HE: involving students as early as possible in the planning process has been key to the success of our projects. The more effort we put into getting the students’ voice heard early in a project, the more robust the learning experience, and the more likely students are to deem the resource credible and ultimately use it.

This closing talk at Digifest2015 made me feel really proud to be part of Jisc, an organisation that aims to establish the UK as the most digitally advanced nation in the world and to unleash the extraordinary potential of our minds.

Introducing a new staff member

ATT_1421254058910_2014-09-20 14.35.40-1-1-2

Suhad Aljundi, Augmented Reality Intern

I am Suhad Aljundi.

Background

My passion for new innovations and ideas, and their application to enhancing education, drove me to pursue higher education at the University of Manchester, where I studied Digital Technologies, Communication and Education (MA DTCE) and graduated with merit.

Throughout my studies, I explored various theoretical methodologies and research, in addition to creating educational websites and multimedia resources for practical projects. I have built up substantial knowledge of how to use, implement, test and evaluate technology in an educational setting, making sure it is firmly aligned to pedagogical principles.

Working with the director of the DTCE programme to implement an e-portfolio system within one of the programme’s core units was a great opportunity for me to apply the knowledge, theories and recommendations that I researched and developed throughout my dissertation to a real, practical project.

Role

Museumandaugmentedreality.jpeg

Museum visitors using the ArtLens app on iPad to navigate the museum, both physically on site and virtually from off site, providing far-reaching access to media-rich stories about CMA’s treasured works of art.

Since then, my enthusiasm for exploring innovative ideas and technologies in education has grown, especially in employing my knowledge and expertise to develop resources focused on supporting students. The Augmented Reality and Online Resources Development internship at Jisc presented me with the perfect opportunity to put this into practice. I believe that Augmented Reality has enormous potential to impact current teaching and learning practices, if it is implemented well. During this internship, I will be working collaboratively with my line manager on developing Augmented Reality resources and online learning materials that focus on enhancing the learner experience in FE, HE and the Skills sector. This will include working on various projects, including AR-Sci, an ERASMUS+ funded European project aimed at engaging students studying STEM-based subjects with AR content.

In addition, I will be involved in disseminating good practice in AR to user communities at conferences, workshops and webinars, and in contributing to horizon scanning for ways of exploiting future technology opportunities in education, such as wearables. I therefore intend to make the most of every opportunity afforded to me to turn my creativity and knowledge into practice.

In recent years there has been enormous growth in the use of technology in education, affecting how students communicate, access resources and curate information. Understanding the changing needs of learners will be critical to investigating new ways to better engage students and provide them with a more personalised learning experience.

With this in mind, I intend to respond to these trends through developing innovative learning resources and activities using AR.

Ultimately, my goal is to drive innovation and support the Jisc user communities, promoting the effective use of innovative technologies such as AR.

I am looking forward to the challenge of learning new skills and contributing to the success of AR within Jisc; it is exciting to have the opportunity to show how innovation can help engage and inspire students across the UK, making their learner experience more valuable.

AR in the City – It’s finished!

arcityall

We are delighted to announce that the learning resource developed in the AR in the City project is now available for use. The content (best viewed on iPads), developed in collaboration with the HEA, BSA, BSC and Jisc, encourages the user to explore three fictitious parts of London, guided by a classroom activity designed to uncover correlations and stimulate discussion using socio-economic data (Census and Police). It also provides interesting facts about the different data sets: Housing, Crime and Family. While it is primarily aimed at sociology students, it could also be used more widely with both A-Level students and first-year undergraduates.

Initial feedback was positive, with one respondent indicating that it was a “…Really good overview and intro to stats” and that it “could work really well on focused topic.”

Explore or utilise this resource by downloading Junaio from the iPad App store and scanning the QR code in the postcard below (this can be printed out for ease of use). When the channel has loaded, hold the iPad over the map of London to display models from three fictitious areas – Forestminster, Pinkham and Cobalt Wharf. Clicking on each building reveals illustrated data snapshots relating to Housing, Crime and Family.

AR in the city postcard V2.0

A set of worked learning activities can be downloaded below for use with students in the classroom, in conjunction with traditional group tasks.

Learning Materials

Exercise Proforma_crime

Exercise Proforma_WorkedExample

An instructional video on how to use the resource can be viewed below:

[youtube=http://youtu.be/vXT7mmni5qY]

It is hoped that the resource will inspire similar examples in other disciplines, repurposing the idea and demonstrating multiple applications. Already, Dr Susanne Boyle, in collaboration with Glasgow Caledonian University, has developed an IPE (Interprofessional Education) resource around cochlear implants that was recently presented at the Thomas Jefferson University IPE Conference.

IMG_0026

Leeds College of Music AR Project

Screen Shot 2014-09-02 at 10.57.16

Over the past couple of months, Jisc Mimas has been leading the technical development of a new Augmented Reality resource around the music production studios at Leeds College of Music, working with Craig Golding and Ruth Clark. It aims to support students working and studying in the music studios by displaying 3D visual overlays, technical documentation and other media assets linked to the physical production equipment in front of them. Using iPads, students stand in front of the production desk; the Augmented Reality software tracks the 3D object and snaps colour-coded content accurately over it, giving the user a way to interact with the different parts and surfacing contextual material. More information about the content and student feedback will appear here in the coming weeks.

Introducing the Fabulous Frogs App: Splendid and Native

What can children learn?
The development of the Fabulous Frogs App: Splendid and Native completes the second phase of the Mapping the Museum Project. Developed using Junaio, this app is an interactive AR tool targeted at 7–11 year olds. It maps to Key Stage 2 of the National Curriculum in England and to the “responsible citizens” and “successful learners” capacities of the Scottish Curriculum for Excellence: specifically, the AR app helps to develop children’s capability to understand the environment and to use technology for learning independently. The app addresses the following learning objectives:

  • Species information, e.g. where the Splendid Leaf Frog lives, presented with a map of its geographical extent
  • Frog anatomy – using a label overlay on the trigger image
  • Frog life cycle – an interactive quiz comparing the Splendid Leaf Frog and the native Common Frog

Other fun features

Thanks to our successful collaboration with Manchester Museum, we have been fortunate to gain access to some fantastic content for the app, such as:

  • Sir David Attenborough’s introduction to the Splendid Leaf Frog during his visit to Manchester Museum’s Vivarium earlier this year
  • An in-depth description of the Splendid Leaf Frog provided through the very popular Frog Blog
  • A high-quality image of the Splendid Leaf Frog photographed by Chris Mattison (see below)

How does it work?

  • Download Junaio
  • Open Junaio and scan the QR code below
  • Hover your smartphone or iPad over the trigger image below and enjoy the interactive content
Fabulous Frogs App: Splendid and Native Trigger Image

Instructions

Fabulous Frogs App: Splendid and Native QR Code

Next steps

Andrew Gray, Curator of Herpetology, plans to implement the Fabulous Frogs App: Splendid and Native in the Vivarium Gallery over the summer; this will allow visitors to try out the app there and then in the gallery with their mobile phones. He will also be adding a link to the Virtual Vivarium on the Frog Blog, where the Google Earth KMZ file can be downloaded so that the whole Vivarium collection can be explored and viewed in the 3D globe, either at home or at school.

To conclude…

It has been a great experience working with Andrew and Adam Bland (Vivarium Assistant) at the museum, and it is an example of a very successful collaboration with Mimas in developing the Virtual Vivarium and the Fabulous Frogs App. I would also like to thank Tom Hart (User Experience Developer) for all his hard work developing the Virtual Vivarium website and the interactive Frog Life Cycle quiz, and Matt Ramirez (AR Developer) for his help and advice while I was developing the AR app.

Find out more

If you found this post interesting, you might also like to read the complementary Frog Blog post ‘Fabulous Frogs’, featuring a link to Sir David Attenborough’s Nature episode about the fabulous frogs he encountered at the Vivarium during his visit to Manchester Museum.

Initial thoughts on wearables in education

photo-2

Object-based Augmented Reality on a tower PC to enhance instructional learning

I have been fortunate enough over the last few weeks to get hold of a pair of Google Glass and do some initial research into potential use cases in education. As stated in previous posts, at present there are some serious limitations on their use in an AR capacity; probably the most worrying is that they get very hot after only a couple of minutes! However, I wanted to see if I could demonstrate a simple application where wearables could add to the learner experience rather than replicate what is already available.

The idea of the connected world is very popular at the moment; the nirvana for many is that the sensors in our devices and wearables can provide a highly engaging, informative and personalised experience, especially where practical tasks are concerned. Augmented assets could potentially complement the physical environments we work in and assist with many technical processes. With this in mind, and having the Mimas Sys Admin conveniently sat at a desk opposite, I acquired an old PC tower with a view to building an AR experience guiding a user through removing the riser-card cage. It was relatively quick to build the assets for the channel; most of the time was spent putting together the 3D CAD tracking model and the animation in Blender.

The video capture on Google Glass is relatively low spec (720p), so it was necessary to sacrifice some accuracy in the trigger model to display the content. As a result, the alignment of the 3D assets was sometimes off, but still perfectly usable. The major issue with Google Glass and third-party apps at the moment is the lack of navigation afforded to the user; put simply, there is no way of interacting with the AR environment dynamically.

In the coming weeks I hope to experiment with the Epson Moverio, porting the same experience to a wearable that offers a trackpad for user interaction and, in my mind, a more valuable experience. Metaio are about to release Junaio Mirage, first demoed at InsideAR last year, which adapts their current AR browser for delivery on wearables (Google Glass, Epson Moverio and Vuzix). Although we are a long way from the year of the wearable, it is interesting to see how these new additions to the consumer market could benefit instructional learning in the future.

Virtual Vivarium Launch @ Reptile Big Saturday Event!

I had a fabulous time this weekend at the Manchester Museum Reptile Big Saturday event! The day was full of activities, from creating a sock lizard to holding a real-life chameleon. Tom (User Experience Developer) and I (Geodata R&D Officer) were at the event to promote the new Virtual Vivarium app to visitors. We were given a fantastic large screen to project the Virtual Vivarium onto.

Virtual Vivarium in Google Earth

I had created 60 copies of a Reptile Finder Quiz Sheet, which parents and children could work through to discover key facts about the Cone-headed Lizard and the Fijian Banded Iguana. I am happy to say all the quizzes had been used by the end of the event, with many children finding the answers to all the questions and being rewarded with a chameleon Virtual Vivarium sticker!

Virtual Vivarium Sticker (many thanks to Jennifer Matthews for printing these out at short notice)

The event had a fantastic atmosphere; I enjoyed the music from the La Tinto Bros and was able to look at the tortoises on the Cheshire Chelonia Group stand.

I found that the children really engaged with the Virtual Vivarium, and parents were pleased to hear that they could use it at home. Some visitors to the stand said they would suggest the app be used in local Brownie groups and at schools. For younger children, Tom and I helped with the navigation, but children of around 8 years and above were more confident navigating the app.

Younger children required assistance with navigating the app

A Virtual Vivarium feedback form was provided so that visitors could suggest how the app might be improved in the future. We received seven responses on the day, with many others saying they would provide feedback after using the app at home (the survey is open until 30 June 2014). All respondents rated the Virtual Vivarium experience either Very Good (71%) or Good (29%). Ease of use was rated Very Good (43%) or Good (57%). All seven respondents would use the Virtual Vivarium at home, with one reason given being that it is “great for kids to learn about nature”. Areas for improvement included:

  • Making the app quicker
  • Making it more child-friendly
  • Making the areas overlay more

Others felt it was great as it is and couldn’t think of any improvements.

Me with daughter and mother after helping them to do the Reptile Finder Quiz

A great day all round, and many thanks to Vicky, Anna and Andrew at Manchester Museum for inviting Tom and me from Mimas to participate in the Reptile Big Saturday event, which was a real success with over one thousand people attending.

Mapping Amphibians and Reptiles!

As previously posted, I am working with the Vivarium team at Manchester Museum as part of the Mapping the Museum Project. One of the main objectives of the project is to illustrate the spatial distributions of the 20+ amphibians and several reptiles, reflecting all the species that live at the Vivarium. The tool selected for the final visualization was Google Earth, as this is a freely available tool that anyone can download and use either at home or in a classroom.

The challenge… to find a dataset in which the spatial distributions of amphibians and reptiles have already been mapped.

The solution… the International Union for Conservation of Nature (IUCN) 2013 Red List Spatial Data Download. This is a fabulous site that allows you to download, for free, a zip file containing a shapefile for all amphibians and reptiles. Other data is available too, e.g. mammals, corals, birds, marine fish, mangroves, seagrasses and cone snails.

Species Extraction

The shapefile from the IUCN can be viewed in a Geographical Information System (GIS) such as ArcGIS or the open-source QGIS. In ArcGIS I opened the attribute table for the ALL_AMPHIBIANS_NOV2013.shp file and used the Select by Attributes function to obtain all spatial location records for a particular species, e.g. the Oriental Fire-bellied Toad, Bombina orientalis. The selected data can then be exported as its own shapefile, which I called bombina_orientalis.shp, and viewed in ArcGIS.

bombina orientalis

Spatial distribution of the Oriental Fire-bellied Toad in ArcGIS using the IUCN dataset.
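
If you do not have an ArcGIS licence, the same selection and export can be scripted. Below is a minimal sketch using the open-source geopandas library; the species-name column (“BINOMIAL” here) is an assumption, so check the attribute table of the shapefile you downloaded.

```python
# Sketch: extract one species from the IUCN amphibians shapefile and save
# it as its own shapefile, equivalent to Select by Attributes followed by
# an export in ArcGIS. The "BINOMIAL" column name is an assumption --
# inspect the attribute table to confirm it.
import geopandas as gpd

amphibians = gpd.read_file("ALL_AMPHIBIANS_NOV2013.shp")

# Keep every polygon record for the Oriental Fire-bellied Toad
toad = amphibians[amphibians["BINOMIAL"] == "Bombina orientalis"]

# Export the selection as its own shapefile
toad.to_file("bombina_orientalis.shp")
```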

Convert to KML

Keyhole Markup Language (KML) is defined by Wikipedia (2014) as ‘an XML notation for expressing geographic annotation and visualization within Internet-based, two-dimensional maps and three-dimensional Earth browsers’. KML is used for Google Earth visualization, and hence the shapefile (bombina_orientalis.shp) needed to be converted to KML. Using the Layer To KML tool in ArcGIS, I was able to convert the shapefile to KML and view the Oriental Fire-bellied Toad polygon in Google Earth.

google_earth

Visualization of the Oriental Fire-bellied Toad distribution in Google Earth using KML.

The KML defines a string of geographic coordinates within the following tags: MultiGeometry, outerBoundaryIs, LinearRing, coordinates. The coordinates in Geo-Global (EPSG 4326) provide the location values required to visualize the spatial distribution of the Oriental Fire-bellied Toad (as shown above).
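
For anyone not using ArcGIS, the shapefile-to-KML conversion can also be done in a few lines of Python. This is a hedged sketch using geopandas with fiona’s OGR KML driver rather than the Layer To KML tool; the KML driver is not enabled by default and its availability depends on your GDAL build.

```python
# Sketch: convert the extracted shapefile to KML for Google Earth using
# geopandas/fiona instead of ArcGIS's Layer To KML tool. The KML driver
# must be switched on explicitly; whether it is available depends on the
# underlying GDAL build.
import geopandas as gpd
import fiona

fiona.supported_drivers["KML"] = "rw"  # enable the OGR KML driver

toad = gpd.read_file("bombina_orientalis.shp")

# The IUCN polygons are already supplied in EPSG 4326 (the coordinate
# system KML expects). Each feature is written as a Placemark whose
# geometry nests MultiGeometry > Polygon > outerBoundaryIs > LinearRing
# > coordinates, as described above.
toad.to_file("bombina_orientalis.kml", driver="KML")
```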

Additional information can then be provided about the species itself such as:

oriental_fire_bellied_toad

Further information provided about the Oriental Fire-bellied Toad with a relevant link to the Vivarium Frog Blog.

The methodology detailed above has been repeated for all species found at the Manchester Museum Vivarium. However, for one particular species, the Lemur Leaf Frog (Agalychnis lemur), the spatial distribution from the IUCN (green polygon outlines on the map below) was edited by Andrew Gray, Curator of Herpetology, who felt that the red polygon outlines reflect the most up-to-date spatial extent for this critically endangered frog.

IUCN_Distribution_Map_Agalychnis_lemur

IUCN spatial distribution of the Lemur Leaf Frog (green polygon outline). Vivarium’s spatial distribution of the Lemur Leaf Frog (red polygon outline)

The Virtual Vivarium App will be available to explore and try out at Manchester Museum’s Big Saturday: World of Reptiles, 11:00–15:00, 24 May 2014.