Case study 1:
Pyrus – Burn Holograms.
(Research sourced from https://www.bestfolios.com/casestudy/pyrus:burnholograms)
Pyrus is a healthcare company that uses AR, VR, MR and other useful technologies to develop tools that supplement the healthcare industry.
In December 2017 they produced a product focused on visualising burns. They developed a way for burn victims to use this product (shown below) to see what their burn will look like as it progresses. For this they used a pair of glasses with an in-built camera to show a hologram of the burn's progression.
These glasses use mixed reality (MR) to deliver the experience to the patient. One problem they faced was inaccuracy in the visuals the technology produced for patients. One example they gave was that burns on joints can develop specific stretch marks that can heal over time if the patient does regular exercise; however, each patient is unique and individual, so the visuals could be inaccurate for some patients' circumstances.
However, they managed to turn this issue around and find a solution. They decided to use a HoloLens that the patient would put on; this HoloLens would use image recognition to create a 3D mesh of the patient's burn and skin, which was then overlaid on the actual burn area. Creating a new 3D mesh for each specific patient using the HoloLens meant that each patient could see the progression of their own specific burn over time, which solved the problem of not being able to do this for each individual before.
The below image displays their explanation of how they developed the 3D mesh.
This inspired me to think about how I could use AR in a healthcare setting. It made me consider innovative ways that I could use AR to enhance healthcare. This is where I thought of my idea of allergies AR (explained in further depth in the 'planning' section of this blog).
Case study 2:
Disney meet'n greet – HeroMirror Interactive Augmented Reality Experience at the 2018 GHC Women in Computing Celebration for Disney
(Research sourced from: https://www.indestry.com/hero-mirror-augmented-reality-experience-at-ghc-for-disney)
The company INDE installed a 'hero mirror' for Disney. This concept is simply a screen shaped like a mirror where people can view themselves next to a famous Disney character, as shown below using the example of Minnie Mouse.
When users tried it, it played a countdown timer, took their photo and was able to instantly print their photos with a stand-by printer. It also had the option of emailing their photos to them, which worked successfully too. I found this interesting as I had always associated successful AR with 3D content; however, this use of a 2D character in AR seems very effective and interactive. This is something that will inspire me when I make my final product: the use of 2D in AR can still be very effective and interactive.
The initial action I took for developing some concept ideas was to draw out a mind map; this would help me to brainstorm and then collect any useful ideas. This mind map is displayed below.
My main developed concept ideas from the mind map:
Car safety –
One idea I had was to incorporate car safety into AR by letting the user experience a car crash through AR on their phone. This would allow the user to walk around the crash scene and see everything that has happened. This would be really effective as it would emphasise the dangers of driving irresponsibly and show users the reality and severity of car crashes that result from a driver's actions. This would hopefully make users question how responsibly they drive and be more cautious on the road.
Fridge contents –
Another idea was to use AR for fridge contents. The way this would work is that when a user holds their phone up to the fridge, it shows them what they need to buy. This could be taken further so that it automatically adds those items to the user's shopping list. This would be effective and really useful; however, because it would take extensive programming to read the contents of the fridge and determine what is needed, I will not be developing this idea.
Allergies AR –
This idea involves allergies and, if it worked, could even save lives. The concept is that when a user holds their camera up to a certain allergic reaction, their phone tells them which allergy it is and how to treat it. This would be helpful as people tend to catch allergic reactions without knowing where from, but this would help them establish that and then treat it. Not only is it good for the individual user, it also means that if you came across somebody having a severe allergic reaction you could use the AR to establish how to help that person and save their life. This is an amazing concept; however, I have chosen not to develop it further as it would take extensive programming, it wouldn't necessarily work in conditions such as poor lighting in the environment where the AR is used, and many allergies appear the same, so this could confuse the AR or even lead to wrongly treating an allergic reaction.
Music: Spotify song codes –
This concept was inspired while scrolling through Spotify: I was struggling to find a specific song and couldn't remember how to spell it. However, I then realised that Spotify song codes exist. These are codes, similar to QR codes, that when scanned with the camera take you to Spotify and start playing the song. An example of this is shown below:
This is something I thought I could use for the 'hidden in plain sight' project. The way I would like to implement it is that when somebody types in a specific song, the song code appears in their AR environment; when they then tap on it, it takes them straight to the song on Spotify. This could be further developed so that the user can place more and more of these codes within their AR environment, which is then saved so that they can come back to it and have their favourite songs all logged in their own AR environment. This would save people time, as they just open their own AR program and have their collection of songs. It could be developed even further so that they could share their AR environment of collected song codes with their friends to explore and listen to (the same concept as a playlist, but in an AR environment).
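As a sketch of how song codes could be pulled into such an AR environment programmatically: Spotify code images can be fetched from the scannables.scdn.co endpoint. Note that this URL pattern is unofficial and undocumented, so it may change, and the track ID used below is a hypothetical placeholder rather than a real song's ID.

```python
def song_code_url(track_id: str,
                  background: str = "ffffff",
                  bar_colour: str = "black",
                  size: int = 640) -> str:
    """Build the URL of the scannable Spotify code image for a track.

    Uses the unofficial scannables.scdn.co URL pattern; the colours are
    the background hex and the bar colour ('black' or 'white').
    """
    uri = f"spotify:track:{track_id}"
    return (f"https://scannables.scdn.co/uri/plain/jpeg/"
            f"{background}/{bar_colour}/{size}/{uri}")


# Hypothetical placeholder ID — substitute the real one copied from Spotify.
print(song_code_url("TRACK_ID_HERE"))
```

Downloading one image per typed-in song like this would let the AR environment generate a fresh code on demand instead of storing them all up front.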
This Spotify song code idea is my favourite and the one that I think is most realistic for me to develop using Adobe Aero.
Spotify song code
My idea for this, as stated above (see 'Spotify song codes' in the planning section for a full explanation), is to create an environment in which users can play their favourite song using AR.
Developing my idea
I started by drawing a sketch of what I envisioned my AR object to look like. Below shows this concept.
The idea was to create a Spotify-themed play box for the user's song code: the play button hovers on top of the box and is tapped to play the song, and the song code is tapped to take the user to the Spotify link for the specific song.
One problem I faced was realising that I wouldn't be able to make an official working Spotify code, as I would need to partner with Spotify to develop one. My solution was to choose a specific song and use its existing song code to develop my idea as close to my initial concept as I possibly could.
I went on Spotify and located a song to use. I chose the song 'Gives You Hell' by 'The All-American Rejects'. I used the app on my phone to find the song code for it, as shown below.
I put this image into Photoshop and used the lasso tool to cut out the song code as shown below.
The next step I took was to look at Spotify's themes in terms of colour and design. After looking through Spotify, I noticed that they use duotones and gradient colours a lot, as displayed below, and I decided this was something I wanted to integrate into my work to fit the aesthetic and make it more appealing to the user than a plain colour. Another thing I noticed was that Spotify use circular and rounded shapes, so this is also something I wanted to apply to my work.
(All three sources below sourced from: Spotify)
After seeing a lot of pink and blue on Spotify, I decided to use that as inspiration for my colour scheme. I used a pink-to-blue gradient and overlaid it on top of the Spotify song code to get an aesthetic look. I then rounded the corners of the shape, to subtly incorporate the rounded shaping that Spotify use. This is shown below.
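The same gradient-and-rounded-corners treatment can also be scripted rather than done by hand in Photoshop. A minimal sketch using the Pillow library, assuming a 640×160 export of the song code; the plain white stand-in image here takes the place of the real lassoed png:

```python
from PIL import Image, ImageDraw

W, H = 640, 160  # assumed dimensions of the exported song-code image

# Build a horizontal pink-to-blue gradient, matching the Spotify-style duotone.
pink, blue = (255, 64, 160), (64, 96, 255)
gradient = Image.new("RGB", (W, H))
for x in range(W):
    t = x / (W - 1)
    colour = tuple(round(p + (b - p) * t) for p, b in zip(pink, blue))
    for y in range(H):
        gradient.putpixel((x, y), colour)

# Blend the gradient over the song-code image at partial opacity.
# Stand-in for Image.open("code.png") — a hypothetical file name.
code = Image.new("RGB", (W, H), "white")
blended = Image.blend(code, gradient, alpha=0.6)

# Round the corners with an alpha mask, echoing Spotify's rounded shapes.
mask = Image.new("L", (W, H), 0)
ImageDraw.Draw(mask).rounded_rectangle([0, 0, W - 1, H - 1], radius=24, fill=255)
blended.putalpha(mask)
blended.save("code_gradient.png")
```

Tuning the `alpha` value trades off how visible the code's bars stay against how strong the duotone effect looks.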
One problem that I had:
I wanted to make this Spotify song code into a 3D object as planned; however, I didn't have access to Maya in time to do so.
I began to experiment with things that might make it look 3D in Photoshop. The best I could do was to add a bevel and emboss effect, which is displayed below. This didn't work: it didn't look as aesthetic as I wanted, and it didn't look the same when I imported it into Aero.
I also tried to make a 3D shape by using Photoshop to create the sides, top and bottom of the object I envisioned; however, this didn't work as Aero doesn't merge layers together like Maya or Photoshop would. Because this didn't work, I had to think of a way to achieve my original concept just as effectively but in a more basic way.
I chose to go back to my original 2D png image, as shown before, and this seemed to be much more effective. I remembered from my research into case study 2 that something 2D can still be just as interactive and effective as a 3D object or character. My solution to making it more interactive was to add subtle animations, to make the user feel in control of the song code and to give my png personality and character.
Developing my solution
I started by drawing another sketch of how I envisioned my 2D interactive Spotify code, as shown below.
My idea was to create an animated 2D png that is interactive and fun for the user. I wanted it so that when the user taps the png it bounces up once and, when it lands, it plays the song. Another problem I had was that I couldn't make it so that a second tap takes the user to the link of the Spotify song. My solution was to automatically send the user to the link once the song has played for a minute.
I made an audio recording of the song, as Aero only accepts .mp3 and .wav files.
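As a small aside on the audio format: a valid .wav file of the kind Aero accepts can be generated directly with Python's standard wave module, which is handy for testing an Aero behaviour before the real recording is ready. The file name and one-second test tone here are illustrative:

```python
import math
import struct
import wave

# Write a one-second 440 Hz test tone as 16-bit mono PCM — a minimal
# stand-in .wav for checking audio playback behaviour in Aero.
RATE = 44100  # samples per second

with wave.open("test_tone.wav", "wb") as wav:
    wav.setnchannels(1)   # mono
    wav.setsampwidth(2)   # 16-bit samples
    wav.setframerate(RATE)
    frames = b"".join(
        struct.pack("<h", int(32767 * 0.3 * math.sin(2 * math.pi * 440 * n / RATE)))
        for n in range(RATE)
    )
    wav.writeframes(frames)
```

Swapping in the real recorded song is then just a matter of exporting it from the recorder as .wav (or .mp3) instead.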
Below displays my final Spotify code png.
Now that I had everything made and ready, I just had to open Aero and create the finished animated, interactive png on my computer.
I imported the Spotify song code png into Aero and centred it at 0.0 on X, Y and Z. This means it will be at the centre of the anchor point when the user opens the interface, making the png quick and easy to find.
I used the 'behaviour' section of Aero to create the series of animations I wanted. I managed to complete the animations to what I envisioned: it plays the .wav audio file when tapped, and once the audio has finished playing it directs the user to the Spotify link. Below shows my order of behaviours in Aero.