Week ??? (IDK anymore) – Final Project Prototype

In the last two weeks, I have developed my red panda model further, adding some color/texture and smoothing out the behaviors in each scene in RC. I applied the code I received from Nien to make the app one continuous experience rather than a set of individual scenes. I ran into some problems with the RC file size – each scene is rather large and takes a while to build – so I had to cut the number of scenes from 7 down to 5. I will need to fix up the 5 physical scenes (on paper) to match.

Here is the link!

Next steps:

  • smooth out the transition from the previous scene by applying a notification trigger in the RC behaviors (see the sketch after this list)
  • the tapping functionality is sometimes off – figure out what is going on there (for example, sometimes when I tap the globe, it doesn’t spin)
  • potentially add a voice-over trigger for page readings
  • add an RC behavior so that the pandas follow the user’s camera
  • wire up an “adopt” button on the last page of the UI so that each time the user presses it, a new panda pops up
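
For the first item, here is a minimal sketch of posting an RC notification from Swift. “Experience,” “Page3,” and “nextPage” are all placeholder names for the generated RC project code, the current page’s scene, and the notification trigger authored in the RC behaviors:

```swift
import RealityKit

// Load the current page's scene and keep a handle to it.
// ("Experience" and "Page3" are placeholder names.)
func presentPage(in arView: ARView) throws -> Experience.Page3 {
    let page = try Experience.loadPage3()
    arView.scene.anchors.append(page)
    return page
}

// Posting the notification fires whatever action sequence the RC
// behavior attaches to it (e.g., a fade/hide of the outgoing scene).
func transitionOut(from page: Experience.Page3) {
    page.notifications.nextPage.post()
}
```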

Week 11 – Functional Prototype for Final Project

This week we were given the task of creating a functional prototype of our final project. My final project revolves around creating an interactive AR storytelling experience about the endangered red panda.

This week, I focused on putting all my assets into RC and creating a very basic prototype of what I imagine my final project to be. I mapped out the storyboard and the ideal interactions I hope to carry out. I initially had trouble referencing multiple scenes (i.e., different pages) from RC in Xcode, and that caused a lot of problems. I also spent a lot of time modeling 3D assets in Blender and texturing them in Substance Painter. Since I spent more time building and importing other assets, I didn’t get a chance to bring my panda up to the ideal image yet.
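
For reference, the usual pattern for referencing multiple RC scenes from Xcode looks roughly like the sketch below – Reality Composer generates one load function per scene, so each page comes back as its own anchor entity (“Experience,” “Page1,” and “Page2” are placeholder names):

```swift
import RealityKit

func loadPages(into arView: ARView) {
    // One generated load function per scene in the .rcproject.
    if let page1 = try? Experience.loadPage1(),
       let page2 = try? Experience.loadPage2() {
        // Each scene is its own anchor entity, so pages can be added
        // to (and removed from) the AR scene independently.
        arView.scene.anchors.append(page1)
        arView.scene.anchors.append(page2)
    }
}
```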

For the screen design, I wasn’t exactly sure how to go about it because, as of now, RC detects the image on each page and triggers actions, rather than responding to an interface on the screen. I will need to think more about how to vamp up the interactivity. For future work, I was thinking I could add more interactivity on screen by including buttons/labels/etc. that trigger some sort of popup notification, letting the user learn more about something or go to a link. It would also be cool to have the assets “pop out” of the book! Not sure yet.
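
As a rough sketch of that popup idea: a button overlaid on the AR view could present an alert with a link out. The title, message, and URL below are placeholders, not final content:

```swift
import UIKit

class OverlayViewController: UIViewController {
    // Hypothetical handler for a "Learn More" button laid over the AR view.
    @IBAction func learnMoreTapped(_ sender: UIButton) {
        let alert = UIAlertController(
            title: "Red Pandas",
            message: "Red pandas are an endangered species.",  // placeholder text
            preferredStyle: .alert)
        alert.addAction(UIAlertAction(title: "Learn more", style: .default) { _ in
            if let url = URL(string: "https://example.com/red-pandas") {  // placeholder URL
                UIApplication.shared.open(url)
            }
        })
        alert.addAction(UIAlertAction(title: "Close", style: .cancel))
        present(alert, animated: true)
    }
}
```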

My end-to-end user journey is still a work in progress, but you can get a sense of what the story journey is like. For future work, I would like to figure out how to create a start and end page so that the user journey is clearer. I am just not sure what the code for that would look like, because we’ve mostly been creating in-the-moment AR experiences. As of now, I’ve been working on separate scenes rather than one coherent one, so when an image anchor is detected, it triggers the interaction for that anchor, causing overlap with the previous page. I’m not sure how to make the previous page’s assets go away or make certain assets (like the pandas) stay.

Here’s a link to this problem.
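
One possible way to handle the overlap, sketched under the assumption that each page loads as its own anchor entity: keep a reference to the page currently on screen, remove its anchor before adding the next one, and re-parent anything that should persist (like the pandas) onto the new page first:

```swift
import RealityKit

var currentPage: (Entity & HasAnchoring)?

// Swap pages: drop the previous page's anchor (and all of its assets),
// then show the next one.
func show(page next: Entity & HasAnchoring, in arView: ARView) {
    if let previous = currentPage {
        arView.scene.anchors.remove(previous)
    }
    arView.scene.anchors.append(next)
    currentPage = next
}

// An entity that should survive the swap (e.g., a panda) can be
// re-parented onto the incoming page while keeping its world position:
//   panda.setParent(next, preservingWorldTransform: true)
```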

Here is a link to the pages I have put into RC so far.

I have 3 more pages that I would like to add, but I wanted to see how to make the story/experience coherent before putting those into RC. I’m just not sure how I would begin to go about the code for that.

Moving forward, here are some tasks I still need to work on/need help with:

  • Creating a final physical storybook
  • Editing and texturing my assets, mainly the red panda
  • Creating + designing a start and end page
  • Figuring out how to make the story one whole journey (as of now it’s just scene by scene)
  • Integrating user interactions through the UI?

Week 7 – Midterm Pt 2

For this week’s assignment – our midterm – I built on the idea of what I created last week. I wanted to create an experience that would mimic a farm dog herding/leading its flock of chickens. Of course, during the process of creating this app, I ran into various challenges.

First, I had difficulty adjusting the constraint layouts of the buttons. In Xcode, I wanted the buttons laid out as shown below to give the user more screen space for the animals, but I could not figure out how to organize the constraints the way the screenshot shows. I would change the constraints of each individual button, but the layout always ended up even messier and more disorganized, so I left it in the plane-detect storyboard format we created during class. This layout of buttons still worked, though, as there was enough space to see the animals. In the future, I will need to go back and adjust the constraints to allow for maximum screen space.

[Screenshot: intended button layout]
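
One way around the storyboard fight, sketched below, is to lay the buttons out in code: a stack view pinned to the bottom safe area keeps them evenly spaced and leaves the rest of the screen to the AR view (sizes here are illustrative):

```swift
import UIKit

func layoutButtons(_ buttons: [UIButton], in view: UIView) {
    // A horizontal stack keeps the buttons evenly spaced without
    // per-button constraints.
    let row = UIStackView(arrangedSubviews: buttons)
    row.axis = .horizontal
    row.distribution = .fillEqually
    row.spacing = 8
    row.translatesAutoresizingMaskIntoConstraints = false
    view.addSubview(row)

    // Pin the row to the bottom safe area so the AR content keeps
    // the rest of the screen.
    NSLayoutConstraint.activate([
        row.leadingAnchor.constraint(equalTo: view.safeAreaLayoutGuide.leadingAnchor, constant: 16),
        row.trailingAnchor.constraint(equalTo: view.safeAreaLayoutGuide.trailingAnchor, constant: -16),
        row.bottomAnchor.constraint(equalTo: view.safeAreaLayoutGuide.bottomAnchor, constant: -16),
        row.heightAnchor.constraint(equalToConstant: 44)
    ])
}
```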

I also ran into a problem where I could not get the chickens added as children. After speaking to Nien, we figured out the cause: I had set the default scale of the entities in Reality Composer to something very small, so when I pressed “add a chick,” a chick was being added but it was too small to see. I had originally set the scale small because, in the past, when I imported models into RC and put them into the AR app, the real-time scale of the entity was huge (as with my sheep in the previous week’s assignment). However, I realize now that scale is something you can and should adjust through code in Xcode, not by shrinking the model in RC.
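
In that spirit, a minimal sketch of sizing each chick in code at the moment it is added – the 0.05 scale and the names are illustrative, not values from my project:

```swift
import RealityKit

// Keep the model at scale 1.0 in RC and size each clone in code instead.
// ("template" stands in for the chick entity loaded from the RC scene.)
func addChick(from template: Entity, to anchor: AnchorEntity) {
    let chick = template.clone(recursive: true)
    chick.setScale(SIMD3<Float>(repeating: 0.05), relativeTo: nil)  // clearly visible
    chick.position = [Float.random(in: -0.2...0.2), 0, Float.random(in: -0.2...0.2)]
    anchor.addChild(chick)
}
```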

In addition, I tried to let the user alter the scale of the added entities. I wrote code meant to create a relationship between the slider and the size of the chicks (to mimic whether the added chick was a baby or an adult), but the slider didn’t end up working. I’m not sure whether the code I wrote had incorrect logic or the connection created in Xcode was faulty.
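
For reference, the slider-to-scale relationship I was going for might look something like the sketch below, assuming the slider’s action is connected in the storyboard (all names are placeholders):

```swift
import UIKit
import RealityKit

class FarmViewController: UIViewController {
    var chicks: [Entity] = []   // filled in by the "add a chick" button

    // Map the slider's value (e.g., 0.5–1.5) straight onto each chick's
    // scale, so low values read as "baby" and high values as "adult".
    @IBAction func sizeSliderChanged(_ sender: UISlider) {
        let scale = SIMD3<Float>(repeating: sender.value)
        for chick in chicks {
            chick.setScale(scale, relativeTo: nil)
        }
    }
}
```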

Here is the link to the screen recording!

In the future, it would have been fun to experiment with having the entities jiggle or wobble back and forth in real time to resemble “walking” when the user pressed the “move to cursor” or “move right” buttons. I didn’t know whether this was possible or would cause a crash, because I knew you could trigger a notification that causes an entity to jiggle through RC behaviors, but I didn’t know how to link up two actions – one on the Swift side and the other on the RC side. When I tried writing the code for it and wiring it up, the build failed or my screen just stayed frozen.
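
Looking back, one plausible way to link the two sides – with hypothetical names throughout – is to do the translation in the Swift button handler and let an RC behavior with a notification trigger play the wobble:

```swift
import UIKit
import RealityKit

class HerdViewController: UIViewController {
    var farmScene: Experience.Farm?   // placeholder generated scene type

    @IBAction func moveRightTapped(_ sender: UIButton) {
        guard let chicken = farmScene?.chicken else { return }

        // Swift side: animate the translation.
        var destination = chicken.transform
        destination.translation.x += 0.1
        chicken.move(to: destination, relativeTo: chicken.parent, duration: 1.0)

        // RC side: fire the behavior whose notification trigger is
        // named "jiggle" (a placeholder identifier) to play the wobble.
        farmScene?.notifications.jiggle.post()
    }
}
```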

Week 6 – Interactivity

This week we were given the task of creating an interactive AR app. Using Xcode’s storyboard, I attempted to design a very basic user interface. I continued my previous week’s farm animal theme and wanted to create an interface where the user could add more animals to the farm – particularly sheep and chickens. Initially, I wanted a page-swipe component so that the user could add sheep on one page and chickens on another, but it seemed really complicated and I just ended up going into the deep web of Stack Overflow and confusing myself more. So, instead, I just put in two buttons that load each animal’s entity. However, as I was laying out the components, I had a lot of difficulty trying to break the links created from the storyboard to the code and to assemble the constraints so that what is seen on the storyboard is reflected on my device.

[Screenshot: old version of the interface]
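
For the record, the two-button approach boils down to something like this sketch, assuming the RC project is named “Experience” with scenes “Sheep” and “Chicken” (placeholder names):

```swift
import UIKit
import RealityKit

class AddAnimalsViewController: UIViewController {
    @IBOutlet var arView: ARView!

    // Each tap loads a fresh copy of that animal's RC scene
    // and drops it into the world.
    @IBAction func addSheepTapped(_ sender: UIButton) {
        if let sheep = try? Experience.loadSheep() {
            arView.scene.anchors.append(sheep)
        }
    }

    @IBAction func addChickenTapped(_ sender: UIButton) {
        if let chicken = try? Experience.loadChicken() {
            arView.scene.anchors.append(chicken)
        }
    }
}
```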

Here’s the link to the video!

Week 5 – ARKit & Swift

This week we were given the task of creating two programs – Fibonacci and Lyrics – in Swift. Below are the screenshots of the programs and the app interface when you tap on the screen.
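
(The screenshots carry the actual assignment code; purely for a sense of the Fibonacci half, a minimal iterative sketch in Swift might look like this:)

```swift
// Build the first n Fibonacci numbers iteratively.
func fibonacci(_ n: Int) -> [Int] {
    var sequence = [0, 1]
    while sequence.count < n {
        sequence.append(sequence[sequence.count - 1] + sequence[sequence.count - 2])
    }
    return Array(sequence.prefix(n))
}

print(fibonacci(10))   // [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
```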

The second part was the creation of our own one-button AR app. For this portion, I wanted to continue my farm animal streak from the weeks before. I imported a grass field mesh and a bunny mesh into Reality Composer. I added the grass field and the bunnies to the origin anchor, so that when you enter AR this view serves as the world.

For user interactivity, each tap causes a carrot to appear, as if you were feeding the bunny. I wanted to try to make the bunny scale up in size to indicate that it “ate” the carrot, but I was having trouble figuring out the code for it, so that would be the next step for this project. It would also be really cool if I could have the bunny the carrot drops on scale up, and have the carrot removed after a set period of time.
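
A sketch of those two follow-ups – growing the bunny slightly per carrot and clearing the carrot after a delay. “bunny” and “carrot” stand in for the actual entities in my scene, and the numbers are illustrative:

```swift
import Foundation
import RealityKit

func feed(bunny: Entity, with carrot: Entity) {
    // Animate the bunny scaling up by 10% to show it "ate" the carrot.
    var grown = bunny.transform
    grown.scale *= 1.1
    bunny.move(to: grown, relativeTo: bunny.parent, duration: 0.5)

    // Remove the carrot a few seconds after it lands.
    DispatchQueue.main.asyncAfter(deadline: .now() + 3) {
        carrot.removeFromParent()
    }
}
```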

Here’s the link!

Model Credit: Grass: Somersby, Bunny: yellowkab