Nature VR App

From LVL1
Revision as of 01:14, 19 December 2018 by Jhlink (talk | contribs)

As a recipient of the September Makership from LVL1, I will document my progress here and provide general guides about the development of this VR app.

Weekly progress updates will be provided every Tuesday through a Medium article release.

A Scavenger Hunt 'Nature' VR Adventure

Description

This project targets the Google Cardboard VR platform for wide availability. Essentially, the user views a sequence of nature scenes accompanied by a narration of Ralph Waldo Emerson's 'Nature' essays. In a given scene, the user must search for a unique point of interest to unlock the next sequence.
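The wiki does not specify how the point of interest is detected. Assuming a gaze-dwell mechanic (common on Cardboard, which offers at most a single button), a minimal Unity sketch might look like the following; all names here are illustrative, not taken from the actual app:

```csharp
using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.Events;

// Attach to a hidden point of interest. When the user's gaze dwells on it
// long enough, an "unlocked" event fires, which can advance to the next scene.
public class PointOfInterest : MonoBehaviour, IPointerEnterHandler, IPointerExitHandler
{
    public float dwellSeconds = 2f;   // how long the user must keep looking
    public UnityEvent onUnlocked;     // hook scene-advance logic up in the Inspector

    private float gazeTime;
    private bool gazing, unlocked;

    public void OnPointerEnter(PointerEventData e) { gazing = true; }
    public void OnPointerExit(PointerEventData e)  { gazing = false; gazeTime = 0f; }

    void Update()
    {
        if (unlocked || !gazing) return;
        gazeTime += Time.deltaTime;
        if (gazeTime >= dwellSeconds)
        {
            unlocked = true;
            onUnlocked.Invoke();   // e.g. load the next nature sequence
        }
    }
}
```

This relies on the Cardboard reticle pointer routing gaze through Unity's EventSystem, which is the standard setup when the GVR pointer input module is in the scene.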

This Medium article [1] explains more about the app description, inception, and interaction overview.

Footage

Generating 360° footage is expensive and time-consuming, and existing footage is scarce. This Medium article [2] details this further.

Audio

Audio content is comparatively easy to find and process. This Medium article [3] details this further.

Interaction Control

A general overview of the interactions to be covered is addressed in the second section of this Medium article. [4]

Progress Updates

Log #1

General updates and shared experiences can be found at this Medium link. [5]

Milestone #1

The first milestone has been reached: video post-processing is finally complete! See the first section in this post for more details. [6]

Unity Guides

Google Cardboard Controls

Implementing event handlers within Unity for the Google Cardboard platform is wonderfully easy once a basic understanding is established. Here's a short five-minute read [7] on how to implement onClick, onHold, and onHover event handlers in Unity!
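As a rough sketch of what that article covers: on Cardboard, the gaze reticle raycasts through Unity's EventSystem, so the standard `UnityEngine.EventSystems` pointer interfaces give you click, hold, and hover behavior on any object with a Collider. The class and log messages below are illustrative, not from the linked article:

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Attach to any GameObject with a Collider; the Cardboard reticle pointer
// drives these callbacks through Unity's EventSystem.
public class GazeTarget : MonoBehaviour,
    IPointerClickHandler, IPointerDownHandler, IPointerUpHandler,
    IPointerEnterHandler, IPointerExitHandler
{
    private bool held;

    // onClick: the trigger was pressed and released while gazing at this object
    public void OnPointerClick(PointerEventData eventData)
    {
        Debug.Log("Clicked " + gameObject.name);
    }

    // onHold: approximated by tracking pointer down/up and polling in Update
    public void OnPointerDown(PointerEventData eventData) { held = true; }
    public void OnPointerUp(PointerEventData eventData)   { held = false; }

    // onHover: gaze entered or left this object
    public void OnPointerEnter(PointerEventData eventData)
    {
        Debug.Log("Hover start on " + gameObject.name);
    }

    public void OnPointerExit(PointerEventData eventData)
    {
        Debug.Log("Hover end on " + gameObject.name);
    }

    void Update()
    {
        if (held) { /* per-frame onHold behavior, e.g. fill a progress ring */ }
    }
}
```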

360 Video Streaming and Handling

This article outlines the steps required to stream, download, and play 360° video content in Unity. [8] A rudimentary explanation of how to implement a 360° video player within Unity is provided here. [9]
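One common way to wire this up, which I assume resembles the approach in the articles above, is Unity's built-in `VideoPlayer` component streaming an equirectangular video into a `RenderTexture` that a panoramic skybox material (or an inward-facing sphere) samples from. The URL below is a placeholder:

```csharp
using UnityEngine;
using UnityEngine.Video;

// Streams an equirectangular 360 video into a RenderTexture that a
// panoramic skybox material (or inward-facing sphere) displays.
public class Video360Player : MonoBehaviour
{
    // Placeholder URL; the real app supplies its own streamed/downloaded footage.
    public string sourceUrl = "https://example.com/clip.mp4";
    public RenderTexture targetTexture;   // assigned in the Inspector

    void Start()
    {
        var player = gameObject.AddComponent<VideoPlayer>();
        player.source = VideoSource.Url;            // stream rather than bundle
        player.url = sourceUrl;
        player.renderMode = VideoRenderMode.RenderTexture;
        player.targetTexture = targetTexture;
        player.prepareCompleted += p => p.Play();   // start once buffering is done
        player.Prepare();                           // begin asynchronous buffering
    }
}
```

Rendering to a `RenderTexture` decouples playback from display, so the same frame can feed a skybox, a sphere, or a preview quad without changing the player code.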

The source code and a detailed explanation about the architecture and its utilization may be found at this GitHub repository. [10]