One of the key foundations of any successful project is a plan that accounts for surprises. For Project Hindsight, that plan has to cover an experience with unique facets, such as merging live-action footage with realistic interaction with CG objects.
The challenge with this type of deliverable is that it is neither a film nor a game. Because it fits neither mold, even the pipelines and workflows have to be unique to this project, marrying best practices from both worlds to inform our planning choices and decisions.
Our goal is to develop rapid prototypes on a weekly or bi-weekly schedule that let us explore different challenges and possible solutions. These prototypes will be self-contained, each with its own test goals and challenges we want to overcome before our principal photography phase later in March.
In this blog post, I will delve deeper into our first prototype, along with a technical analysis by Sunil Nayak, Project Hindsight’s programmer and sound designer.
Human Tripod Prototype One, titled “Igor”
Week 2 Prototype: Lessons Learned
The goals for week 2 were as follows:
- Develop a VR environment with a 360 video and place 3D models (cans of soda) so that they appear to be part of the environment
- Add spatial sound
- [Stretch goal] Oculus + Touch integration
We learned a lot while attempting this. Lighting was the biggest issue: the lights were either too bright for the video (yet not bright enough for the 3D models) or too dim for the video (and even worse for the models). Shadows are necessary, but when the lights were too bright the shadows washed out, and when they were dimmed everything was too dark. We also tried a directional light to illuminate the video sphere, but the results were just as bad.
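One common way to escape exactly this tug-of-war, assuming a Unity-style engine (the post does not name the engine, and the object names below are hypothetical), is to render the 360 video sphere with an unlit shader so scene lights never affect it, then tune the lights purely for the CG props. A minimal sketch:

```csharp
using UnityEngine;

// Hypothetical setup: decouple the 360 video sphere from scene lighting
// so lights can be tuned for the 3D models (soda cans) alone.
public class SplitLightingSetup : MonoBehaviour
{
    public Renderer videoSphere; // sphere with the 360 footage mapped on its inside
    public Light keyLight;       // light intended only for the CG props

    void Start()
    {
        // An unlit shader plays the video back at its baked-in brightness,
        // ignoring all scene lights entirely.
        videoSphere.material.shader = Shader.Find("Unlit/Texture");

        // Restrict the light to a dedicated layer (here assumed to be
        // named "Props") so it can be as bright as the models need
        // without ever touching the video sphere.
        keyLight.cullingMask = LayerMask.GetMask("Props");
    }
}
```

One caveat with this split: the unlit sphere no longer receives shadows, so grounding the soda cans would need a separate shadow-catcher surface — which connects back to the shadow problem described above.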