A Virtual Reality Experience of Vertigo with Heights


I’ve always viewed Virtual Reality as something that could challenge the mind, and I thought a tightrope walk between two skyscrapers high above a city would make a great experience. Strangely, I searched for such an app but never found one.

So I thought I’d make one.


Philippe Petit, a man who walked between the Twin Towers in 1974.


Initial Thoughts/Challenges

For this experience to be effective, it had to look as close to reality as possible: a convincing, near-photo-realistic city. Audio is an important part of VR, so the sound of wind blowing would contribute to the immersion. Fog at that height might make a great addition, too. What else would frighten the user and contribute to the harrowing sensation of being so high up in the sky with no safety below?

Tools Used

The following tools were used to craft the experience:


Challenge #1

After carefully designing the model in Cinema 4D, it became evident how difficult and time-consuming it is to export a model with textures and animation to Unity. Textures are a pain to reapply in Unity, and the UV maps always come out different and need to be remapped. In the end it was faster to obtain a city from the Asset Store. I did, however, keep the Twin Towers I had modeled in C4D.

Challenge #2

For this city to look real, every detail in the cityscape had to be carefully attended to: the work on specular and bump maps, lighting, and texturing became immense for every building.

Challenge #3

The Android processor. I wasn’t developing for an Oculus DK1 or a HoloLens, so with Android’s limited processing power, all of the detail and exquisite sun lighting from Challenge #2 were out. Camera lag and app crashes were constant. I had to settle for a very plain, low-poly, untextured city. What a disappointment. I had never exported that much detail to Android before, and I learned my lesson.

Challenge #4

Once I was satisfied with the scene in Unity, there was the key issue of user movement. Using the mobile device’s accelerometer, I wanted the user to actually walk along the tightrope to the other building virtually. I did get it working, but because of the already over-taxed processor, the movement lagged and only worked intermittently. It just wasn’t reliable, and the stuttering motion didn’t feel real because the walking wasn’t constant and fluid. Despite several attempts to cut a good deal of geometry from the buildings and textures, the user’s walking interaction never worked well enough to produce a sense of reality.
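The accelerometer-driven walking described above boils down to a simple step detector: count a step whenever the acceleration magnitude crosses a threshold, then advance the camera one stride per step. Here is an illustrative sketch in Python — not the project’s actual Unity code — where the threshold and sample values are invented for demonstration:

```python
# Illustrative sketch (not the project's actual Unity script): detect
# "steps" from raw accelerometer magnitudes by peak-thresholding, the
# same idea a Unity script polling the device accelerometer would use
# to advance the camera along the tightrope.

def detect_steps(magnitudes, threshold=1.2):
    """Count a step each time the acceleration magnitude rises above
    the threshold after having been below it (one count per peak)."""
    steps = 0
    below = True
    for m in magnitudes:
        if below and m > threshold:
            steps += 1
            below = False
        elif m <= threshold:
            below = True
    return steps

# Simulated magnitude samples (in g): two clear peaks -> two steps.
samples = [1.0, 1.05, 1.4, 1.1, 0.95, 1.0, 1.5, 1.2, 1.0]
print(detect_steps(samples))  # prints 2
```

With a slow or busy processor, samples arrive late or get dropped, so peaks are missed and the camera advances in irregular bursts — which is exactly the unreliable, non-fluid walking described above.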

I decided to make the experience a 360-degree VR video instead and have the user follow along. It was no longer interactive, but it was still an experience that allowed some semblance of realism, fear, and vertigo.


VR is never complete without audio through headphones, so the sound of wind was added. Ambisonic or binaural audio would have made the realism even greater, but I didn’t have the know-how to implement it. Just a thought…

User Testing


Harry Josh, Hairstylist

Harry placed the Samsung Gear VR on his head and, after walking a moment, immediately exclaimed, “Where’s my feet? I can’t see my feet!”

I asked Harry why it was important for him to see his feet. He said, “That way I feel I am actually walking.” An immediate connection with the user’s own body made sense, since they were experiencing an all-encompassing, 360-degree, vertigo-like sensation.


Shay Gipson, Public Relations, Consulting, Marketing

Shay placed the VR headset on her head and yelled as she looked down. The vertigo seemed to have worked on her! After a few seconds, though, she reported motion sickness, which she attributed to the camera movement simulating the walking user’s perspective.


After reviewing the user testing data, I assessed which of the users’ suggestions to add. To simulate the user’s feet, there was no practical way to have feet walk as the user walked without some sensory apparatus (perhaps motion capture with markers), which was beyond the scope of the project and its technology. But since I had decided to make a 360-degree video, I could add an animation of feet on the tightrope as the user walked.

In addition, the video took 151 hours to render due to its massive size (typical of VR). That meant I had one chance to either include the animated camera movement of the user walking or leave it out. I feared that leaving it out would make the experience too static and humdrum, yet including it meant walking a fine line between realism and user sickness. So I reduced the motion of the walking animation to make it more subtle.

Heuristic Analysis and Conclusion

The UX principles of VR are still quite new, but here is how I applied them to this project:

  1. Degrees of Freedom: The DOF is typical for a project built on a Samsung Gear. The Gear only tracks the head along the three rotational axes, so movement is far more limited than on an Oculus Rift or a Vive, which offer 6DOF (six degrees of freedom: rotation plus positional movement) for full VR locomotion. Even after further research, I don’t know whether it was the processing speed of my project or the phone’s accelerometer that made movement so unreliable.

  2. Latency: Latency was poor in the interactive version built in Unity because of mobile’s limitations. There was a definite lag in the 360 experience, appearing as blackness at the edges as the user moved his or her head and the display tried to catch up. This problem was solved by switching to the 360-degree video.

  3. VR Presence: Probably the best part of this project, as shown by the user testing. Both users reported feeling a bit frightened, and even I, who designed and developed the project, was a bit scared.

  4. Screen Door Effect: Definitely present, and a major stumbling block in trying to refine the resolution. Pixelation was severely evident in the video despite a clear, clean render from Cinema 4D, and there is definite moiré patterning on the buildings. This was a limitation of the render time: to get the project out, anti-aliasing and resolution considerations had to be cut.

  5. Simulator Sickness: Apparent with Shay Gipson during testing. The latency of the interactive experience built in Unity contributed to it.