Various notes without a single topic, but all related to the possibilities of immersive art and text. CG and 360 possibilities are innumerable and exciting...but aren't there also untapped resources in the traditional 2D artforms?
What if someone like Virginia Fleck had a way to easily, and dramatically, present her works in VR?
I am just thinking out loud here...would love to hear your comments.
...some links related to this. Oh yes... the image used at the top of this post was "borrowed" from the blog of Skarred Ghost, who is doing some amazing things with VR and full body presence.
Here is a link to one of his superb posts: https://skarredghost.wordpress.com/2016/12/28/how-to-show-a-video-in-a-texture-with-unity-on-android/
Netflix home theatre in the Oculus. The excerpt below is from John Carmack's post on developing the Netflix app: http://techblog.netflix.com/2015/09/john-carmack-on-developing-netflix-app.html
The screens on the Gear VR supported phones are all 2560x1440 resolution, which is split in half to give each eye a 1280x1440 view that covers approximately 90 degrees of your field of view. If you have tried previous Oculus headsets, that is more than twice the pixel density of DK2, and four times the pixel density of DK1. That sounds like a pretty good resolution for videos until you consider that very few people want a TV screen to occupy a 90 degree field of view. Even quite large screens are usually placed far enough away to be about half of that in real life.
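A quick back-of-the-envelope check on those figures (the 90-degree field of view is approximate, so treat the results as rough estimates, not official specs):

```python
# Rough pixels-per-degree arithmetic from the Gear VR figures quoted above.
panel_w, panel_h = 2560, 1440
eye_w = panel_w // 2           # 1280 pixels per eye horizontally
fov_deg = 90                   # approximate horizontal FOV per eye
ppd = eye_w / fov_deg          # ~14.2 pixels per degree

# A virtual TV placed to fill ~45 degrees (half the headset FOV,
# roughly where large real-world screens sit) only gets:
tv_pixels = ppd * 45           # ~640 pixels across -- below 720p width
```

So even before distortion enters the picture, a comfortably placed virtual screen has fewer horizontal pixels available than a 720p video frame.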
The optics in the headset that magnify the image and allow your eyes to focus on it introduce both a significant spatial distortion and chromatic aberration that needs to be corrected. The distortion compresses the pixels together in the center and stretches them out towards the outside, which has the positive effect of giving a somewhat higher effective resolution in the middle where you tend to be looking, but it also means that there is no perfect resolution for content to be presented in. If you size it for the middle, it will need mip maps and waste pixels on the outside. If you size it for the outside, it will be stretched over multiple pixels in the center.
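To see why there is no single "correct" content resolution, here is a toy radial distortion-correction mapping (the coefficient k1 is made up for illustration; the real Oculus distortion profile is more complex):

```python
# Toy barrel-correction mapping: r' = r * (1 + k1 * r^2).
# Samples near the center stay close together (high effective
# resolution); samples toward the edge get spread apart.
def distort(r, k1=0.22):
    return r * (1 + k1 * r * r)

# Local magnification (numerical derivative): ~1.0 at the center,
# noticeably larger at the edge of the view.
def magnification(r, k1=0.22, eps=1e-6):
    return (distort(r + eps, k1) - distort(r - eps, k1)) / (2 * eps)
```

Because magnification varies continuously from center to edge, any fixed source resolution is oversampled in one region and undersampled in another, exactly the trade-off described above.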
For synthetic environments on mobile, we usually size our 3D renderings close to the outer range, about 1024x1024 pixels per eye, and let it be a little blurrier in the middle, because we care a lot about performance. On high end PC systems, even though the actual headset displays are lower resolution than Gear VR, sometimes higher resolution scenes are rendered to extract the maximum value from the display in the middle, even if the majority of the pixels wind up being blended together in a mip map for display.
The Netflix UI is built around a 1280x720 resolution image. If that was rendered to a giant virtual TV covering 60 degrees of your field of view in the 1024x1024 eye buffer, you would have a very poor quality image as you would only be seeing a quarter of the pixels. If you had mip maps it would be a blurry mess, otherwise all the text would be aliased fizzing in and out as your head made tiny movements each frame.
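The "quarter of the pixels" claim checks out with simple arithmetic (all figures taken from the paragraph above):

```python
# Why a 1280x720 UI on a 60-degree virtual TV looks poor when drawn
# into a 1024x1024-per-eye buffer covering ~90 degrees.
eye_buffer_px = 1024          # pixels per eye in the 3D render
eye_fov_deg = 90              # approximate per-eye field of view
tv_fov_deg = 60               # virtual TV coverage
ui_w, ui_h = 1280, 720

screen_px_w = eye_buffer_px * tv_fov_deg / eye_fov_deg   # ~683 pixels
linear_ratio = screen_px_w / ui_w                        # ~0.53 linearly
area_ratio = linear_ratio ** 2                           # ~0.28 by area
```

About half the source pixels survive in each dimension, so only roughly a quarter of the UI's pixels ever reach the eye buffer.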
The technique we use to get around this is to have special code for just the screen part of the view that can directly sample a single textured rectangle after the necessary distortion calculations have been done, and blend that with the conventional eye buffers. These are our "Time Warp Layers". This has limited flexibility, but it gives us the best possible quality for virtual screens (and also the panoramic cube maps in Oculus 360 Photos). If you have a joypad bound to the phone, you can toggle this feature on and off by pressing the start button. It makes an enormous difference for the UI, and is a solid improvement for the video content.
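Conceptually, a layer compositor like this works per display pixel; the sketch below is my own illustration of the idea, not Oculus's implementation, and every helper name in it (undistort, layer_quad, the samplers) is hypothetical:

```python
# Conceptual sketch of composition-layer sampling: each physical
# display pixel is mapped through the lens-distortion correction,
# and if the resulting view ray lands on the virtual screen, the
# layer texture is sampled directly at full source resolution,
# bypassing the lower-resolution eye buffer entirely.
def composite_pixel(display_uv, undistort, layer_quad,
                    sample_layer, sample_eye_buffer):
    view_dir = undistort(display_uv)       # apply distortion correction
    hit = layer_quad.intersect(view_dir)   # does the ray hit the screen?
    if hit is not None:
        return sample_layer(hit)           # one direct, full-res sample
    return sample_eye_buffer(view_dir)     # fall back to the 3D scene
```

The key point is that the layer texture is resampled only once, at display time, instead of first being squeezed into the eye buffer and then distorted again.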
Still, it is drawing a 1280 pixel wide UI over maybe 900 pixels on the screen, so something has to give. Because of the nature of the distortion, the middle of the screen winds up stretching the image slightly, and you can discern every single pixel in the UI. As you get towards the outer edges, and especially the corners, more and more of the UI pixels get blended together. Some of the Netflix UI layout is a little unfortunate for this; small text in the corners is definitely harder to read.
So forget 4K, or even full-HD. 720p HD is the highest resolution video you should even consider playing in a VR headset today.
Dragon Front Card Game
(Interesting in terms of card size/distance from viewer)
After just 10 minutes or so of tutorial playing, I was able to grasp the game's lengthy turn-based combat and try my hand at a real one-on-one fight with another human being in VR. For a card game, Dragon Front was an exhilarating experience that mixes the tense moments of high-level strategy play with the full-body escapism of VR. Yet after a few turns going back and forth, you start to completely forget that you're even playing a game with a headset on. The competition starts to feel as natural as a physical table-top experience, while the Rift just becomes an interface for your virtual showdown.
Dragon Front makes you forget you're playing a VR game
So Dragon Front may not be the most immersive VR title out there or one you could show your parents to convince them of the technology's potential. But it's certainly a unique rethinking of the VR approach, one that will most certainly catch on as headsets like the Rift start becoming a more common way to play a wide variety of games and not just first-person experiences.
Great piece by Vincent McCurley
From Vincent's article: Download the printable VR Storyboard template PDF via Dropbox or just grab the image below. (THANK YOU VINCENT!)