360 VR video showcasing

Discussion in 'Mapping Questions & Discussion' started by AI_, Jan 13, 2019 at 9:14 AM.

  1. AI_

    AI_ L1: Registered

    Hi there, I am the owner and developer of Jump Academy, and I am currently working on a project that streamlines map showcasing in 360 VR. The motivation was to use this to showcase jump maps and individual jumps for informative purposes. However, I figured it may be useful for the Source mapping community in general.

    Here is an example of the output so far (tested in CS:GO, since its generally higher texture quality makes qualitative evaluation easier):


    View: https://www.youtube.com/watch?v=PVEXCNy70eU


    Note: there is still some work needed on the stitching so the sides of the cube map blend together better.

    This is all done with freely available software (SourceMod, Python, OpenCV, FFmpeg), so there is no need for commercial software like Adobe Premiere. The recording can be done in-game, so there is no need to port the map to run in Source Filmmaker either.

    Please let me know if you are interested, and I will keep you posted on development and releases.
     
  2. Kobolite

    aa Kobolite Your local dutch person

    ooh i really like these kinds of videos so i'm interested
     
  3. MegapiemanPHD

    aa MegapiemanPHD Doctorate in Deliciousness

  4. AI_

    AI_ L1: Registered

    The hardest part is probably getting the prerequisites installed and working. It needs Metamod and SourceMod running on the local game server, and Python libraries have to be installed for communicating with the local server and, later on, for the image processing.

    After that, this is the procedure so far:

    1. Create the camera paths in-game using the SourceMod plugin, via an in-game menu and drag-and-drop path nodes. This can take around 10-15 minutes per map depending on pickiness, and the path can be replayed live. There is also an option to have the camera point in the direction it is traveling along the path, for those who just want a regular fly-through video without the VR (a sketch of that angle calculation follows this list):


    View: https://www.youtube.com/watch?v=DCdXzvN5rGU


    2. Run the Python recording script while the game is running the map. It talks to the local server and automatically starts/stops recording (using the hotkey for your video recorder, e.g. FRAPS or Nvidia ShadowPlay) for each of the 6 view angles needed for the cube map. The SourceMod plugin controls the player camera during this whole process. This outputs 6 video files (see the recording outline below).

    3. Run the Python stitching script with these 6 files as input. Currently it outputs all the frames as PNG images after converting them to the equirectangular projection that most VR viewers and editing software understand (see the stitching sketch below). It runs at about 2 seconds per frame on the CPU at 4K output resolution on my rig with an Intel i7-3770K clocked at 4.3 GHz.

    4. Run FFmpeg with the frames as input to output the final video at the desired frame rate and bit rate (see the encode example below).
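
    For the direction-of-travel option in step 1, the idea is just to derive view angles from consecutive path nodes. This is a minimal Python sketch, not the actual plugin code; the node format and the Source angle convention (positive pitch = looking down) are my assumptions here:

    import math

    def angles_along_path(p0, p1):
        """Return (pitch, yaw) in degrees for a camera at p0 looking toward p1."""
        dx, dy, dz = p1[0] - p0[0], p1[1] - p0[1], p1[2] - p0[2]
        yaw = math.degrees(math.atan2(dy, dx))
        pitch = -math.degrees(math.atan2(dz, math.hypot(dx, dy)))  # negative pitch = looking up
        return pitch, yaw

    # Example: a node 100 units ahead and 50 units up gives a slight upward pitch
    print(angles_along_path((0, 0, 0), (100, 0, 50)))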
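
    The recording pass in step 2 is basically a loop over the 6 cube faces. How the script talks to the plugin is not shown here; send_to_plugin() and the plugin commands below are hypothetical placeholders, and the hotkey press assumes a recorder bound to F9 and the pyautogui library:

    import time
    import pyautogui

    CUBE_FACES = ["front", "right", "back", "left", "up", "down"]

    def send_to_plugin(command):
        # Placeholder: replace with however the SourceMod plugin actually listens (socket, RCON, etc.)
        print("would send to plugin:", command)

    def record_cube_faces(path_duration_s):
        for face in CUBE_FACES:
            send_to_plugin("vr_face " + face)   # hypothetical plugin command: lock the view to this face
            pyautogui.press("f9")               # start the external recorder
            send_to_plugin("vr_replay_path")    # hypothetical: replay the camera path
            time.sleep(path_duration_s)
            pyautogui.press("f9")               # stop the external recorder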
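
    The stitching in step 3 is a cube map to equirectangular conversion. Below is a minimal nearest-neighbor sketch of that mapping in NumPy (the actual script may sample differently, and the face orientation conventions here are assumptions that may need flipping to match the in-game captures):

    import numpy as np

    def cube_to_equirect(faces, out_h):
        # faces: dict of 6 equally sized square images keyed "front", "right", "back", "left", "up", "down"
        out_w = 2 * out_h
        lon = (np.arange(out_w) + 0.5) / out_w * 2 * np.pi - np.pi
        lat = np.pi / 2 - (np.arange(out_h) + 0.5) / out_h * np.pi
        lon, lat = np.meshgrid(lon, lat)
        # Unit direction vector for every output pixel
        x = np.cos(lat) * np.cos(lon)
        y = np.cos(lat) * np.sin(lon)
        z = np.sin(lat)
        ax = np.maximum(np.abs(x), 1e-9)
        ay = np.maximum(np.abs(y), 1e-9)
        az = np.maximum(np.abs(z), 1e-9)
        size = next(iter(faces.values())).shape[0]
        out = np.zeros((out_h, out_w, 3), dtype=np.uint8)
        # (mask, face, u, v): which face each ray hits and where on that face, in [-1, 1]
        hits = [
            ((ax >= ay) & (ax >= az) & (x > 0),  "front", -y / ax, -z / ax),
            ((ax >= ay) & (ax >= az) & (x <= 0), "back",   y / ax, -z / ax),
            ((ay > ax) & (ay >= az) & (y > 0),   "left",   x / ay, -z / ay),
            ((ay > ax) & (ay >= az) & (y <= 0),  "right", -x / ay, -z / ay),
            ((az > ax) & (az > ay) & (z > 0),    "up",    -y / az,  x / az),
            ((az > ax) & (az > ay) & (z <= 0),   "down",  -y / az, -x / az),
        ]
        for mask, face, u, v in hits:
            px = np.clip(((u[mask] + 1) / 2 * size).astype(int), 0, size - 1)
            py = np.clip(((v[mask] + 1) / 2 * size).astype(int), 0, size - 1)
            out[mask] = faces[face][py, px]
        return out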
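
    And step 4 is just a standard FFmpeg encode over the numbered frames; the filename pattern, frame rate, and bit rate below are placeholders:

    import subprocess

    subprocess.run([
        "ffmpeg",
        "-framerate", "30",        # desired output frame rate
        "-i", "frames/%06d.png",   # PNG frames from the stitching step
        "-c:v", "libx264",
        "-b:v", "40M",             # target bit rate
        "-pix_fmt", "yuv420p",     # for wide player compatibility
        "output.mp4",
    ], check=True)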
     
  5. AI_

    AI_ L1: Registered

    You can also skip the video recording process entirely if you just want to process a single frame from screenshots of the 6 faces. This would be useful if this site adds support for VR previews in the browser without going through YouTube; there should be public JS libraries for displaying a static equirectangular image in VR.
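
    Reusing the cube_to_equirect sketch from my previous post, a single-frame version would look something like this (the screenshot filenames are assumptions):

    import cv2

    FACE_NAMES = ["front", "right", "back", "left", "up", "down"]
    # Assumes one screenshot per face, named front.png, right.png, etc.
    faces = {name: cv2.imread(name + ".png") for name in FACE_NAMES}
    cv2.imwrite("preview_equirect.png", cube_to_equirect(faces, 2048))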