CDH Test 2

Project Summary

I made this.

Project Details

Congratulations on publishing your project!  Don't forget to set the permissions if you want to use it in AR or VR or share with others.  

 

PUBLISHING FOR AR/VR (XR) USING CATAPULT:

GLB assets are your friends!  While you can publish other file formats to Catapult for display in XR, including OBJ, FBX, and STL, GLB (single-file glTF) assets will usually work best.  This is because they are typically optimized for AR/VR with minimal visual compromise, and they can be hot-loaded into your project and device using Catapult.

You can save to GLB format directly from most 3D design and creative platforms (some source apps may require a plugin for file export).  Do you have a specific creative workflow in mind?  For more information on specific source formats, please contact support@makeSEA.com.
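As an illustration, if your workflow goes through Blender, a headless export from the command line might look like the following (a sketch only: myscene.blend and myscene.glb are placeholder names, and it assumes the glTF exporter add-on that ships with Blender is enabled):

blender -b myscene.blend --python-expr "import bpy; bpy.ops.export_scene.gltf(filepath='myscene.glb', export_format='GLB')"

The same export is available interactively in Blender via File > Export > glTF 2.0 with the format set to GLB.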

You can also use videos in XR in .mp4 format, including 360 surround and flat-screen videos as well as live streams.  We recommend 1080p, although you may get up to 4K video to play on some devices.  For surround video we recommend 1.5K video (3072x1536).  Bitrate matters:  4-8Mbps is usually good; higher rates may cause poor frame rates on some devices.  To point to external video sources and indexed livestreams, use Remote Assets to define a source.  See below for more information.

Catapult also supports images in .jpg and .png format, as well as PDF files and Word and Excel docs, for display in XR.

Supported file formats:

.glb (GLB/glTF) – Preferred for 3D and Digital Twin assets; max poly count varies by device, so start small (Oculus typically handles meshes of <1M polys well, HoloLens 1-2M polys, and Magic Leap MM polys).  See the quick poly-count check after this list.

.obj* (OBJ)

.fbx* (FBX)

.stl (STL)

.png (PNG)

.jpg (JPEG)

.mp4 (video; SD, HD, 360 surround**)

.pdf (PDF)

.doc* (Word)

.xls* (Excel)

 

*Limited translation; some content may not convert
**Up to 8K on Oculus, 1.5K on Magic Leap and HoloLens; recommended stream rate is 4-8Mbps, with the “faststart” flag set at encoding time.
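To gauge how heavy a GLB asset is before uploading, one option (assuming Node.js and the third-party glTF-Transform CLI, installed with npm install -g @gltf-transform/cli) is:

gltf-transform inspect myAsset.glb

The inspect report lists vertex and primitive counts per mesh, which you can compare against the per-device poly limits noted above; myAsset.glb is a placeholder name.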

 

Preparing Video Content for display with Catapult:

Use your favorite video production method and software, or for a quick and easy conversion tool see:

https://support.apple.com/downloads/quicktime

https://ffmpeg.org

https://obsproject.com

 

makeSEA highly recommends using FFmpeg to optimize any video for XR, regardless of the source:

For 720p (SD):
ffmpeg -i <input.mp4> -r 30 -b:v 4096k -vf scale=1280:720 -movflags faststart <output.mp4>

For 1080p (HD):
ffmpeg -i <input.mp4> -r 24 -b:v 4096k -vf scale=1920:1080 -movflags faststart <output.mp4>

For 360-Surround or UHD:
ffmpeg -i <input.mp4> -r 24 -b:v 4096k -vf scale=3072:1536 -movflags faststart <output.mp4>

 

Note that 3072x1536 is the maximum resolution for Magic Leap 1 devices; larger scales may work on Magic Leap 2 and Quest 2.  Maximum resolution determines whether the video will display at all on a given class of hardware device.  Bitrate impacts video quality and frame rate.  Cached videos may play more reliably at full bitrate over slow network connections.  The optimal maximum stream bitrate is 4Mbps for Magic Leap 1, and 8Mbps for Quest 2 and Magic Leap 2.  Livestream videos using Web streaming or WebRTC protocols will automatically choose the optimal bitrate and resolution for the network connection, preferring frame rate over quality.
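To confirm a file falls within these limits before uploading, you can inspect its resolution, frame rate, and bitrate with ffprobe (bundled with FFmpeg; myvideo.mp4 is a placeholder name):

ffprobe -v error -select_streams v:0 -show_entries stream=width,height,avg_frame_rate,bit_rate -of default=noprint_wrappers=1 myvideo.mp4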

 

Livestreams supplied in .m3u8 (HLS) and RTSP livestream formats are supported up to the resolution limits of each device and can be linked under the Remote Assets project section.
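If you need to produce an HLS (.m3u8) livestream yourself, one possible approach (a sketch only: the input, bitrates, segment settings, and output path below are placeholder values) is to segment with FFmpeg and serve the playlist from any web server:

ffmpeg -re -i source.mp4 -c:v libx264 -b:v 4096k -maxrate 4096k -bufsize 8192k -g 48 -c:a aac -f hls -hls_time 4 -hls_list_size 6 -hls_flags delete_segments /var/www/stream/live.m3u8

The resulting https URL of the .m3u8 playlist is what you would link as a Remote Asset.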

Flatscreen videos and livestreams are available for placement and display from the bookshelf.  360 surround videos are available from the Project menu while in the project.

 

Video tagging nomenclature:  put any of the following strings in your video title to tell Catapult how to handle the stream:

“360”:  assume the video should be used as a 360-surround video

“avatar”:  have the video canvas follow the local user (self), when playing

“greenscreen”:  chromakey/greenscreen trim the video being played

“autoplay”:  play the video as soon as it is loaded

For example, a video named “My Movie greenscreen avatar autoplay” will start as soon as it is loaded upon entering the scene, will have any green background trimmed, and will pivot to follow the viewer as they move around.

About chromakey/greenscreen video:  shooting your video in front of a green background is often sufficient, as long as the lighting of the green background is consistent.  However, it can help to use editing software such as iMovie to clean up the background so that it is a solid green, which will produce a cleaner trimmed effect.
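To preview roughly how a clip will key out before uploading, you can run a quick local test with FFmpeg's chromakey filter (a sketch only: the gray background, canvas size, and the similarity/blend values 0.12/0.08 are placeholder settings to tune for your footage):

ffmpeg -i greenscreen.mp4 -f lavfi -i color=c=gray:s=1280x720 -filter_complex "[0:v]chromakey=green:0.12:0.08[fg];[1:v][fg]overlay=shortest=1" keypreview.mp4

Note that Catapult performs the actual trimming at display time; this preview only helps you judge whether the background is clean enough.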

Saving and Recalling Scenes:  If you are a Project owner or Team Member, and are Master of Ceremonies (MC) while in the scene, you can save the layout and placement of assets.  An opportunity to save is presented upon exit from the scene.  Or, you can save and recall the scene at any time from the General Settings menu.  Saved scene layouts are recalled automatically upon reentry into a saved scene.

 

For complete instructions see: 

https://www.makeSEA.com/how-to

