Grabbing and docking
From the actions menu in the VR workbench you have the option to make an object "grabbable". A grabbable object is an active object
in the VR experience that the user can hold, move, and rotate in the scene; it is the most basic form of interacting
with an object in SimLab VR.
Objects marked as grabbable are also transformable in the scene building mode during the VR session, where
you can move, rotate, scale, copy, and delete objects in the scene.
From the same menu you can also create a grabbable sequence, a type of constrained object animation in VR. Whereas a regular
grabbable object can be moved freely by the user, a grabbable sequence follows a pre-designed motion path. A typical
application is the motion of a lever or a control stick: the user can grab the lever and move it to any degree,
as long as the movement falls within the lever's allowed motion path.
Docking can be considered an extension of the grabbable object function; in a docking system, however, the grabbable object is
paired with a static object that serves as its docking station.
In a docking system the grabbable object can be moved and rotated freely by the user, and when released it remains in its last position,
just like a regular grabbable object. However, if the grabbable object is released while intersecting its designated docking
station, it assumes a position and rotation predetermined by the docking system.
A simple example of a docking system is placing a bolt in its designated hole. With a docking system you do not have to position
the bolt accurately: once the bolt is released in the vicinity of the hole, it is automatically positioned inside the hole
according to the predefined docking position.
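The release logic described above can be sketched in a few lines. This is an illustrative sketch, not the SimLab API: when a grabbable object is released while its bounds intersect the docking station's bounds, it snaps to the predefined docked transform; otherwise it stays where it was dropped. All class and function names here are invented for the example.

```python
# Illustrative sketch of docking-on-release logic (not SimLab code).
from dataclasses import dataclass


@dataclass
class AABB:
    """Axis-aligned bounding box used for the intersection test."""
    min_x: float
    min_y: float
    min_z: float
    max_x: float
    max_y: float
    max_z: float

    def intersects(self, other: "AABB") -> bool:
        return (self.min_x <= other.max_x and self.max_x >= other.min_x and
                self.min_y <= other.max_y and self.max_y >= other.min_y and
                self.min_z <= other.max_z and self.max_z >= other.min_z)


@dataclass
class Transform:
    position: tuple
    rotation: tuple  # Euler angles in degrees


def on_release(obj_bounds, obj_transform, dock_bounds, dock_transform):
    """Return the transform the object should take after the user lets go."""
    if obj_bounds.intersects(dock_bounds):
        return dock_transform   # snap to the predefined docked pose
    return obj_transform        # otherwise stay at the drop position
```

For the bolt example, releasing the bolt anywhere whose bounds overlap the hole's bounds yields the docked pose; releasing it far away leaves it in place.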
Scene Building Mode
The Scene Building Mode in Virtual Reality allows users to modify objects in the scene: they can move objects around,
rotate them, and scale them.
In addition to these basic transformations, the user can hide and show elements in the scene,
and duplicate or delete objects.
To make objects editable in the scene building mode, they have to be set as grabbable objects using the "Make Grabbable"
tool in SimLab Composer.
The scene building mode is well suited to interior designers, who can rearrange the scene during the VR session without
going back to SimLab Composer.
The assembly/disassembly tool allows the user to build a system that defines relations between grabbable objects.
It is similar to the docking system, but it can involve several objects rather than only two.
In an assembly system the user builds a hierarchy that dictates the order in which objects can be grabbed.
Since the components can only be grabbed in that order, and not all at once, the assembly system is useful for
building assemblies that mimic successive construction processes or assembly stations.
An example would be a system that lets the user assemble a chair design by following the building steps in a set order,
placing and securing each component of the chair.
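The ordering rule behind such an assembly hierarchy can be sketched as follows. This is a hypothetical illustration, not SimLab's implementation: only the next unplaced component in the required order is grabbable at any moment.

```python
# Hypothetical sketch of the assembly-order rule (not SimLab code).
def grabbable_components(order, placed):
    """order: component names in the required assembly order.
    placed: set of components already docked in place.
    Returns the components the user may grab right now."""
    available = []
    for name in order:
        if name in placed:
            continue
        available.append(name)
        break  # only the next unplaced step is unlocked
    return available
```

For the chair example, nothing after the seat can be grabbed until the seat is placed, then the legs, and so on.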
From the object's attributes panel, the user can specify an action to be triggered upon interacting with that object. These are the VR functional actions that can be attached to an object:
- URL: upon interacting with the object, a web browser appears displaying either a specified web page or an HTML document chosen by the VR designer.
- Scene State: a specified scene state, which can contain transformation, material, or visibility information, is applied once the object is activated.
- Play Sequence: upon interaction, an animation sequence plays; the sequence can be set to play in a loop.
- Play Sound: a selected audio file plays upon interacting with the object.
- Message Box: a message box with a title and a body appears upon activating the object.
- Multiple Actions: several of the previously mentioned actions can be triggered upon interacting with the object, either consecutively or simultaneously.
An example of using actions is attaching multiple scene states, each with a different material, to an object; when the user interacts with the object, the scene states are applied successively, showing different options for the product's material.
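The material-cycling example above boils down to stepping through a list of scene states on each interaction. The sketch below is purely illustrative; the class and method names are assumptions, not SimLab identifiers.

```python
# Illustrative sketch of cycling scene states on interaction (not SimLab code).
class SceneStateCycler:
    def __init__(self, scene_states):
        self.scene_states = scene_states
        self.index = 0

    def on_interact(self):
        """Return the next scene state; in the real tool it would be applied."""
        state = self.scene_states[self.index]
        self.index = (self.index + 1) % len(self.scene_states)
        return state
```

Each interaction shows the next material option, wrapping back to the first after the last.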
Visualize Scene Options
The Visualize Scene Options tool allows you to attach scene states and animation sequences to an object with a more appealing
visualization than using actions on the object directly.
The animation sequences and scene states are represented by 3D entities in the VR space and are accessible by activating one
or more pins that appear inside the Visualize Scene Options mode in VR.
The designer can attach as many scene states and animation sequences as desired to a single pin, and can choose between
different visual representations for each of the pin's attached options.
The visual representation types for an option list are:
- Label: a preview image or text is displayed to represent the animation sequence or scene state.
- 3D Model: a 3D model from the scene is displayed as the option's representative.
- Material: a specified shape with a material appears, representing the materials captured by the attached scene states.
Using the pin system to display options is more practical and appealing than the direct actions method when there are many
options to display, and since it is confined to a separate mode, it does not conflict with actions
attached to the same object.
A practical example of the Visualize Scene Options tool is creating a list of five scene states, each containing a different
living room couch design, which the user can display, examine, and then choose to place in the scene.
VR Events system
The VR Events system in SimLab Composer allows you to map responses that are triggered upon the occurrence of certain events.
The types of events that SimLab VR can monitor and react to are:
- Object enters object : when a specified object's geometry intersects with another object's geometry.
- Object exits object : when a specified object's geometry leaves a specified intersecting geometry.
- Scene State applied : when a certain scene state in the scene is applied.
- Sequence Ended : when an animation sequence that is playing comes to an end.
- Variable condition changed : when the value of a variable is set to a specific value.
- Scene start : once the VR scene is loaded.
The aforementioned events can be mapped to trigger one or more of the following responses:
- Change action for object : attach a specific action to an object, which will perform that action when triggered.
- Apply scene state : apply a specific scene state when the attached event occurs.
- Change variable value : change the value of a specific variable.
- Play sequence : play an animation sequence.
- Play sound : play an audio file.
- Change grabbable state : set an object to be grabbable, or remove the grabbable attribute to make it static.
This advanced system allows the user to connect multiple responses to single or multiple events, which can lead to a fully
harmonious, connected environment. Applications of the Events system range from a simple system for interacting with a cabinet
or a desk to an intricate, complete VR mechanical training experience.
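The event-to-response mapping described above can be sketched as a small dispatch table. Event names and the API here are assumptions made for illustration; they are not SimLab identifiers.

```python
# Minimal sketch of an event -> responses mapping (not SimLab code).
from collections import defaultdict


class EventSystem:
    def __init__(self):
        self.handlers = defaultdict(list)

    def map_event(self, event, response):
        """Attach one more response to an event; an event may have many."""
        self.handlers[event].append(response)

    def fire(self, event):
        """Run every response mapped to the event, in registration order."""
        return [response() for response in self.handlers[event]]
```

For instance, "object enters object" could be mapped to both playing a sound and applying a scene state, and both run when the event fires.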
Light Baking
Light baking is a technique that stores light occlusion and shadow values in an additional texture layer.
Generating and using light-baked textures provides top-quality visualization on mobile viewers that cannot perform those calculations in real time.
Light baking can also be used with desktop and VR viewers to provide high-quality shadows and lighting without spending precious CPU power on calculations.
Once you create a light-baked map for a model, the effect is preserved and is not affected by changing the material or texture applied to the object, so the user is free to adjust or modify object materials without regenerating the light baking maps.
360 Rendering and HDR
A 360, or panorama, image is a special type of image that allows the user to experience being inside a 3D scene.
The user can view the scene on mobile or desktop, or share it on Facebook and other viewers.
Creating a 360 image in SimLab Composer is very simple: create a VR camera, place it in the scene,
then select the VR camera and render a 360 image.
You can also place more than one VR camera in the scene and render multiple 360 images from different positions,
creating a 360 Grid.
360 Grid technology is a smart and simple way to navigate a whole design: multiple 360 rendered images connected
to each other allow you to move between them, ensuring full coverage of the design.
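One way to think of a 360 grid is as a graph of camera positions, where the viewer moves to the connected camera nearest the spot the user wants to reach. The sketch below is a hypothetical illustration; the graph layout and function are not part of SimLab.

```python
# Hypothetical sketch of choosing the next panorama in a 360 grid.
import math


def nearest_connected_camera(current, target_point, graph, positions):
    """Pick the camera connected to `current` that is closest to the
    point the user wants to reach.

    graph: camera name -> list of connected camera names
    positions: camera name -> (x, y, z) position in the scene
    """
    best, best_dist = None, float("inf")
    for cam in graph[current]:
        d = math.dist(positions[cam], target_point)
        if d < best_dist:
            best, best_dist = cam, d
    return best
```

With cameras connected room to room, the user hops through the design one panorama at a time, covering the whole model.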
Using VR cameras, the user can also create HDR (high dynamic range) images, also known as environment images, which are one
of the preferred methods of lighting a 3D scene for rendering.
Traditionally, HDR images are generated using special types of cameras or specific software, methods that are costly and
limit your freedom to distribute the created images due to licensing issues. SimLab Composer offers this feature, among
countless others, at reduced cost and with no constraints on distributing the HDR images you create.
VR Navigation
There are three ways to navigate a scene in a VR experience while using a VR headset:
- In-room navigation (as supported by HTC): suitable for navigating small areas (up to 3 meters);
the user simply moves within the designated VR area to move in the 3D scene.
- On-ground teleportation: suitable for navigating medium-sized buildings (up to 20 meters);
the user picks the point on the ground to be teleported to.
- Large scene navigation: the designer adds 360 cameras to the scene, and the user can move from the
current location to the location of any 360 camera. This method is suitable for navigating
large models like schools, hospitals, malls, or large building complexes.
When running the VR experience on a mobile device or in desktop mode, the available navigation methods are:
- Walking Mode: the user moves around the scene as a human character would, walking on surfaces
but unable to move vertically, though large scene navigation remains accessible.
- Flying Mode: the user can levitate above the ground and move freely around the scene without any
gravitational pull, as though flying. This mode is suitable for exploring large exterior scenes.
- Mechanical Mode: the user picks a point or an object in the scene as the camera's focus point;
once set, the camera rotates around this point, letting the user examine components and focus on specific objects in the scene.
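The camera math behind an orbit-style mode like Mechanical Mode can be sketched briefly. This is a generic illustration of orbiting a focus point, with parameter names invented for the example; it is not SimLab's implementation.

```python
# Generic sketch of an orbit camera around a focus point (not SimLab code).
import math


def orbit_position(focus, radius, yaw_deg, height=0.0):
    """Camera position after rotating `yaw_deg` degrees around `focus`
    on a horizontal circle of the given radius."""
    yaw = math.radians(yaw_deg)
    fx, fy, fz = focus
    return (fx + radius * math.cos(yaw),
            fy + height,
            fz + radius * math.sin(yaw))
```

Sweeping the yaw angle over time moves the camera in a circle around the focused component while it stays centered in view.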
Visual Effects
SimLab Composer allows you to add visual elements that enhance the scene's appearance and add a touch of realism to static scenes.
Through the Visual Effects menu available in the VR workbench you can add a fire effect. The fire effect comes in three types,
mimicking a blazing fire, a calm fire, and a candle flame. You can adjust the fire's color through the attributes panel, as
well as the smoke's color, and the smoke can be enabled or disabled.
In addition to fire, you can add a smoke effect, which comes in two types: the default ascending smoke and an area smoke.
The smoke's color can also be adjusted from the attributes panel in SimLab Composer.
SimLab Composer also allows you to convert surfaces to water surfaces, so instead of a flat surface with a water material
you can have dynamic water with moving waves.
You can adjust the water's color and density, and set the wave speed to mimic a range of real-life water surfaces.
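A moving-wave surface of this kind is commonly driven by a travelling sine wave displacing the surface over time. The sketch below illustrates that general idea; the amplitude, wavelength, and speed parameters are assumptions for the example, not SimLab settings.

```python
# Generic travelling-wave height function for a water surface (illustrative).
import math


def wave_height(x, t, amplitude=0.1, wavelength=2.0, speed=1.0):
    """Height offset of the water surface at position x and time t."""
    k = 2 * math.pi / wavelength          # spatial frequency
    return amplitude * math.sin(k * (x - speed * t))
```

Increasing `speed` makes the wave crests travel faster across the surface, which is the effect the adjustable wave speed produces.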
Through the environment settings in the VR workbench you can add a fog effect to the scene. By adjusting the fog's color and
density you can create a range of environmental effects, such as fog, haze, dust, or even a steamy atmosphere.
Night mode is also accessible from the environment settings. When switched on, night mode replaces the blue sky with a
starry sky and dims the light in the scene, simulating a moonlit atmosphere.
SimLab Composer allows you to convert any surface in the scene into an interactive video player using the "Make Video"
tool in the Visual Effects menu.
From the Video's attributes panel you specify which video to play by browsing your computer and selecting a video file.
You can set the video to be interactive, so the user can pause and play it during the VR experience. You can also choose
to play the video when the VR experience starts, loop the video, or play it without sound by toggling the mute option.
Embedding video files within the virtual reality experience is very useful when designing VR scenes with informative content,
such as VR training, virtual guides, and tours.
Clipping Planes
Clipping planes, also known as section planes, are handy tools that allow the user to dissect a 3D
object and view its inner components.
In architectural design you can cut a building horizontally to see the floor plan; this two-dimensional
top-down cut is called a planimetric view. Cutting a building vertically to see inside the rooms of several floors at once
is called a sectional view.
In mechanical or product design, clipping planes let you see the interior components of the design.
Since you can determine which objects get clipped, you can clip the outer shell of a design with the plane while keeping
the inside parts fully visible.
Clipping planes can be animated and attached to an action as an animation sequence or a grabbable sequence. This lets you
create multiple animated presets to use during the VR experience, each focusing on certain components of the design.
A clipping plane can also be converted into a grabbable object, which the user can hold and move to control the angle and
distance at which the clipping occurs.
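The test a clipping plane performs is a standard signed-distance check, with a per-object flag to exempt inner components from clipping. The sketch below illustrates that general technique; it is not SimLab code, and the parameter names are invented.

```python
# Generic clipping-plane test with a per-object opt-out (illustrative).
def is_clipped(point, plane_point, plane_normal, object_clippable=True):
    """True if `point` should be hidden by the clipping plane."""
    if not object_clippable:
        return False   # e.g. inner components kept fully visible
    px, py, pz = point
    ox, oy, oz = plane_point
    nx, ny, nz = plane_normal
    # Signed distance from the plane; the positive side gets clipped.
    return (px - ox) * nx + (py - oy) * ny + (pz - oz) * nz > 0
```

Marking the outer shell as clippable and the interior parts as non-clippable reproduces the behavior described above: the shell is cut away while the inside stays visible.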