Crystal Vibes Finally Released on Steam! Download now.

Download the 3D version now on SteamVR! https://store.steampowered.com/app/623040/Crystal_Vibes_feat_Ott/

Experience body vibrations of candy-colored psychedelic sound rippling through an endless crystal universe. Crystal Vibes utilizes cutting-edge spatial 3D audio, full-body 'Synesthesia Suit' vibro-tactile stimulation, and sound visualization that maps sound and light based on the science of the human senses, to push the frontiers of technology-mediated sensory experience in virtual reality. With the project's predecessor described as “transcendent” and “like traveling through a psychedelic kaleidoscope” (Forbes 2016), this piece ups the ante with music from producer Ott and is all new for Sundance Film Festival's New Frontier 2017.

Procedural Audio in Unity - Noise and Tone

Procedural audio is audio created in code at run time, rather than played back from a WAV or MP3 sound file.  It is very useful for creating audio that reacts to the situation and does not sound repetitive, which is difficult to achieve with prerecorded sound files.

To use the following script, create a GameObject in Unity and add an AudioSource to it.  Then add the following C# script to the GameObject.  The script will automatically add an AudioLowPassFilter to the GameObject.

Press play and listen to the procedural audio!  You can play around with the public variables on this script.  The frequency is set to 440 Hz (an A tone); change this to change the pitch.  The amount of noise relative to the tone volume is set by noiseRatio.  You can change the character of the noise using the settings on the AudioLowPassFilter.

//By Benjamin Outram
//C# script for use in Unity

// Create a new file named ProceduralAudio.cs and copy and paste this into it.
// Attach the script to a GameObject that has an AudioSource on it.
// Adjust the public variables to experiment.

using UnityEngine;

[RequireComponent(typeof(AudioSource))]
[RequireComponent(typeof(AudioLowPassFilter))]
public class ProceduralAudio : MonoBehaviour
{
    // Overwritten in Awake with the actual output sample rate.
    private float sampling_frequency = 48000;

    // Mix between noise (1) and tone (0).
    [Range(0f, 1f)]
    public float noiseRatio = 0.5f;

    // For the noise part.
    [Range(-1f, 1f)]
    public float offset;

    public float cutoffOn = 800;
    public float cutoffOff = 100;
    public bool cutOff;

    // For the tonal part.
    public float frequency = 440f;
    public float gain = 0.05f;

    private float increment;
    private float phase;

    System.Random rand = new System.Random();
    AudioLowPassFilter lowPassFilter;

    void Awake()
    {
        sampling_frequency = AudioSettings.outputSampleRate;
        lowPassFilter = GetComponent<AudioLowPassFilter>();
        Update();  // apply the initial cutoff frequency
    }

    void OnAudioFilterRead(float[] data, int channels)
    {
        // Update the increment in case the frequency has changed.
        increment = frequency * 2f * Mathf.PI / sampling_frequency;

        for (int i = 0; i < data.Length; i += channels)
        {
            // Noise part.
            float noisePart = noiseRatio * (float)(rand.NextDouble() * 2.0 - 1.0 + offset);

            // Advance the oscillator phase, wrapping by subtraction to
            // avoid overflow without introducing a discontinuity.
            phase += increment;
            if (phase > 2f * Mathf.PI) phase -= 2f * Mathf.PI;

            // Tonal part.
            float tonalPart = (1f - noiseRatio) * gain * Mathf.Sin(phase);

            data[i] = noisePart + tonalPart;

            // Copy the mono sample to any additional channels.
            for (int c = 1; c < channels; c++)
                data[i + c] = data[i];
        }
    }

    void Update()
    {
        lowPassFilter.cutoffFrequency = cutOff ? cutoffOn : cutoffOff;
    }
}
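To see what the audio callback is doing outside of Unity, here is a minimal Python sketch of the same noise-plus-tone mixing math (the function name and defaults are my own, chosen for illustration):

```python
import math
import random

def generate_samples(n, frequency=440.0, sample_rate=48000.0,
                     gain=0.05, noise_ratio=0.5, offset=0.0):
    """Mix white noise with a sine tone, mirroring OnAudioFilterRead above."""
    increment = frequency * 2.0 * math.pi / sample_rate
    phase = 0.0
    samples = []
    for _ in range(n):
        # Noise part: uniform in [-1, 1], shifted by the DC offset.
        noise = noise_ratio * (random.random() * 2.0 - 1.0 + offset)
        # Advance and wrap the oscillator phase.
        phase += increment
        if phase > 2.0 * math.pi:
            phase -= 2.0 * math.pi
        # Tonal part, cross-faded against the noise by noise_ratio.
        tone = (1.0 - noise_ratio) * gain * math.sin(phase)
        samples.append(noise + tone)
    return samples
```

With noise_ratio at 0 you get a pure 440 Hz sine at amplitude gain; at 1 you get pure noise, with everything in between a crossfade of the two.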

Virtual Reality Reading UIs

Virtual Reality (VR) devices have increasingly sparked both commercial and academic interest. While applications range from immersive games to real-world simulations, little attention has been given to the display of text in virtual environments. Since reading remains a crucial activity for consuming information in both the real and digital world, we set out to investigate user interfaces for reading in VR. To explore comfortable reading settings, we conducted a user study with 18 participants focusing on parameters such as text size, convergence, and view-box dimensions and positioning. This paper presents the first step in our work towards guidelines for effectively displaying text in VR.

Read full text here!

Liquid Crystal Music Video With Max Cooper

Max Cooper is one of my favourite artists in the world.  He takes electronic music to fine-art status and brings in many ideas from physics and science.

I was fortunate enough to collaborate with him on a music video for his track – “Music of the Tides” (Out now on Balance 030 - smarturl.it/Balance030MaxCooper)

The footage shows many liquid crystal phase transitions, including isotropic, nematic, cholesteric, columnar, smectic A, smectic C, twist-grain-boundary, and crystal phases of matter.

You can purchase special edition prints from Max Cooper's online store here, or a wider selection from my shop here.

Recently featured on Vice here

Evolutionary Algorithm with Neural Networks for Modelling Behaviour

Below are the results of some of my tinkering with machine learning and evolutionary algorithms with neural networks.  The first video shows the network learning to get better at moving around in the space while not hitting itself, while the second is more advanced and shows a predator-prey scenario with co-evolving behaviours.  More information is in the video descriptions.

You can download a unity project and run the first one for yourself from my GitHub repository.

Neutrino at Dubai International Film Festival

Neutrino went down well at DIFF 2017 with over 100 people trying the demo. In addition, I was with Tanner Person demoing Flow Zone (Tanner Person, Benjamin Outram, Youssef Bouzarte), a poi-based virtual reality flow toy designed to induce states of flow, creativity and freedom.

Neutrino is a completely new kind of virtual reality experience: Dance Dance Revolution meets juggling for a rhythm-action experience like never before. As you pass through the levels, you learn more and more complicated juggling tricks. This demo is being exhibited at Dubai International Film Festival this week.

Time control in virtual reality

This demo shows moving the controller through space to control time in the virtual environment. This kind of instant replay could be very useful for a variety of applications, and gives the user a feeling of having extraordinary control over their environment.

You can download the Unity project (Unity 5.6.1f1) on my GitHub here.

Crystal Vibes officially selected for Kaleidoscope Showcase Vol.2!

Crystal Vibes feat Ott. has been officially selected by Kaleidoscope for exhibition at Kaleidoscope Showcase Vol.2!  Check out the Crystal Vibes project page at Kaleidoscope here. Crystal Vibes will be exhibited alongside brilliant projects such as Chocolate, Dear Angelica, Mind Show, and The Life of Us.  You can see the full line up here!

For more about Crystal Vibes go to the Crystal Vibes main page here!

For more of the brilliant psychedelic dub music producer Ott, visit his website here!


Crystal Vibes Prints Now Available to Purchase!

Crystal Vibes is a psychedelic, synesthetic, full-body-haptic virtual reality experience that I am presenting at Sundance Film Festival 2017.  Please see here for more information and video of Crystal Vibes!

Prints of music-visualising fractal art are now available at my new store, Unfurl Media.  Please check them out!

Unity GPU Instancing: Unlit Instanced Shader

Unity GPU instancing allows you to duplicate meshes without much CPU overhead, which means you can render more cubes, trees, fish, fractal geometries, or whatever else you can dream up!

You can read more in the Unity 5.5 documentation on GPU Instancing.

To instance a cube, for example, create a cube GameObject, then create another empty GameObject whose transform we will use for the instanced copy.

For instancing to work, you have to use an instanced shader on the material of your cube.  To create a new instanced shader, go to Create => Shader => Standard Surface Shader (Instanced).

The Create menu is found below the Project tab.

Use this shader (or another Instanced shader) on the material of the object you want to instance.  Then add the following script to the cube:

using UnityEngine;
using UnityEngine.Rendering;

public class InstanceMesh : MonoBehaviour {

    // Attach this script to the object you want to instance, such as a cube.
    // It should have a MeshFilter and a MeshRenderer on it.

    // Make an empty GameObject and drag it into the obj variable in the
    // Inspector.  Its transform will be used for the instanced copy.
    public GameObject obj;

    public bool turnOnInstance = true;

    Mesh mesh;
    Material mat;
    Matrix4x4[] matrices;
    ShadowCastingMode castShadows = ShadowCastingMode.On;

    void Start () {
        // Cache the mesh and material once; they do not change per frame.
        mesh = GetComponent<MeshFilter> ().mesh;
        mat = GetComponent<Renderer> ().material;
    }

    void Update () {

        if (turnOnInstance) {

            // Rebuild the transforms each frame so that changes to either
            // object's position, rotation or scale are picked up.
            matrices = new Matrix4x4[2] { obj.transform.localToWorldMatrix,
                                          transform.localToWorldMatrix };

            // DrawMeshInstanced only draws for one frame, so it must be
            // called every frame from Update.
            Graphics.DrawMeshInstanced (mesh, 0, mat, matrices, matrices.Length,
                                        null, castShadows, true, 0, null);
        }
    }
}

Drag the empty GameObject you created into the "obj" public variable on this script in the Inspector.  Now hit play!  The cube should be copied using the transform of the empty GameObject.  Try modifying the position, rotation and scale of the empty GameObject at runtime to see the instanced cube change too, as in the screenshot below.

The cube on the right has been instanced and copied based on the empty GameObject's transform.

Lighting problem

One annoying limitation is that instanced meshes do not work well with lighting, for example point-source lighting.  Only a single directional light seems to work on instanced meshes (please correct me if I am wrong!).

Directional light looks fine...

Point light only lights the original cube...

Using Unlit Instanced Shader

One way around this is to skip lighting and instance unlit meshes instead. Unfortunately, there is no Standard Unlit Shader (Instanced) option in the Create menu, so I quickly made one:

Shader "Instanced/Unlit"
{
    Properties
    {
        _MainTex ("Texture", 2D) = "white" {}
    }
    SubShader
    {
        Tags { "RenderType"="Opaque" }
        LOD 100

        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            // make fog work
            #pragma multi_compile_fog

            #pragma multi_compile_instancing

            
            #include "UnityCG.cginc"

            struct appdata
            {
                float4 vertex : POSITION;
                float2 uv : TEXCOORD0;
                UNITY_VERTEX_INPUT_INSTANCE_ID
            };

            struct v2f
            {
                float2 uv : TEXCOORD0;
                UNITY_FOG_COORDS(1)
                float4 vertex : SV_POSITION;
            };

            sampler2D _MainTex;
            float4 _MainTex_ST;

            UNITY_INSTANCING_CBUFFER_START(Props)
            UNITY_DEFINE_INSTANCED_PROP(fixed4, _Color) // Make _Color an instanced property (i.e. an array)
            UNITY_INSTANCING_CBUFFER_END

            v2f vert (appdata v)
            {
                v2f o;
                // Set up the instance ID so the instanced transform is used.
                UNITY_SETUP_INSTANCE_ID(v);
                o.vertex = UnityObjectToClipPos(v.vertex);
                o.uv = TRANSFORM_TEX(v.uv, _MainTex);
                UNITY_TRANSFER_FOG(o,o.vertex);
                return o;
            }
            
            fixed4 frag (v2f i) : SV_Target
            {
                // sample the texture
                fixed4 col = tex2D(_MainTex, i.uv);
                // apply fog
                UNITY_APPLY_FOG(i.fogCoord, col);
                return col;
            }
            ENDCG
        }
    }
}
Using the Unlit Instanced Shader given above.

Using the unlit shader, you can bake in lighting, or use it for purposes where you don't want to use lighting anyway.

Please let me know if you found this useful!

Adding a detail map

As a freebie, here is also an Unlit Instanced Shader that allows a secondary detail mask, so that you can create nice things like the image below:

This shader allows a detail mask that can be used to do interesting things like only highlight the edges, as in this example.

Shader "Instanced/UnlitDetail"
{
    Properties
    {
        _MainTex ("Texture", 2D) = "white" {}
        _Detail ("Detail", 2D) = "gray" {}

    }
    SubShader
    {
        Tags { "RenderType"="Opaque" }
        LOD 100

        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            // make fog work
            //#pragma multi_compile_fog

            #pragma multi_compile_instancing

            
            #include "UnityCG.cginc"


            struct appdata
            {
                float4 vertex : POSITION;
                float2 uv_MainTex : TEXCOORD0;
                UNITY_VERTEX_INPUT_INSTANCE_ID
            };

            struct v2f
            {
                float2 uv_MainTex : TEXCOORD0;
                float2 uv_Detail : TEXCOORD1;

                //UNITY_FOG_COORDS(1)
                float4 vertex : SV_POSITION;
            };

            sampler2D _MainTex;
            sampler2D _Detail;

            float4 _MainTex_ST;
            float4 _Detail_ST;

            UNITY_INSTANCING_CBUFFER_START(Props)
            UNITY_DEFINE_INSTANCED_PROP(fixed4, _Color) // Make _Color an instanced property (i.e. an array)
            UNITY_INSTANCING_CBUFFER_END

            v2f vert (appdata v)
            {
                v2f o;
                // Set up the instance ID so the instanced transform is used.
                UNITY_SETUP_INSTANCE_ID(v);
                o.vertex = UnityObjectToClipPos(v.vertex);
                o.uv_MainTex = TRANSFORM_TEX(v.uv_MainTex, _MainTex);
                // The detail map reuses the main UVs with its own tiling and offset.
                o.uv_Detail = TRANSFORM_TEX(v.uv_MainTex, _Detail);
                //UNITY_TRANSFER_FOG(o,o.vertex);
                return o;
            }
            
            fixed4 frag (v2f i) : SV_Target
            {
                fixed4 col = 0;

                // sample the texture
                fixed3 main = tex2D(_MainTex, i.uv_MainTex).rgb;
                fixed3 detail = tex2D(_Detail, i.uv_Detail).rgb;


                col.rgb = main * detail;

                // apply fog
                //UNITY_APPLY_FOG(i.fogCoord, col);
                return col;
            }
            ENDCG
        }
    }
}

Occlusion culling illustration

I made this! It shows the spheres that would be visible from the point of view of someone standing at the center of a huge cubic lattice of spheres, with the space-filling factor changing from very small to very large and back. Black regions are where spheres are not visible due to occlusion by foreground spheres.

I extended the method to 3 dimensions. The following shows the visible spheres in a 3D cubic lattice viewed from the center (200x200), with a radius-to-spacing ratio of 0.12 and an angular z-depth buffer with a resolution of pi/8000.  The gif scans through one of the symmetric dimensions in time and plots the other two dimensions in the x-y plane of the screen.  It creates a very mesmerising pattern, don't you think?
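As a rough illustration of the angular z-depth buffer idea, here is my own simplified 2D sketch in Python (not the actual implementation used for these renders): sort the circles by distance from the viewer, quantize direction into angular bins, and keep a circle only if it is the nearest object in at least one bin it covers.

```python
import math

def visible_spheres_2d(centers, radius, angular_resolution):
    """Angular depth-buffer occlusion test for equal-radius circles
    around a viewer at the origin (2D analogue of the method above)."""
    n_bins = int(round(2.0 * math.pi / angular_resolution))
    depth = [math.inf] * n_bins
    visible = []
    # Process nearest first, so closer circles claim bins before farther ones.
    for cx, cy in sorted(centers, key=lambda c: math.hypot(*c)):
        d = math.hypot(cx, cy)
        if d <= radius:
            visible.append((cx, cy))  # viewer is inside this circle
            continue
        half = math.asin(min(1.0, radius / d))  # angular half-width
        theta = math.atan2(cy, cx)
        lo = int(math.floor((theta - half) / angular_resolution))
        hi = int(math.floor((theta + half) / angular_resolution))
        seen = False
        for b in range(lo, hi + 1):
            i = b % n_bins  # wrap around the full circle
            if d < depth[i]:
                depth[i] = d
                seen = True
        if seen:
            visible.append((cx, cy))
    return visible
```

A near circle fully covering the angular extent of a farther one behind it marks all of the far circle's bins first, so the far circle is culled; circles in other directions remain visible.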

I created an occlusion culling algorithm in order to make the rendering of spherical opaque objects in 3D space more efficient, for making videos like this one: