@Bean While I agree with pretty much everything you said, especially the bits about thinking of the OP-Z as a game controller/sequencer of sorts, I have to take issue with your assessment that the OP-Z is not a video synth.
Just for argument's sake, and primarily because I would rather discuss teenage engineering conspiracy theories than do my Monday work, let's investigate for a moment what synthesis is.
Merriam-Webster defines synthesis as “the composition or combination of parts or elements so as to form a whole.”
The OP-Z is a system that incorporates screen-based devices, a Unity-based real-time visual engine, and an audio synthesizer to synchronize your sequences of sound, DMX, and (I'm assuming) MIDI or OSC data from one portable device.
I think it is unfair to say that this system is not a visual synthesizer when two of the elements it controls are inherently visual: DMX lighting and Unity graphics.
Now, I agree that it is not a traditional visual synthesizer in the sense that the sound of the oscillator is not being immediately visualized. But if you wanted that to be the case, I would hope you could build your Unity game to do exactly that. Imagine, for instance, that your Unity “game” was a black line on a white background… when you triggered one of the synth engines, the line might wiggle at the frequency of the audio… The way I understand these things, that is visual synthesis: the combination of audio and other elements to create a composition which is then output as a video source.

My modular video synthesizer is just a bunch of oscillators, filters and faders that all operate at video rate; circuits must be designed to create the individual modules. It seems to me that this is comparable to the OP-Z and its Unity counterpart: your game must be designed, and then it can be sequenced and/or modulated by the OP-Z’s onboard sequencer and visualized on your iOS device.
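Just to make that black-line idea concrete, here is a rough sketch of what such a Unity “game” might look like. This is pure speculation on my part: the `frequency` and `amplitude` fields are hypothetical stand-ins for whatever values the OP-Z will actually send over MIDI/OSC, since we don't yet know how TE will expose that.

```csharp
using UnityEngine;

// A minimal sketch of the "black line that wiggles" idea described above.
// Assumes a GameObject with a LineRenderer; `frequency` and `amplitude`
// are hypothetical stand-ins for whatever the OP-Z sequencer would send.
[RequireComponent(typeof(LineRenderer))]
public class WiggleLine : MonoBehaviour
{
    public int points = 128;        // resolution of the line
    public float frequency = 2f;    // would be driven by the triggered synth engine
    public float amplitude = 0.5f;  // would be driven by velocity, a dial, etc.

    LineRenderer line;

    void Start()
    {
        line = GetComponent<LineRenderer>();
        line.positionCount = points;
    }

    void Update()
    {
        // Redraw the line every frame as a sine wave whose shape follows
        // whatever parameters the sequencer is feeding us.
        for (int i = 0; i < points; i++)
        {
            float x = (float)i / (points - 1) * 10f - 5f;  // spread across the view
            float y = amplitude * Mathf.Sin(x * frequency + Time.time * frequency);
            line.SetPosition(i, new Vector3(x, y, 0f));
        }
    }
}
```

Attach that to a GameObject with a LineRenderer and you have exactly the wiggling line I described, with the sequencer (in theory) driving the two public fields.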
Creating the Unity game is, in my eyes, no different from creating an individual module in a traditional synthesizer. You are just creating a universe and assigning rules and functions for how incoming data interacts with it as it streams in over time.
VERY fun stuff, a very new take on it. It remains to be seen how much depth TE is going to allow users to explore in this new avenue… it could be as wide open as Unity itself, or limited to simply assigning functions of preexisting games to the four dials on the OP-Z. Hopefully the latter is not the case and we will be able to go as far as our Unity prowess allows. All of this is made even more exciting by the ability to then trigger DMX devices…
Imagine the possibilities of the OP-Z, software like GarageCube's MadMapper (a DMX and video-mapping application), and a portable projector and LED bar setup… AMAZING
I guess we will have to wait to get our hands on it to truly know whether I am just lost in my own fantasy or this is all soon going to be reality…
but the potential is there, and I hope for the best 
in regards to the audio portion of the show, I am very interested as well… the possibilities of step-sequenced modulation would blow the doors wide open.
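Just so it's clear what I'm imagining by step-sequenced modulation, here is a trivial sketch, again in Unity terms since that's where my head is. The eight-step pattern and the choice of AudioSource.pitch as the target are made up purely for illustration; on the actual OP-Z this would presumably happen inside the synth engine itself, not in Unity code.

```csharp
using UnityEngine;

// A back-of-the-napkin sketch of step-sequenced modulation: a short list of
// per-step values, advanced on every 16th note, applied to one parameter.
// AudioSource.pitch is only a stand-in target for illustration.
[RequireComponent(typeof(AudioSource))]
public class StepModulation : MonoBehaviour
{
    public float bpm = 120f;
    public float[] pitchSteps = { 1.0f, 1.5f, 0.75f, 2.0f, 1.0f, 1.25f, 0.5f, 1.5f };

    AudioSource source;
    float nextStepTime;
    int step;

    void Start()
    {
        source = GetComponent<AudioSource>();
        nextStepTime = Time.time;
    }

    void Update()
    {
        if (Time.time >= nextStepTime)
        {
            source.pitch = pitchSteps[step];          // apply this step's value
            step = (step + 1) % pitchSteps.Length;    // loop the pattern
            nextStepTime += 60f / bpm / 4f;           // one step per 16th note
        }
    }
}
```

Even something that simple, locked to the sequencer's steps, is the kind of per-step parameter movement I'm hoping for.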