OP-Z dual domain synthesis

I’m pretty certain I remember that during one of the preview videos, someone stated the audio engine would run on your phone/tablet when connected, so that any visuals generated by your device would be in sync with the audio. From the old description on their website, I think they referred to this as “dual domain synthesis”.

It seems like this definitely isn’t the case in the final release, which is good imo: it makes the OP-Z less of a very expensive controller for something that could be done in an iOS app, and it means they can do things that wouldn’t be possible on iOS (like adding extra synths/FX via a hardware module). I’d expect the DSP hardware in the OP-Z is also more capable than an iPhone’s, but I’m not 100% sure.

I know basically nothing about the power of the iPhone vs the OP-Z CPU-wise, but I suspect the iPhone, especially the newer models, is about 10 billion times more powerful than the OP-Z CPU.

There are many iPhone CPUs, but I think the most recent ones (if not all “A” models) are way more powerful than the OP-Z DSP, given their much higher clock frequencies, core counts, cache sizes, 64-bit architecture, etc. The OP-Z DSP is a single-core, 32-bit architecture running at only 400 MHz.

Yeah, that dual domain thing sounds like a lot of unnecessary complexity.

I was thinking they could run all of the OP-1 engines (FX, tape and synths) on iOS, with the OP-Z acting as a dongle so only OP-Z users get access!

I think it’s a buzzword referring to how the OP-Z works with Unity.

I’m quite sure that it was intended to run the synth engine directly on the iPad… and was advertised as doing so.

Did they ever actually say that? I could have sworn that they did, but all I could find is “synchronised dual domain synthesis with equal focus on video and audio creation” which I think just implies that the audio is on the Z and the visuals are on the other device. Strange!

Oh wait there we go
“To eliminate any latency between image and sound, we ported the OP-Z sound engine to run simultaneously on your IOS device. Something we call ‘dual domain synthesis’”

I don’t remember exactly where it was but in one of the later preview videos when they were showing it off I’m pretty sure the TE guy explained it as the sound engine running on the iOS device.

I wonder why they didn’t do that in the end, if the iPhone is powerful enough. I guess it could be confusing if you added a MIDI module and had MIDI coming out of the OP-Z while video and audio came out of the iPhone.

Yeah it’s strange, I’m pretty sure I saw that too. Maybe it was a bit unstable in some way.

Yeah, you may have seen this on a transatlantic flight