OP-Z SPECS AND INFORMATION AT ALL!!

yeah. i feel like i remember one of the te guys doing the whole "don't quote me on this" thing and saying around $300.

it's not something i completely pulled out of thin air because i originally thought it'd cost much more.

but in comparison to the op-1 this is plastic and not aerospace grade aluminum. no screen and only one pressure sensitive switch.

i guess we'll see

What’s the pressure sensitive switch?

the pitch bend button. i’m 99% sure i read/saw/heard somewhere last week that the pitch bend button is pressure sensitive

Cool! Which button is it? And how do you control up and down bend?

:)

Dude! Preach.

And I can’t do any visual stuff, although I did a unit of Max/MSP at university. Blew my mind. Wish I kept up with it, actually. I can’t wait to add visuals to my music. High hopes that the OP-Z facilitates that in a meaningful way. I would love for them to release the graphics side early too. Probs will never happen :stuck_out_tongue:

@Najrock SSsssshhhhhhhh I want people to keep being dismissive of OP-Z… With crazy suggestions of usefulness like that you’re going to make it hard to get one at launch.

@Najrock The OP-Z is not a video synth, consider it more of a game controller for an iOS device running your animated video created in Unity and rendered into an OP-Z format that runs in the OP-Z app. This file format should be provided with the tool kit. Within Unity (installed on a laptop or desktop) you are in a sense creating a “game”, for lack of a better term, with a predetermined set of effects and views that can be controlled and sequenced by the OP-Z. You will not likely see any graphics packs as there is no need to compete with Unity in that respect. I’m sure the app will include already rendered “games” to sequence with the OP-Z and your song, without the need to render anything in Unity right away.


With regards to Unity… it is free right now for personal use. For a noob like me. I can start the tutorials and create a game or short movie and use my entire studio of music gear to add the audio. There is nothing stopping that from happening right now for free other than the desire and motivation to learn the art of making games or short CGI movies. I might give it a shot.

Check out Unity here
https://unity3d.com/
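
Just to show how little it takes to get started, here is a minimal sketch I mocked up (purely hypothetical, my own class and field names, nothing to do with whatever format TE ends up shipping): a cube whose spin speed and colour are two plain public floats that anything external (a dial, a sequencer step, another script) could drive.

```
using UnityEngine;

// Purely hypothetical example, not TE's toolkit: a cube whose spin speed and
// colour are plain public fields. Anything that can set these two floats
// (a dial, a sequencer step, another script) can "play" the visual.
[RequireComponent(typeof(MeshRenderer))]
public class SequencedCube : MonoBehaviour
{
    [Range(0f, 1f)] public float energy = 0.5f; // imagine this mapped to a dial
    [Range(0f, 1f)] public float hue = 0f;      // and this to another one

    MeshRenderer rend;

    void Start()
    {
        rend = GetComponent<MeshRenderer>();
    }

    void Update()
    {
        // Spin faster as "energy" rises, and tint the material by "hue".
        transform.Rotate(0f, energy * 360f * Time.deltaTime, 0f);
        rend.material.color = Color.HSVToRGB(hue, 0.8f, 1f);
    }
}
```

Drop that on the default cube in an empty scene and scrub the two sliders in the Inspector; that is already a tiny “game” waiting for something to sequence it.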

As a musician, I am very interested in the OP-Z’s audio synthesis, which I have not seen covered in any review to date. How will it be different from the OP-1 and the POs? Watching the NAMM videos, it will have the same LFOs and effects as the OP-1 and the same sequencing as the POs, but nothing yet on the actual synth engines.

I am aware that the specifications could change, and I hope they also bring Android capability to this… I own a few Apple phones and an iPad, but nothing capable of running the app according to the spec provided by TE.

compatible with any iOS device that supports Apple Metal graphics

Here is a link to that iOS compatibility list:
https://developer.apple.com/library/content/documentation/DeviceInformation/Reference/iOSDeviceCompatibility/HardwareGPUInformation/HardwareGPUInformation.html#//apple_ref/doc/uid/TP40013599-CH106-SW1

If TE cannot port the app to anything other than iOS, then the question for me is how it will stand on its own as something new in synthesis, since most of its capability will be locked to iOS devices.

OP-Z, compatible iOS device, HDMI output cable… it seems more expensive than an OP-1 already.

One other side note… that USB cable attached to the OP-Z in the NAMM videos has a light on it… charge light or Bluetooth?


I still don’t get it lol :confused:

@Bean While I agree with pretty much everything you said, especially the bits about thinking of the OP-Z as a game controller/sequencer of sorts, I have to take issue with your assessment of the OP-Z not being a video synth.


Just for argument’s sake, and primarily because I would rather discuss teenage engineering conspiracy theories than do my Monday work, let’s just investigate for a moment what synthesis is.

Merriam-Webster defines synthesis as the composition or combination of parts or elements so as to form a whole.

The OP-Z is a system that incorporates screen-based devices, a Unity-based realtime visual engine and an audio synthesizer to syncopate your sequences of sound, DMX and, I am assuming, MIDI or OSC data from one portable device.

I think it is unfair to say that this system is not a visual synthesizer when two of the elements that it controls are inherently visual, DMX lighting and Unity graphics.

Now I agree that it is not a traditional visual synthesizer in the sense that the sound of the oscillator is not being immediately visualized. Of course, if you wanted this to be the case, I would hope that you could create your Unity game to do exactly that. Imagine for instance that your Unity “game” was a black line on a white background… when you triggered one of the synth engines the line might wiggle at the frequency of the audio… The way I understand these things, that is visual synthesis: the combination of audio and other elements to create a composition which is then output as a video source.

My modular video synthesizer is just a bunch of oscillators, filters and faders that all operate at video rate. Circuits must be designed to create the individual modules. It seems to me that this is comparable to the OP-Z and its Unity counterpart. Your game must be designed, and then it can be sequenced and/or modulated by the OP-Z’s onboard sequencer and visualized on your iOS device.
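
To make that concrete, here is roughly what the wiggling-line idea could look like as a plain Unity script. This is just my own sketch (the class and field names are mine, and it assumes a LineRenderer and an AudioSource on the same object), not anything TE has shown:

```
using UnityEngine;

// Sketch only: a horizontal line whose points are displaced by the live audio
// waveform, so the line "wiggles" with whatever the AudioSource is playing.
[RequireComponent(typeof(LineRenderer), typeof(AudioSource))]
public class WigglyLine : MonoBehaviour
{
    public int points = 256;   // resolution of the line (power of two)
    public float width = 10f;  // world-space width of the line
    public float gain = 3f;    // how far the audio displaces the line

    LineRenderer line;
    AudioSource source;
    float[] samples;

    void Start()
    {
        line = GetComponent<LineRenderer>();
        source = GetComponent<AudioSource>();
        samples = new float[points];
        line.positionCount = points;
    }

    void Update()
    {
        // Grab the most recent block of output samples (channel 0)...
        source.GetOutputData(samples, 0);

        // ...and bend the line with them.
        for (int i = 0; i < points; i++)
        {
            float x = (i / (float)(points - 1) - 0.5f) * width;
            float y = samples[i] * gain;
            line.SetPosition(i, new Vector3(x, y, 0f));
        }
    }
}
```

That is the whole “module”: the audio is the modulation source and the line is the output.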

Creating the Unity game is no different in my eyes from creating an individual module in a traditional synthesizer. You are just creating a universe and assigning rules and functions for how incoming data will interact as it streams in over time.

VERY fun stuff, a very new take on it. It remains to be seen how much depth TE will allow users to explore in this new avenue… it could be as wide open as Unity itself or limited to simply assigning functions of pre-existing games to the four dials on the OP-Z. Hopefully the latter is not the case and we will be able to go as far as our Unity prowess allows. All of this is made even more exciting by the ability to then trigger DMX devices…

Imagine the possibilities of the OP-Z, software like GarageCube’s MadMapper (a DMX and video mapping application), a portable projector and an LED bar setup… AMAZING

I guess we will have to wait to get our hands on it to truly know whether I am just in my own fantasy or this is all soon going to be reality…

but the potential is there, and I hope for the best :slight_smile:


In regard to the audio portion of the show, I am very interested as well… the possibilities of step-sequenced modulation would blow open doors.

Remind me again how you can’t control DMX, Max MSP, Processing, Pure Data, D3, or even Unity from Midi and Audio sources today? I’m failing to see the point of your enthusiasm I think. I do agree that visualization (or even visual synthesis) is the next “analog” but until now I haven’t seen anything revolutionary in the OP-Z demos.

..... Merriam-Webster defines synthesis as the composition or combination of parts or elements so as to form a whole.

The OP-Z is a system that incorporates screen-based devices, a Unity-based realtime visual engine and an audio synthesizer to syncopate your sequences of sound, DMX and, I am assuming, MIDI or OSC data from one portable device. ......


If Monday’s the day for looking up words like “synthesis”, you might want to look up the meaning of “syncopate” too :slight_smile:


Meanwhile, in agreement with @Bean, I too see the OP-Z’s video functionality as controlling/sequencing pre-specified handles in video content, with the new software providing a) an abstraction layer for the Unity API and b) a friendly UI for OP-Z users.
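
Pure speculation on my part, but such an abstraction layer could be as thin as a few named, normalized parameters on the Unity side (the names below are made up, not anything TE or Unity provides):

```
using UnityEngine;

// Speculative sketch of a "pre-specified handle": the Unity scene exposes a
// few named 0..1 parameters, and the companion app / sequencer only ever
// sees those names and values.
public interface IVisualHandle
{
    string Name { get; }        // e.g. "wobble", "zoom", "hue"
    void Set(float normalized); // called per step or per dial turn
}

// One concrete handle: scales its Transform between two sizes.
public class ScaleHandle : MonoBehaviour, IVisualHandle
{
    public float min = 0.5f;
    public float max = 2f;

    public string Name { get { return "scale"; } }

    public void Set(float normalized)
    {
        float s = Mathf.Lerp(min, max, normalized);
        transform.localScale = new Vector3(s, s, s);
    }
}
```

The friendly UI would then just list whatever handles it finds in the scene.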

I did enjoy reading your first post. I love your enthusiasm.

CB

Remind me again how you can't control DMX, Max MSP, Processing, Pure Data, D3, or even Unity from Midi and Audio sources today? I'm failing to see the point of your enthusiasm I think. I do agree that visualization (or even visual synthesis) is the next "analog" but until now I haven't seen anything revolutionary in the OP-Z demos.


You can and I do. I am excited about bringing the opportunity to a greater community of people in an inventive and interesting new way. Everything is already possible but this is in one little affordable package. Just like everything was possible already with the OP-1 and yet it still changed the way people worked and also invited in new creatives.

So really, I am excited that a piece of hardware has a native link to a generative software package and will be made with creatives in mind, rather than programmers and systems techs. It’s the potential I find so fascinating… also the potential of traveling with a small package that has a large impact.

@cloudburst apologies, iPhone fat finger, *synchronize

I have a question for the game engine peeps: it is my understanding that the Unreal engine is a good (best?) option for non-programmers, as you can rely on its visual scripting Blueprint system vs. learning C# in Unity.
Why do you think TE would opt for the Unity engine, when their goal (seemingly) is to bring the visual aspect of performance to artists who deal mainly with sound and are most likely not programmers?

I am super excited about the OP-Z (after understanding it more through this round of promo material). As a professional 3D artist, I see the potential to create some cool stuff. However, one thing I am well aware of is that I simply don’t enjoy scripting/programming. No matter how cool I know it is to be able to make anything via programming, I just don’t like doing it.

Is there any hope for me in Unity to make something unique and artistically interesting without hacking together scripts and pulling my hair out in the process?

@Najrock If you review the OP-Z preview on TE’s website you will notice that they don’t call it video synthesis but animated art and video. After some meditation it seems like I am having a hard time defining what a Video Synth really is… So then I thought maybe I should just get a drink… So I got my drink then thought…


If I could apply the definition of audio synthesis to video, then is it possible that it could look like this?

My OP-Z contains the elements and combination of parts required to create the sound of a trumpet.
My Unity software contains the elements and combination of parts to create an object that looks and moves like a trumpet.

Webster’s has no definition of video synth… So I can only guess… Is it possible that the OP-1 is a video synth because it has a screen and you can manipulate the DNA synth with knobs and audio that feed back crazy lines on the screen?

I still believe a better description of the OP-Z’s graphics capabilities is “animated art and graphics manipulator”, as it has no graphics creation or video output capabilities of its own. I don’t think that’s unfair…

So who is this Keijiro Takahashi that TE is working with on the video side?.. Google search time…

I did find this video of Keijiro-san manipulating Unity video in real time with Korg products.

Well, if he can do this with his Korg, why can’t I do this with my OP-1 right now… Well, lo and behold, Keijiro Takahashi is on GitHub. This confirms @crudeoperator’s comment about the ability to use the OP-1 to modulate video right now. I did not know this… thank you.
https://github.com/keijiro
Check out this repo…

Reaktion, an audio reactive animation toolkit for Unity:
https://github.com/keijiro/Reaktion

Well, I think I will explore this software and toolkit and see what I am capable of creating along with my OP-1.
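
Before digging into Reaktion itself, this is the kind of thing I hope to reproduce, written here against plain Unity only (my own rough guess at the approach, with made-up names; it is not how Reaktion’s actual components work): a light whose brightness follows the low end of whatever audio is playing.

```
using UnityEngine;

// Rough plain-Unity sketch (not Reaktion's components): drive a Light's
// intensity from the low end of the audio spectrum.
[RequireComponent(typeof(AudioSource), typeof(Light))]
public class BassLight : MonoBehaviour
{
    public float gain = 50f;      // how hard the light reacts
    public float smoothing = 8f;  // higher = snappier response

    AudioSource source;
    Light lamp;
    float[] spectrum = new float[256]; // must be a power of two
    float level;

    void Start()
    {
        source = GetComponent<AudioSource>();
        lamp = GetComponent<Light>();
    }

    void Update()
    {
        source.GetSpectrumData(spectrum, 0, FFTWindow.BlackmanHarris);

        // Sum the lowest bins as a crude "bass" level.
        float bass = 0f;
        for (int i = 0; i < 8; i++) bass += spectrum[i];

        // Smooth it a little so the light doesn't flicker harshly.
        level = Mathf.Lerp(level, bass * gain, Time.deltaTime * smoothing);
        lamp.intensity = level;
    }
}
```

If I can get the OP-1’s audio into that AudioSource, Reaktion should only make things easier from there.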

@Bean Let us know how you go with the toolkit!!!

it is my understanding that the Unreal engine is a good (best?) option for non-programmers, as you can rely on its visual scripting Blueprint system vs. learning C# in Unity.

Well, “visual scripting” is also available in Unity. So it’s more like “if that isn’t doing it for my project, do I want to learn C++ or C#”. And I would choose C#, to be honest. Still, for visuals running with the OP-Z you’ll likely never have to touch a line of code in either of those engines.

Why do you think TE would opt for the Unity engine, when their goal (seemingly) is to bring the visual aspect of performance to artists who deal mainly with sound and are most likely not programmers?
Well, only three theories:
1) Unity was founded in Copenhagen. Maybe they are more “Scandinavian spirited” than Epic Games.
2) Licensing/royalties. Maybe they offered the better deal.
3) Practicality. Even though the OP-1 (and likely the OP-Z) are programmed in C++, the people doing that are likely not the same people doing the graphics and the interop with the potential game engine. If they have a collaboration going with Takahashi and he knows his way around Unity, that would be an easy choice, right?
Is there any hope for me in Unity to make something unique and artistically interesting without hacking together scripts and pulling my hair out in the process?

I don’t know. Maybe just take a weekend off, download the latest Unity (and the visual scripting plugins) and have a go at it.

@tabascoEye

Thanks for the insights! I’ll definitely download both engines and give them a try. The studio where I work is going down the Unreal lane, but in the end software is software, so I guess once I am familiar with the concepts, it’s not such a big deal to use both.

Whatever the OP-Z is, and however it integrates into the universe TE creates for it, all I really wanted is already happening right now in this thread.


Everyone is researching Unity and its capabilities, discovering current AV artists and realizing the tools are already there, getting amped on the OP-Z and adding video to their quiver in general.

progress.

audio and visuals… same thing, different frequencies :slight_smile:

Nice @najrock

The sequencer page at 1:33 in the Sonic State video looks great! Let’s hope the data from that page is bidirectional rather than just displaying the steps currently selected for each track of the OP-Z.

(From reddit AMA):


Me: Seeing as the OP-Z has a MEMS mic… what functionality will be used for this?

TE: The OP-Z has an input track with its own sequencer track and dedicated effects, and the input track can be further routed into the rest of the OP-Z.

!!!

OP-Z might have a USB host interface, according to the AMA

Yep, I had to ask them!


[–]sparky742
Will the OP-Z act as a USB Host (e.g. for OP-1 direct MIDI over USB connection without anything in the middle?)
[–]sparky742
(pleeaaase make this a yes??!)
[–]teenage_engineering[S]
Ok as you ask so politely … Yes!