How did TE design the UI?

This is a question for the graphic/software designers amongst you:


How did TE design and prototype the OP-1’s UI? I was wondering whether anyone had seen any making-of videos or interviews, or had any insight?

I know that in the firmware the main UI elements are stored as SVGs, which must be manipulated live in software - it’s super clever!

I’m a designer and am interested in developing a similar interface mechanism for an unrelated product (non-musical - not even close). I’m not a software designer. In the past, when designing apps, we’d build a prototype in something like Axure and the developers would code it up. But in the world of physical products and embedded UIs I haven’t found a tool that interfaces with hardware and allows a UI to be prototyped. I have absolutely no idea how TE turned Illustrator files into a working prototype, and then a product. It’s executed so well and I’d like to learn how they did it.

Anybody got any knowledge?

Tom

There is a thread here where people are reverse-engineering the beta firmware; apparently you can change the images. TE has also posted some images to Instagram showing part of their design process.

SVGs are vector-based, so most likely, as you said, designed in Illustrator.


I’m guessing they just keep everyone involved in the UI design and build throughout the project, so that by the time the programmers are given the SVGs they already know how things should work, animate and move, and can code it all up to behave that way.
Any platform that could preview SVGs would still need someone to go in and specify movement, animation, etc. anyway.
Maybe they’ve built their own testing platform, but I doubt it. I’d bet it’s all just artworked, then programmed, then tweaked from there.

C++, QuickTime, etc. have libraries for working with SVG files: http://softwarerecs.stackexchange.com/questions/4325/c-c-library-to-parse-svg-files
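To make the “parse SVG” step a bit less abstract: here’s a minimal, self-contained sketch (my own toy code, not TE’s and not a real library) that parses a tiny subset of the SVG path `d` attribute - absolute `M`/`L` commands only - into drawing commands. The real libraries behind that link also handle curves, arcs, transforms, and relative coordinates.

```cpp
#include <cctype>
#include <sstream>
#include <string>
#include <vector>

// One absolute path command: 'M' (move-to) or 'L' (line-to) with a point.
struct PathCmd {
    char op;      // 'M' or 'L'
    double x, y;  // target coordinates
};

// Parse a tiny subset of the SVG path "d" attribute: absolute M/L commands
// with whitespace- or comma-separated coordinates, e.g. "M 10 20 L 30 40".
std::vector<PathCmd> parsePath(const std::string& d) {
    std::vector<PathCmd> cmds;
    std::string cleaned = d;
    for (char& c : cleaned)
        if (c == ',') c = ' ';  // SVG treats commas like whitespace here
    std::istringstream in(cleaned);
    char op = 0;
    while (in >> std::ws && !in.eof()) {
        if (std::isalpha(in.peek())) in >> op;  // new command letter
        double x, y;
        if (!(in >> x >> y)) break;             // two coordinates per command
        cmds.push_back({op, x, y});
        if (op == 'M') op = 'L';                // extra pairs after M are implicit line-tos
    }
    return cmds;
}
```

A renderer would then walk the resulting command list and feed it to whatever drawing routines the platform has.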


Once the application is written in a high-level language, it can be compiled to run on the OP-1’s Blackfin processor via VisualDSP++: https://www.scribd.com/document/97800763/Blackfin-Processor-Programming-Reference


@PoisonousBirds I believe little is known about the design process/workflow of the OP-1 in particular, beyond the implementation details above. Perhaps @cuckoo could charm them into giving better answers?

If you’re only doing the design, then you probably don’t need to care too much about how it’s implemented - at first, at least. With some glue code you could prototype the UI in a browser, e.g. with d3, which works with SVG as well if you fancy that. Have your programmers build a generic interface to the hardware, e.g. a serial protocol that you can call through some REST service from your web page, if you actually want to interface with it. A Raspberry Pi might be interesting for this, as you could use its GPIO pins to talk to the real hardware.

In the end, once you’ve got your first draft of the UI, you need to sit down with whoever is going to actually code it up to see how it can be translated into something the machine can handle. If you’re not constrained for space and power, you might get away with just sticking the Raspberry Pi into the product itself :slight_smile:
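The “generic interface to the HW” could be as simple as a line-based command protocol. A hypothetical sketch of the device side - every verb and name here is invented for illustration, and a real device would read lines from a UART rather than from a string:

```cpp
#include <map>
#include <sstream>
#include <string>

// Hypothetical line-based protocol: "SET <name> <value>" / "GET <name>".
// A REST shim on the prototyping side would just forward these strings.
class HwShim {
public:
    // Handle one command line; return the reply that would go back over serial.
    std::string handle(const std::string& line) {
        std::istringstream in(line);
        std::string verb, name;
        in >> verb >> name;
        if (verb == "SET") {
            int value;
            if (in >> value) { regs_[name] = value; return "OK"; }
            return "ERR bad value";
        }
        if (verb == "GET") {
            auto it = regs_.find(name);
            return it == regs_.end() ? "ERR unknown" : std::to_string(it->second);
        }
        return "ERR unknown verb";
    }
private:
    std::map<std::string, int> regs_;  // stand-in for real GPIO/peripheral state
};
```

The nice thing about a dumb text protocol like this is that the designer can poke the hardware from a browser, a script, or a terminal, long before the final UI code exists.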

Thanks @kingof9x, @spacetravelmadeeasy, @Hamster, @eesn, @crudeoperator for your responses.


I'd bet it's all just artworked, then programmed, then tweaked from there.
In the end, once you've got your first draft of the UI, you need to sit down with whoever is going to actually code it up to see how it can be translated into something the machine can handle.

This unknown black box of “then code it up” is precisely the bit I’m interested in. There isn’t a magic code wand that makes designs move, and manipulating SVGs live on a relatively low-spec processor isn’t something I’ve come across before; I’m struggling to find out how it’s done. The OP-1’s display is magical and I’d like to know how they do it.


P.S. Apologies for the slow response, forgot to check if this thread had any replies.

AFAIK they use the AGG library to render anti-aliased vector graphics: http://www.antigrain.com/

(Once when my OP-1 crashed it listed a few AGG source files in the onscreen dump)
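For what it’s worth, the core trick behind scanline rasterizers like AGG is coverage-based anti-aliasing: for each pixel an edge crosses, estimate how much of the pixel lies inside the shape and use that fraction as alpha. Here’s a toy illustration of that idea - this is not AGG’s actual API, and AGG computes coverage analytically with cells rather than by supersampling as this sketch does:

```cpp
#include <cstdint>
#include <vector>

// Toy coverage-based anti-aliasing for one scanline crossed by a vertical
// edge at x = edgeX (everything left of the edge counts as "inside").
// For each pixel, supersample the pixel's span and convert the fraction
// of samples inside the shape into an 8-bit alpha value.
std::vector<uint8_t> coverageScanline(double edgeX, int width, int samples = 16) {
    std::vector<uint8_t> alpha(width);
    for (int px = 0; px < width; ++px) {
        int inside = 0;
        for (int s = 0; s < samples; ++s) {
            double x = px + (s + 0.5) / samples;  // subsample centre
            if (x < edgeX) ++inside;              // left of the edge = inside
        }
        alpha[px] = static_cast<uint8_t>(inside * 255 / samples);
    }
    return alpha;
}
```

Because only the pixels a shape actually touches are visited, this kind of CPU rendering stays cheap enough for a modest embedded processor like the Blackfin, which would fit what we see on the OP-1.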


AFAIK they use the AGG library to render anti-aliased vector graphics: http://www.antigrain.com/
(Once when my OP-1 crashed it listed a few AGG source files in the onscreen dump)


YES, now we’re onto something. Thank you!