
live api - anybody successful with it?


ultra


as some of us know, liveapi is rather poorly documented.  i was wondering if anybody here has gotten it to work?  all i need is a good starting point so i can run with it.

ultra

edit: it's not so poorly documented.  i wasn't looking in the right places at first, and the documentation is growing. :)

Link to comment
Share on other sites

Hi Ultra,

I had a quick look at the Live API but didn't get on with it.  I found some other info on the web far more useful - see the links to it that I posted in this forum:  http://www.soundonsound.com/forum/showflat.php?Cat=&Board=GTR&Number=681597&Searchpage=1&Main=681388&Words=+PaulB&topic=&Search=true#Post681597

I used the information on those pages as a starting point for integrating my Behringer FCB1010 MIDI controller into my setup, although due to limitations in the FCB I am now replacing its "guts" with a MIOS-based project (hence my presence in these forums now  :)).  Python isn't particularly easy to get up and running with (on OSX anyway) with regards to Live, but setting it up is easier than MIOS development IMHO.

Regards,

- Paul


i've gotten a little bit further with liveapi.  i've also been attempting to inspire the liveapi community (http://groups.google.com/group/liveapi) to get more comprehensive documentation going on their wiki (http://www.assembla.com/wiki/show/live-api).  there are some pretty friendly and informative people over there and they seem to be happy to help whoever is interested.  in fact, it seems the more people are interested in liveapi, the more information they will produce :).  i started out by writing a simple how-to for setting up the necessary tools and getting live to recognize your script (http://www.assembla.com/wiki/show/live-api/Getting_started_with_LiveAPI_-_Toolchain_Setup).  next, i will post an article on how to control mixer devices in live, and how to monitor the devices and send the data back to your midi controller.  that will hopefully happen soon, i just first have to get a few questions answered.
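for anybody wanting a feel for what "getting live to recognize your script" boils down to, here's a rough sketch of a remote script package.  the create_instance() factory is what live looks for when loading a script; the class and method names here just follow the pattern of live's bundled scripts, so treat this as a sketch rather than an exact api reference:

```python
# Rough skeleton of a Live MIDI remote script package (__init__.py).
# Names follow the pattern of Live's bundled scripts; this is a
# sketch, not an exact API reference.

class MidiboxSurface(object):
    def __init__(self, c_instance):
        # c_instance is the handle Live passes in; it exposes calls
        # like send_midi() and song()
        self._c_instance = c_instance

    def disconnect(self):
        # called when Live unloads the script
        pass

    def receive_midi(self, midi_bytes):
        # called for each incoming MIDI message from the controller
        pass

    def update_display(self):
        # polled periodically by Live; a good place for LCD refreshes
        pass


def create_instance(c_instance):
    # Live looks for this factory function when loading the script
    return MidiboxSurface(c_instance)
```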

in the end, i do see this developing into a nice set of how-to's similar to the "so complete, any idiot can do it" how-to's that stryd makes.  so if anybody is interested in working with this, please let me know and we can work together.  best to keep the details over at the liveapi forum and wiki, but i post about this here because i would think some midiboxers would love to dig into live.

ultra


i've now figured out a number of things with liveapi:

i can control any mixer parameter of any track by simply sending the same data with a different channel - it'll automatically attach to the correct channel no matter what other mappings you've made.

i also can control master volume, pan, crossfade, cue volume, etc, all without ever having to map anything.

today i figured out how to get data back to the midibox.  via sysex, i can print the name of a track on my lcd.  i can also query any instrument/effect on the track and display knob names, values, etc, on the midibox.  this means i can have automatic access to its controls without having to map them.  if i put an eq on the track, i can check for available cc numbers, assign them to the low/mid/hi values, and show the name of what i'm controlling on the midibox's lcd.  all without ever having to map anything!
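the python side of the lcd trick is only a few lines — assuming the script holds on to the handle live passes in (send_midi is how live's scripts emit raw bytes; the 0xF0/0xF7 framing is standard sysex):

```python
def track_name_to_sysex(name):
    # frame the track name's ASCII codes with sysex start/end bytes
    return (0xF0,) + tuple(ord(c) for c in name) + (0xF7,)

# inside a remote script this would then go out through the handle
# Live passes in, roughly:
#   self._c_instance.send_midi(track_name_to_sysex(track.name))
```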

with the sysex capabilities, i wonder if this means live can query sysex capable hardware midi instruments and begin taking control of those automatically...


Hi Ultra,

Ok, you've got me interested too now!  Is it really simple to get LCD messages displayed in response to feedback messages, or is it just that you are (a) spending hours working out how it all fits together, or (b) a genius ?? (Or both...)

I'm still plodding along in native Python, although I have to admit I haven't put all I've got into it yet as I am still finishing off the hardware side.

One question you might be able to tell me though:  is it possible to send Live a PC message, get the feedback sent AND forward the PC to a VST instrument?  That is my challenge for the weekend anyway...

Regards,

- Paul


Man like I can't hold it down any more...

WTF

Where are all the ableton users? It seems like there's always 5 dudes wanting to build a new ableton controller around here, this is like the ultimate ableton controller, complete with custom plugin code for Live, FFS... Where are all the people going "OMG that's MAD!! nice one ultra!" ?!?!?


this is like the ultimate ableton controller

It is the ultramate ableton controller ;D

I don't use ableton, hence I am not all that excited about this really, really cool, pretty complex and incredibly powerful project - you ableton users should be though!


ultra = (a) + (b);

thanks stryd :)

Is it really simple to get LCD messages displayed in response to feedback messages, or is it just that you are (a) spending hours working out how it all fits together, or (b) a genius ?? (Or both...)

i've been putting quite a bit of time into this.  many hours just trying to figure out where to start.  but once i figured out how to control one thing, the rest just takes a few more minutes.  i'm really not very experienced in coding but i've learned a lot lately.

for getting messages to show on the midibox lcd, it's just simple sysex.

from consts.py in APIMidi: WELCOME_SYSEX_MESSAGE = (240,0,72,69,76,76,79,247)

i didn't even really know what a sysex message consists of, but i did know that 240 and 247 means it's the start and stop of sysex information.  i went over to www.asciitable.com and looked at the character codes for the rest, and sure enough this message means (START, HELLO, STOP).  so i retrieved a track name, turned it into a tuple, and sent it to the midibox.  on the MIOS side of things, it was just a simple routine to start listening when 240 is received and output the characters to the screen using the character codes.  so yes, now that i have a function to show the sysex and learned a bit about tuples in python, printing anything to the screen via sysex is really easy.
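the decoding described above (start listening at 240, print character codes until 247) can be mimicked in python for testing on the host side.  this is just a sketch of what the MIOS routine effectively does, skipping non-printable bytes like the 0x00 in the example message:

```python
WELCOME_SYSEX_MESSAGE = (240, 0, 72, 69, 76, 76, 79, 247)

def sysex_to_text(msg):
    # strip the 0xF0/0xF7 framing and keep only printable ASCII,
    # which is effectively what the LCD routine does with the codes
    assert msg[0] == 0xF0 and msg[-1] == 0xF7
    return "".join(chr(b) for b in msg[1:-1] if 32 <= b < 127)
```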

One question you might be able to tell me though:  is it possible to send Live a PC message, get the feedback sent AND forward the PC to a VST instrument?  That is my challenge for the weekend anyway...

sorry paul i have no idea what you mean.  can you explain better?

Where are all the ableton users?

the answer may just be that nobody is interested.  but if that's not it, the reason could be that i've yet to talk about how this can be applied to controllers designed for playing live and DJing.  it seems that people have been more interested in creating those kind of devices.

i've had the idea of creating a rather simple 2-4 track controller for live use (after i create the studio controller).  it would have just what's needed for live use, such as controls for EQ, volume, crossfader, etc.  the biggest feature of this would be the use of sparkfun/monome buttons with 2 or 3 color leds behind them.  there would also be a big glcd aligned to the sparkfun buttons.  the glcd would give you the clip names and the activities going on with them.  you'd never have to look at the computer screen.  this is entirely possible with liveapi and it's already been done with the lemur.

even if there is no interest in this, i'm going to continue with it and create documentation, example scripts, and example MIOS applications.  i'm also going to formulate my own "protocol" using sysex instead of cc's because i'll run out of cc's at some point.  with sysex, i can simply assign some kind of identification in the sysex string and parse it out in the ableton live script.  then i can control a virtually unlimited number of items in live.  also, i'm going to do away with my current code that sends absolute values into live and go with something similar to relative midi.  i'll have a lot of control over the resolution by sending out different values and there will be no parameter jumping.  lastly, the plan is to always have the midibox slave to whatever is going on in live and save as little data as possible onto the bankstick.  that will keep you from having to save the same things twice (on the computer and on the midibox) and will also keep things in sync.  you simply turn on the midibox, the midibox asks live what it should do, and sets itself up accordingly.  we'll see how it all goes.  these are all just goals and don't mean very much until i've managed to get something done.


Ultra keep up the good work. i know that i will be more interested in this in a few months when i've figured out the max/msp side of things to get my live pa up and running.... then all i will need LiveAPI for is the names of clips onto lcd's...

i just have attention issues and can only focus on one thing at a time... i will be checking in from time to time to check on your progress...


on the MIOS side of things, it was just a simple routine to start listening when 240 is received and output the characters to the screen using the character codes.

Don't forget realtime bytes that may be inserted in the middle of the sysex message... TK's written a sysex handler example that might be handy here :)

from consts.py in APIMidi: WELCOME_SYSEX_MESSAGE = (240,0,72,69,76,76,79,247)

That's a weird ass way to write it. I dunno python but I'm assuming you can use hex notation? In your app, I would do that - once you've been doing this way too long, most of the ascii codes are easier to read in hex, as are the midi status bytes  (0xF7 and 0xF0). If you get cosy with the LiveAPI guys I'd suggest the same to them. It's not "wrong", but it's definitely not "standard"

Actually now that I think about it, it IS wrong - it uses an illegal manufacturer ID. Well, no it doesn't, but if it's legal, it doesn't say "HELLO" any more.....

See here: Manufacturer IDs

The byte following 0xF0 should be the manufacturer ID. In this case, it's 0x00. That means, that the subsequent two bytes should be treated as the manufacturer ID. So, that sysex message really says:

0xf0 = sysex start

0x00 = manu ID 0

0x4845 = Resulting manufacturer ID

0x4c = "L"

0x4c = "L"

0x4f = "O"

0xf7 = sysex end
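That byte-by-byte breakdown fits in a tiny parser — per the MIDI spec, an ID byte of 0x00 right after the 0xF0 means the next two bytes extend the manufacturer ID to three bytes:

```python
def split_sysex(msg):
    # returns (manufacturer_id_bytes, payload_bytes); per the MIDI
    # spec, an ID byte of 0x00 escapes to a 3-byte manufacturer ID
    assert msg[0] == 0xF0 and msg[-1] == 0xF7
    if msg[1] == 0x00:
        return tuple(msg[1:4]), tuple(msg[4:-1])
    return (msg[1],), tuple(msg[2:-1])
```

Run it on the WELCOME message and you get manufacturer ID (0x00, 0x48, 0x45) and a payload that only spells "LLO" — exactly the problem described above.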

Oops! Maybe it is time to get cosy with the LiveAPI guys ;)

the reason could be that i've yet to talk about how this can be applied to controllers designed for playing live and DJing.  it seems that people have been more interested in creating those kind of devices.

True... perhaps it's just that I'm seeing the future possibilities you've opened up, but it may not be apparent to everyone...

i'm also going to formulate my own "protocol" using sysex instead of cc's because i'll run out of cc's at some point. 

What about (N)RPN?

that will keep you from having to save the same things twice (on the computer and on the midibox)

I'd guess that live's api will let you send a message to the midibox when you save in Live anyway :)


as it turns out, live doesn't respond to sysex via the api.  so i'm going with cascading values.

i thought about using something already defined (like the nrpn you suggested), but i don't see a point when the only thing this communicates with is a script i'm writing myself.

so i'm developing my own protocol based on the midi commands i'm able to send and receive.  in a lot of cases, i can do it with three bytes if i define what all the parameters mean.  using notes, i can address a LOT of instrument parameters this way:

note on status: address channel and says to increment

note off status: address channel and says to decrement

note number: address which instrument to control and scaling

note value: address the parameter to control

everything is relative, so i don't need to send any actual values.  with two scaling options (normal/slow), i still have access to 64 instruments on that track (way too many) because i can send 0 for instrument 0's normal scale, and 64 for instrument 0's slow scale.  also, i'll be able to control 128 parameters per instrument.
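the addressing above can be sketched as an encoder/decoder pair.  this is just my reading of the scheme as described (not actual project code): note-on vs note-off carries the direction, the channel picks the track, the scale bit rides on top of the instrument number, and the last byte picks the parameter:

```python
NOTE_ON, NOTE_OFF = 0x90, 0x80

def encode(channel, instrument, parameter, increment=True, slow=False):
    # note-on = increment, note-off = decrement; the scale bit rides
    # on top of the instrument number (0-63 normal, 64-127 slow)
    status = (NOTE_ON if increment else NOTE_OFF) | channel
    note = instrument + (64 if slow else 0)
    return (status, note, parameter)

def decode(msg):
    status, note, parameter = msg
    return {
        "channel": status & 0x0F,
        "increment": (status & 0xF0) == NOTE_ON,
        "instrument": note % 64,
        "slow": note >= 64,
        "parameter": parameter,
    }
```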

i've thought pretty hard about this and i think going completely custom is the only way to do it.  i understand why the various standards exist but i don't think they apply in this case.  also, having more i/o available via gm5 gives me extra ports that aren't assigned to live as a control surface :D.


everything is relative, so i don't need to send any actual values.

Hmm, you would want to be able to send inc/decrements of greater than +/-1...

i've thought pretty hard about this and i think going completely custom is the only way to do it.

Works for Mackie!

Of course, it's best avoided, if it's possible. IF.


Hmm, you would want to be able to send inc/decrements of greater than +/-1...

it works differently than that.  i can send any value i want and translate it on the liveapi side.  when coding it to set the value, it's not 0-127.  when i was sending absolute midi values i had to turn it into a percentage (#/127) because most values actually range from 0 to 1 with a resolution of about .000000000001.
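the translation described above is trivial on the python side — something like this (the step sizes here are made-up placeholders, not the project's real scaling):

```python
def cc_to_normalized(value):
    # map a 7-bit MIDI value onto Live's 0.0-1.0 parameter range
    return value / 127.0

def apply_increment(current, direction, slow=False):
    # relative control: nudge the normalized value and clamp it,
    # so there is never any parameter jumping
    step = 0.001 if slow else 0.01  # hypothetical step sizes
    return min(1.0, max(0.0, current + direction * step))
```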

i still haven't hashed out exactly how i'll control scaling but i'm not very concerned about it at this moment. 

i have some of the framework done for an example app where i can send simple functions to live.  anybody can use it because you'll be able to send functions like setKnobParameter(track + channel, volume, incrementer); when a knob is turned.  this would control the volume of a track based on a channel setting in your app.  pretty easy.  scaling isn't really in the code yet, but that'll get there when i figure out how to do it.

so other functions would look like setKnobParameter(master, pan, incrementer) or setButtonParameter(track + channel, mute, pin_value).  so easy!

Works for Mackie!

one way to look at this is being a single instrument because of the tight integration.  so within your own instrument, there's a lot of custom code and it's commented well so others can understand.  same with this, but it just has code in two places.  i think that when i'm done, most users would only have to write code to customize their midibox and the liveapi script will be complete enough that they won't have to touch it.


  • 2 weeks later...

well, i've completed a lot of functions for use with liveapi.

these are just control.  none of the request functions have been written yet.

these functions can be called from any midibox application you're writing, and the script will respond.  i believe there will be enough available control that you don't have to even touch liveapi or python.

i've also figured out how to get live to tell the midibox when anything has changed, and the midibox will simply run a function.  so, for example, if you send out the command to increment master volume, you won't have to track the volume in your code.  when the volume is changed, live will automatically report back what the volume has been set to (all knob/fader functions are increment/decrement so no parameter jumping) and you can just update your display.
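for anybody curious how that report-back works on the script side: live's parameter objects let you register a callback with add_value_listener(), which fires whenever the value changes, no matter who changed it.  here's a sketch — the FakeParameter class is just a stand-in so it runs outside live, and the CC-7 feedback message is an arbitrary choice for the example:

```python
class FakeParameter(object):
    """Stand-in for a Live parameter, so the sketch runs outside Live."""
    def __init__(self):
        self.value = 0.0
        self._listeners = []
    def add_value_listener(self, callback):
        self._listeners.append(callback)
    def set(self, value):
        self.value = value
        for callback in self._listeners:
            callback()


class VolumeFeedback(object):
    def __init__(self, parameter, send_midi):
        self._parameter = parameter
        self._send_midi = send_midi
        # Live fires this callback whenever the parameter changes,
        # no matter who changed it
        parameter.add_value_listener(self._on_value)

    def _on_value(self):
        # scale 0.0-1.0 back down to 7 bits and report to the midibox
        self._send_midi((0xB0, 7, int(self._parameter.value * 127)))
```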

in case anybody's curious, here's what i've got so far (functions to be sent from a midibox).  something like setMasterRecord would be sent as setMasterRecord(on).  pretty damn easy :).  if anybody has some items they think i've missed from master controls or track specific controls (track plugin code hasn't been started yet) let me know.  there's a few methods i can't find, but i think i will find them eventually.  i can't find how to switch session/arrangement view, for one, but mackie does it so the capability must exist.  if anybody is familiar with python and knows how to use decompyle, please let me know.

void setMasterPlay();

void setMasterStop();

void setMasterRecord(unsigned char state);

void setMasterFollow(unsigned char state);

void setMasterOverdub(unsigned char state);

void setMasterBackToArrangement();

void setMasterPunchIn(unsigned char state);

void setMasterPunchOut(unsigned char state);

void setMasterLoop(unsigned char state);

void setMasterSongTime(unsigned char incrementer, unsigned char scale);

void setMasterLoopStart(unsigned char incrementer, unsigned char scale);

void setMasterLoopLength(unsigned char incrementer, unsigned char scale);

void setMasterQuantization(unsigned char incrementer);

void setMasterTempo(unsigned char incrementer, unsigned char scale);

void setMasterSignatureNumerator(unsigned char incrementer);

void setMasterSignatureDenominator(unsigned char incrementer);

void setMasterGrooveAmount(unsigned char incrementer);

void setMasterMetronome(unsigned char state);

void setMasterVolume(unsigned char incrementer, unsigned char scale);

void setMasterPan(unsigned char incrementer, unsigned char scale);

void setMasterCueVolume(unsigned char incrementer, unsigned char scale);

void setMasterCrossfader(unsigned char incrementer, unsigned char scale);

void setMasterReturnAmount(unsigned char incrementer, unsigned char scale, unsigned char returnNumber);

void setMasterSendAmount(unsigned char incrementer, unsigned char scale, unsigned char returnNumber, unsigned char sendNumber);

void setMasterReturnPan(unsigned char incrementer, unsigned char scale, unsigned char returnNumber);

void setMasterReturnMute(unsigned char state, unsigned char returnNumber);

void setMasterReturnSolo(unsigned char state, unsigned char returnNumber);

void setMasterReturnCrossfade(unsigned char state, unsigned char returnNumber);

void setMasterPlayScene(unsigned char scene);

void setMasterStopAllClips();

void setMasterDrawMode(unsigned char state);

void setTrackVolume(unsigned char incrementer, unsigned char scale, unsigned char channel);

void setTrackPan(unsigned char incrementer, unsigned char scale, unsigned char channel);

void setTrackMute(unsigned char state, unsigned char channel);

void setTrackSolo(unsigned char state, unsigned char channel);

void setTrackSendAmount(unsigned char incrementer, unsigned char scale, unsigned char sendNumber, unsigned char channel);

//not implemented void setTrackCrossfade

void setClipPlay(unsigned char clip, unsigned char channel);

void setClipStop(unsigned char clip, unsigned char channel);

void setClipNudge(unsigned char direction, unsigned char amount, unsigned char clip, unsigned char channel);
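on the script side, the master transport calls above end up hitting live's Song object.  a hedged sketch of the dispatch for a few of them — start_playing/stop_playing/tempo follow the live object model, but the dispatch table itself (and the FakeSong used to run it outside live) is hypothetical:

```python
class FakeSong(object):
    """Stand-in for Live's Song object, so the sketch runs outside Live."""
    def __init__(self):
        self.tempo = 120.0
        self.playing = False
    def start_playing(self):
        self.playing = True
    def stop_playing(self):
        self.playing = False


def handle_master_command(song, command, *args):
    # minimal dispatch from incoming commands to Song calls
    if command == "play":
        song.start_playing()
    elif command == "stop":
        song.stop_playing()
    elif command == "tempo":
        (direction,) = args
        song.tempo += direction  # relative nudge, as in setMasterTempo
```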


  • 2 weeks later...

yeah i saw this and it's quite interesting.  of course, especially how it can be used to control live.

at this moment i've been holding off a bit with midibox/liveapi stuff so i can see what happens.  i'm not sure if the max stuff will be even necessary for my purposes.  also, the guys who work on liveapi have found some new functions, and i'm waiting to see what happens with that.


can you elaborate?  as in, the specific steps you take to do this.

i'm holding off on this for now, but i still will complete the project.  i'm having trouble finding the time to get going on it again.  but something usable is already almost done :).

eventually this will change to osc when mios32 supports it.  i think it'll be a better way in the end.

