Everything posted by ultra

  1. no i believe it's for mac too. check the forum (http://groups.google.com/group/liveapi) or the wiki (http://www.assembla.com/wiki/show/live-api). there's mac information on which svn client to use, so there's gotta be mac stuff in there somewhere.
  2. yeah it's unfortunate, but live doesn't seem to respond to sysex at all. even with liveapi, i couldn't get it to work. but writing a midibox application for this shouldn't be too difficult, and i would think you can get realtime control out of it. for an application i'm writing for mios and liveapi, i'm having to use 3 sets of 3-byte midi messages instead of sysex to get detailed information into live, and i receive sysex from live (sometimes many bytes at a time), and it's been very responsive. connect your controller to midiox, turn a knob, and paste a few lines of the sysex data the controller sends out. it might be difficult to translate if the controller data has a much higher resolution or contains more than two bytes of control information. alternatively, i know that a script could be written for liveapi that sends out the correct sysex strings, but i don't think you can "catch" the midi recorded to a channel and translate it. you'd only be able to receive incoming midi and bounce sysex to the output.
  3. regarding your axiom problem: install midiox (http://midiox.com/), turn a knob, and post some of the output here. it could be a couple of things: if your encoders are set up for absolute values (usually the default), the parameter in live is going to jump because live snaps it straight to the value the encoder sends. that's perfectly normal. you can either use signed bit, or go into live's midi options and change your takeover mode. if your encoders somehow got switched to signed bit values while live is still expecting absolute values, live's behavior will be even weirder. signed bit simply sends two small absolute values (one meaning increment, one meaning decrement), and on the axiom that value changes slightly when a knob is turned faster, to tell live you are accelerating it. using signed bit mode is the workaround for the first case, parameter jumping when using absolute values. both the axiom and live support signed bit mode, so check the manual of each for how to set it up. if you're using midi learn in live, it can detect signed bit if you turn the axiom's encoder very slowly, but really it's best to make the assignment and then choose signed bit from the drop down box below. read up on signed bit. i went through this problem and have posted about it somewhere on the forum.
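if it helps to see it in code, here's a tiny python sketch of how i'd read a signed bit value. this is just an illustration of the convention as i understand it (bit 6 carries the direction, the lower bits carry the step size / acceleration), so double-check it against the axiom manual:

def decode_signed_bit(value):
    step = value & 0x3F                        # step size; a fast turn sends a bigger step
    return -step if value & 0x40 else step     # bit 6 set = decrement, clear = increment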
  4. i've begun the documentation: http://midibox.org/dokuwiki/doku.php?id=midibox_live
  5. well, i've completed a lot of functions for use with liveapi. these are just for control; none of the request functions have been written yet. these functions can be called from any midibox application you're writing and the script will respond. i believe there will be enough control available that you won't even have to touch liveapi or python. i've also figured out how to get live to tell the midibox when anything has changed, and the midibox will simply run a function. so, for example, if you send out the command to increment master volume, you won't have to track the volume in your code. when the volume is changed, live will automatically report back what the volume has been set to (all knob/fader functions are increment/decrement so there's no parameter jumping) and you can just update your display. in case anybody's curious, here's what i've got so far (functions to be sent from a midibox). something like setMasterRecord would be sent as setMasterRecord(on). pretty damn easy :). if anybody has some items they think i've missed from master controls or track specific controls (track plugin code hasn't been started yet) let me know. there's a few methods i can't find, but i think i will find them eventually. i can't find how to switch session/arrangement view, for one, but mackie does it so the capability must exist. if anybody is familiar with python and knows how to use decompyle, please let me know.
void setMasterPlay();
void setMasterStop();
void setMasterRecord(unsigned char state);
void setMasterFollow(unsigned char state);
void setMasterOverdub(unsigned char state);
void setMasterBackToArrangement();
void setMasterPunchIn(unsigned char state);
void setMasterPunchOut(unsigned char state);
void setMasterLoop(unsigned char state);
void setMasterSongTime(unsigned char incrementer, unsigned char scale);
void setMasterLoopStart(unsigned char incrementer, unsigned char scale);
void setMasterLoopLength(unsigned char incrementer, unsigned char scale);
void setMasterQuantization(unsigned char incrementer);
void setMasterTempo(unsigned char incrementer, unsigned char scale);
void setMasterSignatureNumerator(unsigned char incrementer);
void setMasterSignatureDenominator(unsigned char incrementer);
void setMasterGrooveAmount(unsigned char incrementer);
void setMasterMetronome(unsigned char state);
void setMasterVolume(unsigned char incrementer, unsigned char scale);
void setMasterPan(unsigned char incrementer, unsigned char scale);
void setMasterCueVolume(unsigned char incrementer, unsigned char scale);
void setMasterCrossfader(unsigned char incrementer, unsigned char scale);
void setMasterReturnAmount(unsigned char incrementer, unsigned char scale, unsigned char returnNumber);
void setMasterSendAmount(unsigned char incrementer, unsigned char scale, unsigned char returnNumber, unsigned char sendNumber);
void setMasterReturnPan(unsigned char incrementer, unsigned char scale, unsigned char returnNumber);
void setMasterReturnMute(unsigned char state, unsigned char returnNumber);
void setMasterReturnSolo(unsigned char state, unsigned char returnNumber);
void setMasterReturnCrossfade(unsigned char state, unsigned char returnNumber);
void setMasterPlayScene(unsigned char scene);
void setMasterStopAllClips();
void setMasterDrawMode(unsigned char state);
void setTrackVolume(unsigned char incrementer, unsigned char scale, unsigned char channel);
void setTrackPan(unsigned char incrementer, unsigned char scale, unsigned char channel);
void setTrackMute(unsigned char state, unsigned char channel);
void setTrackSolo(unsigned char state, unsigned char channel);
void setTrackSendAmount(unsigned char incrementer, unsigned char scale, unsigned char sendNumber, unsigned char channel);
// not implemented: void setTrackCrossfade
void setClipPlay(unsigned char clip, unsigned char channel);
void setClipStop(unsigned char clip, unsigned char channel);
void setClipNudge(unsigned char direction, unsigned char amount, unsigned char clip, unsigned char channel);
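and to give an idea of the other end, here's a minimal python sketch of how the script side might answer a couple of these calls. the message layout and the MASTER_PLAY/MASTER_STOP ids are made up for illustration (i haven't finalized the protocol); only start_playing()/stop_playing() are actual live api calls:

MASTER_PLAY = 0    # hypothetical parameter id for setMasterPlay()
MASTER_STOP = 1    # hypothetical parameter id for setMasterStop()

def receive_midi(self, midi_bytes):
    # assumes this sits inside the control surface script, where self.song() is available
    status, param, value = midi_bytes[:3]
    if param == MASTER_PLAY:
        self.song().start_playing()
    elif param == MASTER_STOP:
        self.song().stop_playing()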
  6. this is a fun one
finish a liveapi script and functions for midibox (actually i was hoping to get it done THIS year)
finish my 9090
get a seq kit from wilba and a gm5x5x5 from nils
do another ultracore run
finish my box-o-trix
maybe start my x0xb0x
build the ultimate ableton live controller
get engaged to chellie and buy a house
get my business going well enough to quit my job (yeah right)
do better in school than i did last semester
learn a lot more about programming
build a core32
buy a dsi evolver
edit: oh, and get to the post office to mail cimo his adapter thingy
  7. i'm not sure what kind of access i'll have to racks yet. it depends on how the api looks at them. i'll know soon enough though (i hope).
  8. hi rigo, i'm not sure if this helps, but check this thread: http://www.midibox.org/forum/index.php/topic,12585.0.html i'm in the process of creating a liveapi script and an include file so you can easily write midibox applications that interface with live. rather than controlling each clip with a different note, you can simply address a clip directly using channel and slot and do whatever you want with it, including putting the name of the clip into the midibox. i'm having problems with it as i type this, but i think i'll get there. ultra
  9. well, i'm going to go ahead and order my lcds when i can. sometimes investing some money is a kick in the butt to get it figured out. of course, i'll probably be bugging the heck out of you guys to figure this out :P.
  10. it works differently than that. i can send any value i want and translate it on the liveapi side. when coding it to set the value, it's not 0-127. when i was sending absolute midi values i had to turn them into a percentage (#/127) because most values in live actually range from 0 to 1 with a resolution of about .000000000001. i still haven't hashed out exactly how i'll control scaling, but i'm not very concerned about it at this moment. i have some of the framework done for an example app where i can send simple functions to live. anybody can use it because you'll be able to send functions like setKnobParameter(track + channel, volume, incrementer); when a knob is turned. this would control the volume of a track based on a channel setting in your app. pretty easy. scaling isn't really in the code yet, but that'll get there when i figure out how to do it. other functions would look like setKnobParameter(master, pan, incrementer) or setButtonParameter(track + channel, mute, pin_value). so easy! one way to look at this is as a single instrument, because of the tight integration. within your own instrument there's a lot of custom code, and it's commented well so others can understand. same with this, it just has code in two places. i think that when i'm done, most users will only have to write code to customize their midibox, and the liveapi script will be complete enough that they won't have to touch it.
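for anybody wondering what the translation looks like on the liveapi side, here's a rough python sketch of scaling a 7-bit midi value onto a live parameter. param.min, param.max and param.value are real properties of live's device parameters; the function name is just for illustration:

def cc_to_param(cc_value, param):
    # map 0..127 onto the parameter's own range (mixer params are usually 0.0 to 1.0)
    param.value = param.min + (cc_value / 127.0) * (param.max - param.min)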
  11. as it turns out, live doesn't respond to sysex via the api. so i'm going with cascading values. i thought about using something already defined (like the nrpn you suggested), but i don't see the point when the only thing this communicates with is a script i'm writing myself. so i'm developing my own protocol based on the midi commands i'm able to send and receive. in a lot of cases, i can do it with three bytes if i define what all the parameters mean. using notes, i can address a LOT of instrument parameters this way:
note on status: addresses the channel and says to increment
note off status: addresses the channel and says to decrement
note number: addresses which instrument to control, plus the scaling
note value: addresses the parameter to control
everything is relative, so i don't need to send any actual values. with two scaling options (normal/slow), i still have access to 64 instruments on that track (way too many) because i can send 0 for instrument 0's normal scale, and 64 for instrument 0's slow scale. also, i'll be able to control 128 parameters per instrument. i've thought pretty hard about this and i think going completely custom is the only way to do it. i understand why the various standards exist but i don't think they apply in this case. also, having more i/o available via gm5 gives me extra ports that aren't assigned to live as a control surface :D.
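to make it concrete, here's a minimal python sketch of unpacking one of these three-byte messages on the script side, following the layout above (the exact bit tests are just how i'm planning to implement it, nothing standard):

def decode_message(status, note, velocity):
    direction = 1 if (status & 0xF0) == 0x90 else -1   # note on = increment, note off = decrement
    channel = status & 0x0F                             # which track
    slow_scale = note >= 64                              # 0-63 = normal scale, 64-127 = slow scale
    instrument = note & 0x3F                             # instrument slot on that track
    parameter = velocity                                 # 0-127 parameter index within the instrument
    return channel, instrument, parameter, direction, slow_scale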
  12. thanks stryd, i'll use the "educational" sysex mfg id. and lookie here, tk went ahead and did the hard work already. :)
  13. any progress on this? i'm interested in using 5 glcds! ultra
  14. no i haven't. there's a $100 minimum order so i'm just going to dive in and order all 5 of them. i also don't know for sure if i can actually connect 5 of these to a midibox (i've heard you can). stryd_one did find a driver for them so i'm ok on that front. the fancy displays are there so i don't have to load up the interface with buttons that access information on the tracks, and i wouldn't use them for anything critical. this is a studio controller and i don't see the point of putting so much information on it that you never have to look at the computer screen. the simplicity of not having to map controls is the biggest reason i want to do this, and also to get complete control over live. for live use, check what i've said in my liveapi progress thread: http://www.midibox.org/forum/index.php/topic,12585.0.html a controller that's powerful enough to play live without looking at the screen is certainly possible. i've already managed to access clip names, and yes, clip state can also be monitored, so you can show the info on screen, use an led, or whatever you want. i have definitely thought about the possibility of having the touchscreen enter an x/y mode, allowing you to drag your finger to control two parameters. for me this is a low priority, but it could be added. i've also been considering how to access the different devices on a track. an encoder to cycle through them is an option; otherwise, the touchscreen can be used as well. i'll look into racks a little bit more, hopefully it'll help me better understand what you're talking about :). using anything like midi yoke, bome's, etc, is not an option. i do not want any kind of intermediary program to be necessary. the communication will all be direct with live.
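since i mentioned monitoring clip state, here's roughly what that looks like in python. clip_slots, has_clip and add_playing_status_listener are real live api names; on_clip_state_changed is just a placeholder for whatever callback updates your leds or glcd:

def watch_clip(self, track_index, slot_index):
    slot = self.song().tracks[track_index].clip_slots[slot_index]
    if slot.has_clip:
        # live will call this whenever the clip starts or stops playing
        slot.clip.add_playing_status_listener(self.on_clip_state_changed)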
  15. after learning a bit about programming for MIOS and liveapi, i decided to ditch my original live control surface idea. it used no feedback of any kind from live. this new one is completely integrated into live as a custom control surface like mackie, but better than mackie in the sense that more information is exchanged, and virtually anything is mapped automatically without even thinking about it. the drawing is of the bankable 4-track section of the controller. i left off the master section because i'm still working out the details on what functions i'd want to be immediately accessible. part of the reason for this thread is to get feedback from live users. i want to know if i need to make more functions immediately accessible.
the six buttons at the bottom-left of the image are dedicated clip/track controls. you can fire/stop a clip, arm for recording, and solo/mute a track. no matter what state the controller is in, these functions will always work. to the right is the volume pot. this is a softpot that uses a 20 segment led just like stribe. stryd_one had a great idea for using this for scaling: if your volume is at 80% and you touch the bottom of the pot to slide upward, the entire softpot controls only the remaining 20% of volume, giving you higher resolution. the sixth button in the group will be a modifier button allowing you to get even higher resolution. this solves the issue of having to use motorfaders, saves space, and offers flexibility with volume control. it might take getting used to, but i think it would work. perhaps the button would instead put it into an "absolute" mode, or vice versa. this set of 6 buttons and the volume fader are automapped to live. you will never have to enter midi map mode to gain control.
the 9 controls above that are six endless rotary encoders (detents removed) and 3 mec illuminated switches. these are "soft" controls that change based on what is going on in the controller. they also work in layers so they become more functional. this is where things start to get interesting. with liveapi, you have the ability to monitor what's going on in a major way. if i drop a plugin onto a track in live, the midibox is notified automatically. from there, all available parameters of that plugin are scanned, and the name and type of each parameter is gathered and sent to the midibox. even 3rd party plugins work. when the data is dumped into the midibox, one of two things happens:
if you've used the plugin before and set up your preferences, it becomes a layer on the track you dropped it into. all control names show up on the lcd (in alignment with the proper controls) for that track and the knobs/buttons are automatically mapped to that plugin. you will not have to enter midi map mode, it's all automatic. when you add more plugins, they become more layers. you can select the layers and gain direct access to whatever kind of instrument or effect you have on that track. if an instrument needs more than the 6 available knobs and 3 switches, it can get more than one layer. all parameters are sent in a relative manner so you don't have to worry about parameter jumping or storing the values; the current values of those knobs will be shown on the glcd.
if you haven't used the plugin before, you have to do a simple setup. the entire control surface temporarily becomes an editor for you to set up your preferences for that plugin. you'll get all available parameter names mapped across the entire board. from there, if things aren't quite right (maybe a button is toggling when it should be momentary), you can edit the control and change it as you see fit. for instance, you could store scaling preferences per control or even edit the name that shows up on the lcd. there will also be the ability to remove a control so it doesn't build up layers you don't need, as well as the ability to choose what order the layers show up in (meaning certain controls can be on the top layer, the next, etc). once this setup has been done, you won't have to do it again. the next time you add that plugin to any track, these preferences are recalled and it just plugs into your track.
i also want to include the ability to add your own custom setups from scratch. i see problems with certain plugins like reaktor that may not map out the parameter names properly. so instead, you will manually assign the cc's, control type, parameter names, etc and map them the old fashioned way. of course once this is done, you can recall it later (perhaps automatically) and only do new mappings if necessary. none of the automapping features will use cc's, so there will be plenty available for programs that don't want to play nice.
as far as track layers go, there are two more things i want to add. if possible, i want to add a clip editor (it would work just like an additional layer). the top layer will be your "favorites": six knobs and three buttons from any instrument on the track can be selected to be on the top layer, no matter what instrument they control. edit: i haven't yet looked into controlling racks, so maybe that's a good alternative.
the selection of what plugin you want to control at the moment will be done via a touchscreen glcd. stryd_one found a good deal and it should cost under $200 for five of these (four tracks + master). the master section will be very much like the track sections, with certain buttons being dedicated and others changing purpose. the touchscreen could also be used to fire scenes in live. i am able to access track names (you'll see the track # and track title on each glcd of the tracks), clip names, and probably scene names as well (haven't tested that part yet), so there are a number of options available to get data into the midibox for whatever purpose you want.
so basically i'm looking for feedback on the design. forget the actual layout and how it looks. am i overlooking anything that needs immediate control? ultra
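for the curious, the parameter scan i'm describing is pretty simple on the liveapi side. here's a rough python sketch; device.parameters and param.name are real live api properties, but the sysex framing here is just a placeholder for whatever format i settle on:

def dump_device_parameters(self, device):
    for index, param in enumerate(device.parameters):
        name_bytes = tuple(ord(c) for c in str(param.name))
        # start byte, parameter index, name characters, end byte
        self.send_midi((240, index) + name_bytes + (247,))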
  16. thanks stryd :) i've been putting quite a bit of time into this. many hours just trying to figure out where to start. but once i figured out how to control one thing, the rest just takes a few more minutes. i'm really not very experienced in coding but i've learned a lot lately.
for getting messages to show on the midibox lcd, it's just simple sysex. from consts.py in APIMidi: WELCOME_SYSEX_MESSAGE = (240,0,72,69,76,76,79,247) i didn't even really know what a sysex message consists of, but i did know that 240 and 247 mark the start and end of sysex information. i went over to www.asciitable.com and looked up the character codes for the rest, and sure enough this message means (START, HELLO, STOP). so i retrieved a track name, turned it into a tuple, and sent it to the midibox. on the MIOS side of things, it was just a simple routine that starts listening when 240 is received and outputs the characters to the screen using the character codes. so yes, now that i have a function to send the sysex and have learned a bit about tuples in python, printing anything to the screen via sysex is really easy.
sorry paul, i have no idea what you mean. can you explain better?
the answer may just be that nobody is interested. but if that's not it, the reason could be that i've yet to talk about how this can be applied to controllers designed for playing live and DJing. it seems that people have been more interested in creating those kinds of devices. i've had the idea of creating a rather simple 2-4 track controller for live use (after i create the studio controller). it would have just what's needed for live use, such as controls for EQ, volume, crossfader, etc. the biggest feature of this would be the use of sparkfun/monome buttons with 2 or 3 color leds behind them. there would also be a big glcd aligned to the sparkfun buttons. the glcd would give you the clip names and the activities going on with them. you'd never have to look at the computer screen. this is entirely possible with liveapi and it's already been done with the lemur.
even if there is no interest in this, i'm going to continue with it and create documentation, example scripts, and example MIOS applications. i'm also going to formulate my own "protocol" using sysex instead of cc's because i'll run out of cc's at some point. with sysex, i can simply assign some kind of identification in the sysex string and parse it out in the ableton live script. then i can control a virtually unlimited number of items in live. also, i'm going to do away with my current code that sends absolute values into live and go with something similar to relative midi. i'll have a lot of control over the resolution by sending out different values and there will be no parameter jumping. lastly, the plan is to always have the midibox slave to whatever is going on in live and save as little data as possible onto the bankstick. that will keep you from having to save the same things twice (on the computer and on the midibox) and will also keep things in sync. you simply turn on the midibox, the midibox asks live what it should do, and it sets itself up accordingly. we'll see how it all goes. these are all just goals and don't mean very much until i've managed to get something done.
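here's about all it takes to build that track name message in python. the 240/247 framing matches the APIMidi example above; everything in between (no manufacturer id, raw ascii codes) is just how i did it for testing:

def track_name_sysex(self, track_index):
    name = str(self.song().tracks[track_index].name)
    # START byte, ascii character codes, STOP byte
    return (240,) + tuple(ord(c) for c in name) + (247,)

then it just gets handed to the script's send_midi call, and the MIOS application prints the characters between 240 and 247 to the lcd.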
  17. i've now figured out a number of things with liveapi: i can control any mixer parameter of any track by simply sending the same data with a different channel - it'll automatically attach to the correct channel no matter what other mappings you've made. i also can control master volume, pan, crossfade, cue volume, etc, all without ever having to map anything. today i figured out how to get data back to the midibox. via sysex, i can print the name of a track on my lcd. i can also query any instrument/effect on the track and display knob names, values, etc, on the midibox. this means i can have automatic access to its controls without having to map them. if i put an eq on the track, i can check for available cc numbers, assign them to the low/mid/hi values, and show the name of what i'm controlling on the midibox's lcd. all without ever having to map anything! with the sysex capabilities, i wonder if this means live can query sysex capable hardware midi instruments and begin taking control of those automatically...
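the per-channel mixer trick is really just indexing into the track list. here's a stripped-down python sketch (tracks, mixer_device and volume are real live api names; mapping the midi channel straight to the track index is my own convention):

def set_track_volume(self, channel, cc_value):
    track = self.song().tracks[channel]                    # the midi channel picks the track
    track.mixer_device.volume.value = cc_value / 127.0     # 7-bit value becomes 0.0 to 1.0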
  18. i've gotten a little bit further with liveapi. i've also been attempting to inspire the liveapi community (http://groups.google.com/group/liveapi) to get more comprehensive documentation going on their wiki (http://www.assembla.com/wiki/show/live-api). there are some pretty friendly and informative people over there and they seem to be happy to help whoever is interested. in fact, it seems the more people are interested in liveapi, the more information they will produce :). i started out by writing a simple how-to for setting up the necessary tools and getting live to recognize your script (http://www.assembla.com/wiki/show/live-api/Getting_started_with_LiveAPI_-_Toolchain_Setup). next, i will post an article on how to control mixer devices in live, and how to monitor the devices and send the data back to your midi controller. that will hopefully happen soon, i just have to get a few questions answered first. in the end, i do see this developing into a nice set of how-to's similar to the "so complete, any idiot can do it" how-to's that stryd makes. so if anybody is interested in working with this, please let me know and we can work together. it's best to keep the details over at the liveapi forum and wiki, but i post about this here because i would think some midiboxers would love to dig into live. ultra
  19. thanks for the link. i think the application would simply output a text file that can be uploaded via midiox. it's probably easiest and python could still be used. this is on hold though until i work my way through using liveapi and build my live controller. have you worked with liveapi at all?
  20. thanks paul, i actually did see those links. i've looked everywhere for info :). so far i've managed to get as far as controlling the volume for any track. when i get farther, i'll post the info if anybody wants it. edit: now i have it muting tracks
  21. stryd_one this is actually something on my list of things to do. i'd like to make a simple GUI patch editor so you don't have to mess with naming on the lcd if you don't want to. i have to learn how to use sysex anyway because the midibox i want to build next will use it. i'll have to also learn python for liveapi so i think the gui patch editor should be written in python. i just bought a python book and it says i can write cross-platform applications, so perhaps it's a good language to use for the patch editor and a good way to start out with it. :)
  22. as some of us know, liveapi is rather poorly documented. i was wondering if anybody here has gotten it to work? all i need is a good starting point so i can run with it. ultra edit: it's not so poorly documented. i wasn't looking in the right places at first, and the documentation is growing. :)
  23. did you very carefully follow this guide? http://midibox.org/dokuwiki/doku.php?id=application_development
  24. hello, i've been working on a midibox lately with the following features:
8 encoders (two rows of 4)
8 switches (two more rows of 4)
4 extra switches (misc functions)
the layout looks like this (O for encoder, X for switch):
X O O O O
X O O O O
X X X X X
X X X X X
these are all situated under a 4x40 lcd (right now i have two 2x40's for testing purposes).
how it works: the application lets you assign names to each general purpose encoder and switch so you can see on the lcd what function you are controlling. for example, you could assign "osc1pitch" or "cutoff" to a knob. the name of each control then shows on the lcd in 4 rows of 4 controls. you get up to 9 characters per control. this setup of 16 control names can then be saved into a "patch" you can name, and you can freely move between patches to control your various instruments. the midi channel that controls the entire patch is also saved. the initial setup of a patch can be a little tedious, but you only have to do it once per control and it's all saved to the bankstick. each encoder saves settings for cc# and whether you want to send absolute or relative midi data. each switch saves settings for cc# and whether you want a toggle or momentary control. also, the two values sent by switches are freely assignable. to round things out i've added a midi learn function you can activate when setting the cc# for each control (not really tested yet).
patch management is pretty easy. you can rename a patch once it's created and insert/delete between patches. there's no alphabetical order or any kind of sorting. if you want one patch between two others, simply press "insert" while the first patch is active.
this is my first c application (before this i programmed in vb) besides the basic stuff they taught me in school. i received a lot of help from nils and stryd (thanks). i will release the code shortly, but before i do that i'd like to get it into a completed box so it's a lot easier to test. my idea for the layout of this box is that it could host other kinds of applications, given the generic layout and lcd space. so i wrote this program with the idea that i will add more programs later on (i.e. it'll format the bankstick when you add new software without overwriting your previous patches). btw this thing will fit into a PT-8.