About ultra

  • Rank
    MIDIbox Guru
  1. I'm looking for a couple of midiboxers who use Ableton Live and have an iPad.  Currently, all my app does is trigger clips and perform a few other functions (mute, solo, arm, loop clips) over network MIDI (using an excellent network MIDI library maintained by our own Kurt Arnlund).  But I'm adding more features, such as automatic hardware remapping.  It'll work somewhat like the APC-40, except you can build your own midiboxes (or use controllers you've bought) and automap them in several ways, using the app as the interface.  You also won't be limited to a single controller: any combination of controllers can be mapped to different functions and changed on the fly.

    For example, you could have a controller with 8 volume faders that remaps along with the clip grid to control volume for whichever group of 8 tracks is selected at the time.  The knobs on that controller could remap to macros on each track.  Another controller could become whatever device you have selected at the time, or act as an aggregate of your favorite parameters from different tracks that you "page" through to remap, or perhaps you could drag a parameter name from a list onto a knob and remap it on the fly.

    The app isn't designed like LiveControl, where your main focus is on the app.  Instead, it's designed for people who are busy with their hardware.  You can trigger clips without taking your eyes off Live: touch the iPad, drag to your clip (an on-screen preview shows which clip you'll trigger), and release to fire the clip.  You can also navigate around the clip grid by dragging or swiping.  Currently, it's not much different from other clip trigger apps on the App Store, but I think the hardware integration might set it apart.

    I can't offer any payment besides a free copy, but your ideas will definitely be heard, and it's an opportunity to help make this app something you'd enjoy using, especially if you're into building hardware and want a way to extend its uses.

    For anybody just interested in the existing version of the app, please find it through my website at http://www.sub-version.net (sorry for the shameless plug).  For anybody interested in testing, please ask via private message.  Any discussion of the app, just post here.

    Thanks!

    ultra  (a busy guy who can't wait to get back to midiboxing)
  2. XCode does not see environment variables

    I'll try it later tonight and let you know what happens. The .hex itself is what you upload to the core. ultra
  3. XCode does not see environment variables

    I'm kinda new to Mac and I haven't used xcode as anything but a basic editor. I have programmer's status, but my understanding of the toolchain isn't the greatest, so I just follow the wiki instructions. ultra
  4. XCode does not see environment variables

    plz do! i'm using xcode and make -s from the terminal. i'd love to see my errors highlighted among all those warnings that i ignore for now and will get to fixing later. :P ultra
  5. still working on "midibox live" after 4 years. i'd have more done if i wasn't working so much. learning a lot more about c.

    1. ultra


      i do, however, have quite a bit done. the pieces are in place. :)

  6. New LIVID ui board

    i've looked at their buttons before for one of my projects, but the button/spacer layouts aren't right for me. it's an ableton live controller, so i thought soft buttons would be cool, but i always return to the tact switch and led configuration. it's cheap and easy and works fine. i never noticed that the brain is $189. seems expensive compared to the price of an LPC17.
  7. graduating college this week. my degree is 'electronics and computer engineering', and i wouldn't have taken interest in it if it wasn't for midibox.

    1. Hawkeye


      Congrats, man :) Sounds great!

    2. Lamouette


      congrats ! really nice !

    3. albpower2seq4sid


      Congratulations!! Good luck to find a great job!!

  8. that awkward moment when i realize I've coded a debug msg into something called by app_background

    1. jojjelito


      Haha, that can get cute fast...

  9. the best way to figure out how to do something you don't know how to do is to spend money on it.

    1. nILS


      ...not true for women.

    2. Hawkeye


      but you may get a little bit further this way :)

  10. IMG 20120418 092459

    From the album MIDIbox Live

    the driver is not yet complete, but working well so far. i wrote a font generating program in python which outputs a header file that can plug right into existing code, making fonts easily selectable. it's not done yet because i want to add the ability to edit existing font files. lots of thanks to hawkeye and nils for helping me get this far!
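    The generated header itself isn't shown in the post; as a rough illustration, the kind of C header such a Python generator might emit could look like this (all names, field meanings, and glyph data here are hypothetical, not taken from the actual project):

    ```c
    #include <stdint.h>
    #include <stdio.h>
    #include <assert.h>

    /* Hypothetical output of a Python font generator: one entry per
       character, with the glyph stored column-major, one byte per
       column, LSB = top pixel. Only two glyphs shown for brevity. */
    typedef struct {
        char    c;          /* ASCII character this glyph renders */
        uint8_t width;      /* columns actually used by the glyph */
        uint8_t bitmap[8];  /* raw column data from the generator */
    } font_char_t;

    static const font_char_t font_default[] = {
        { 'A', 5, { 0x7E, 0x11, 0x11, 0x11, 0x7E, 0x00, 0x00, 0x00 } },
        { 'B', 5, { 0x7F, 0x49, 0x49, 0x49, 0x36, 0x00, 0x00, 0x00 } },
    };

    /* Look up a glyph; returns NULL if the character isn't in the font. */
    static const font_char_t *font_get(char c) {
        for (size_t i = 0; i < sizeof font_default / sizeof font_default[0]; ++i)
            if (font_default[i].c == c)
                return &font_default[i];
        return NULL;
    }

    int main(void) {
        const font_char_t *g = font_get('A');
        assert(g != NULL && g->width == 5);
        printf("'A': width=%u, first column=0x%02X\n", g->width, g->bitmap[0]);
        return 0;
    }
    ```

    Because the table is plain `const` data, swapping fonts is just a matter of pointing the driver at a different generated array, which is presumably what "easily selectable" refers to.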
  11. people i know personally always ask me "how do you know all this stuff about computers and electronics?" when i come to this forum, i ask myself the same thing about you guys.

    1. technobreath


      Yes, it's amazing. But there's a lot of shit out there too. Luckily this forum has about as close as u get to zero of that crap.

    2. Hawkeye


      could not agree more with all of you :)

    3. Shuriken


      Hear, Hear

  12. i'm not positive, but your message might be an upload request. did you check your contrast?
  13. the wheels are turning!

    1. jojjelito


      Don't make them stop while the going is good :D

    2. Hawkeye


      once it's rolling, keep the speed, but brake ahead of the hard corners :D

    3. technobreath


      Hehe, you guys are full of words of wisdom :D

  14. so does gfx_1322 call into font.c to get the bitmap data? here's where i'm confused: i want to write the text "midibox" to the display, starting at x/y coordinates 16/32 with a shade of 8. i'd assume that the function would have to be specific to this display, since others might not have shades, or might even be rgb and then take a color argument.

    so would my function to write text actually call into gfx_1322, like this? print_text("midibox", 16, 32, 8); then gfx_1322 loops through the characters, passing each one into font.c and getting back the bitmap data for that character, which it uses to generate the 1322-specific code that writes pixel/shade data to the buffer?

    if this is correct, then font.c is doing nothing more than translating the hex codes into 1/0 data and returning it to gfx_1322.c for buffering and output.
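    The flow described in the question can be sketched roughly like this. All names besides `print_text` are hypothetical stand-ins, and the stub font only knows one glyph; the real MIOS32 / gfx_1322 code may be organized quite differently:

    ```c
    #include <stdint.h>

    #define FONT_W 8
    #define FONT_H 8

    /* --- font.c side (hypothetical): translate a character into 1/0 data --- */
    static const uint8_t font_bitmap_A[FONT_W] = {0x7E,0x11,0x11,0x11,0x7E,0,0,0};

    /* Return one column of the glyph as 8 on/off bits (stub: always 'A'). */
    static uint8_t font_get_column(char c, uint8_t col) {
        (void)c;                 /* a real font table would index by c */
        return font_bitmap_A[col];
    }

    /* --- gfx_1322 side (hypothetical): display-specific buffer writes --- */
    static uint8_t framebuffer[64][128];   /* [y][x], 4-bit shade per pixel */

    static void gfx_set_pixel(uint16_t x, uint16_t y, uint8_t shade) {
        framebuffer[y][x] = shade;         /* no bounds check in this sketch */
    }

    /* Expand each on/off bit of the glyph into a shaded or blank pixel. */
    static void gfx_draw_char(char c, uint16_t x, uint16_t y, uint8_t shade) {
        for (uint8_t col = 0; col < FONT_W; ++col) {
            uint8_t bits = font_get_column(c, col);
            for (uint8_t row = 0; row < FONT_H; ++row)
                gfx_set_pixel(x + col, y + row, (bits >> row) & 1 ? shade : 0);
        }
    }

    /* Display-specific text entry point, as in the question. */
    void print_text(const char *s, uint16_t x, uint16_t y, uint8_t shade) {
        for (; *s; ++s, x += FONT_W)
            gfx_draw_char(*s, x, y, shade);
    }

    int main(void) {
        print_text("midibox", 16, 32, 8);
        return 0;
    }
    ```

    This matches the division of labor the question proposes: font.c only turns hex codes into 1/0 data, while the display-specific layer decides how a "1" becomes a shade (or, on another display, a color).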
  15. i have this working and i'm able to access .bitmap[x], .left_shift, and .width from my mios32 app. i'm assuming the next step is to make a font.c that handles drawing the font from the hex values?

    should this somehow implement the standard mios32 display functions? i'm not sure that would work, because we can add parameters such as shade, position, etc., for each character or whatever we want. i also plan on making some general-use meters and perhaps some other graphics. so should the standard mios32 display functions be ignored, and should i write my own display printing code from scratch? ultra
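    The structure being accessed above isn't shown in the post. A minimal sketch of what such an entry and a simple consumer might look like, assuming the field meanings (the struct name, the glyph data, and `text_width` are invented for illustration):

    ```c
    #include <stdint.h>
    #include <assert.h>

    /* Hypothetical per-glyph entry, matching the fields named in the post. */
    typedef struct {
        const uint8_t *bitmap;     /* raw glyph data from the generated header */
        uint8_t        left_shift; /* columns to skip on the glyph's left edge */
        uint8_t        width;      /* visible width of the glyph in pixels     */
    } font_entry_t;

    static const uint8_t glyph_m[8] = {0x7C, 0x04, 0x78, 0x04, 0x78, 0, 0, 0};
    static const font_entry_t font_m = { glyph_m, 1, 5 };

    /* Example consumer: pixel width of n copies of a glyph with 1px spacing. */
    static unsigned text_width(const font_entry_t *g, unsigned n) {
        return n ? n * (g->width + 1) - 1 : 0;
    }

    int main(void) {
        /* 7 glyphs of width 5 plus 6 spacing columns = 41 px */
        assert(text_width(&font_m, 7) == 41);
        assert(font_m.bitmap[0] == 0x7C);
        return 0;
    }
    ```

    Keeping a custom drawing layer like this separate from the standard MIOS32 text functions would leave room for the extra per-character parameters (shade, position) the post mentions, without being constrained by the fixed character-LCD interface.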