K so here's an idea. The idea of music visualization is really interesting, but even the best programs don't really do a good job of it. iTunes's new visualizer is beautiful, but the connection to the music is superficial. I think this will probably always be the case for any attempt to map something from auditory dimensions to visual ones. The music was created first, and it's bound by its own domain's constraints.
But what about creating new music with the constraints of the two domains integrated from the start? That way the composition interacts with the constraints of both domains, and with itself through those constraints, to make something that makes sense and appears to have meaning.
How to do this? Imagine you've got a sort of physics simulation where you can make heavenly bodies interact. You have a palette of objects with certain characteristics; the characteristics are represented both visually and with sound. As they interact, they alter each other's characteristics, and you can see and hear the change.
Still with me? Let's imagine specifics. To start, you could have a bass line. A steady thump could be represented by a sphere in the middle of empty space; its diameter would change with its amplitude. If it's at rest, it will continue to thump away. But if it's moving in the three-dimensional space, its X, Y, and Z coordinates could control some aspect of the sound it makes; maybe its pitch, vibrato, and volume. Introducing other objects would make these characteristics interact with each other in ways determined by the objects' direction, mass, energy... all sorts of notions borrowed from physics.
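To make the mapping concrete, here's a tiny sketch of what one such rule could look like. Everything here is made up for illustration (the normalized coordinate ranges, the specific pitch/vibrato/volume assignments, the `Body` class itself); it's just one of many mappings you could choose:

```python
from dataclasses import dataclass

@dataclass
class Body:
    """A sounding object in the simulation (hypothetical sketch)."""
    x: float          # position on each axis, assumed normalized to [0, 1]
    y: float
    z: float
    amplitude: float  # drives the sphere's diameter / the thump's loudness

def to_sound(body: Body) -> dict:
    """Map spatial coordinates to sound parameters.

    Assumed mapping (one arbitrary choice among many):
      x -> pitch as a MIDI note number (36..84, roughly C2..C6)
      y -> vibrato depth in semitones
      z -> volume as MIDI velocity (0..127)
    """
    return {
        "pitch": 36 + round(body.x * 48),   # MIDI note number
        "vibrato": round(body.y, 3),        # semitones of modulation
        "volume": round(body.z * 127),      # MIDI velocity
    }

# A body resting at the center of the space thumps away at middle C:
center = Body(x=0.5, y=0.5, z=0.5, amplitude=1.0)
print(to_sound(center))  # -> {'pitch': 60, 'vibrato': 0.5, 'volume': 64}
```

The nice part of keeping the mapping in one function like this is that you can swap rules without touching the physics at all.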
This would be a fun thing to make, even if the music that resulted wasn't very good. I wish I had the skills! I wonder what it would take? If there are existing physics simulation programs in something like MATLAB, maybe you could map the variables to MIDI output?
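The MIDI half of that is simpler than it sounds: a note-on event is just three bytes (status `0x90` plus the channel, then the note number and velocity, each 7 bits). A sketch of turning a mapped pitch and volume into a raw MIDI message, assuming the values from some simulation step:

```python
def note_on(note: int, velocity: int, channel: int = 0) -> bytes:
    """Build a raw MIDI note-on message.

    MIDI note-on is three bytes: status (0x90 | channel),
    note number (0..127), velocity (0..127).
    """
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

# Middle C at medium velocity, as produced by some hypothetical mapping step:
msg = note_on(60, 64)
print(msg.hex())  # -> "903c40"
```

Getting these bytes to an actual synth would take a MIDI library or port on top of this, but the mapping itself is just arithmetic, which is encouraging.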
Has anyone done anything like this before?