Here's a novel idea for someone to steal and make a bundle on: Imagine an augmented-reality system for visualizing music as it's performed live.
-Assume that each member of the audience has glasses or contact lenses that can overlay imagery on the real scene (a technology that will be viable in maybe five years).
-Plug the output of each musician's instrument into a "visualizer," some program that makes a pretty graphical representation of the sound their instrument makes.
-Set up the stage and the instruments with transmitters such that each instrument can be located in three-dimensional space in real time.
-Send the musical visualization data to the audience's glasses and overlay it on the scene, such that it looks like the music is pouring out of the instruments.
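The "visualizer" step is the most concrete part of this, so here's a minimal sketch of what one stage of it might look like. This is purely illustrative: the function name, the parameter mapping, and the choice of loudness and a crude pitch estimate as inputs are all my own assumptions, not part of any real system.

```python
import math

def visualize_chunk(samples, sample_rate=44100):
    """Map one chunk of audio samples (floats in [-1, 1]) to a pair of
    hypothetical visual parameters: brightness driven by loudness (RMS),
    and hue driven by a rough pitch estimate (zero-crossing rate)."""
    n = len(samples)
    rms = math.sqrt(sum(s * s for s in samples) / n)
    # Count sign changes; a pure tone of frequency f crosses zero 2f times/sec.
    crossings = sum(
        1 for a, b in zip(samples, samples[1:]) if (a < 0) != (b < 0)
    )
    est_freq = crossings * sample_rate / (2 * n)  # very crude pitch estimate
    brightness = min(1.0, rms * 4)     # louder -> brighter
    hue = min(1.0, est_freq / 2000.0)  # higher pitch -> shift along the palette
    return {"brightness": brightness, "hue": hue}

# One 1024-sample chunk of a 440 Hz tone at half amplitude:
chunk = [0.5 * math.sin(2 * math.pi * 440 * t / 44100) for t in range(1024)]
params = visualize_chunk(chunk)
```

A real version would run per instrument, many times a second, and feed the resulting parameters (plus the instrument's tracked 3D position) to the rendering in the audience's glasses.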
What exactly the music "looks" like can be totally up to the musicians. Imagine the array of expression that they could create with a simple set of tools that uses templates and varies simple parameters! I can imagine drum hits looking like bright sparks, rhythm guitar looking like horizontal waves, and the lead looking like a spiraling, twisting tower of light. The graphic effects might warrant adding a member to a band to dedicate creativity to that aspect of the show.
In my head, I've been thinking of using this with the sort of standard, three-guitars-and-a-drum-set rock band, but I can imagine very bold artists applying the idea to classical music. The visual imagery could be much richer there, but I almost hesitate to suggest it because it seems like it's trespassing on someone else's work. Maybe that could be a new dimension for contemporary composers to explore.