Blue Sky Catastrophe, for piano, live computer processing, and 8-channel sound, is based on performer-driven generative musical processes. Musical phrases are algorithmically generated live by the computer, and the performer is asked to sight-read them. The computer then assesses the performance and generates the next phrase of music for the pianist, making it more or less difficult depending on how the pianist played. The pianist is asked to read the music accurately while also being given latitude to influence the computer's musical decisions and pace. The score of the work is therefore entirely performer-driven and different in every performance. The computer is set up to react not only to accuracy but also to aspects of the playing that control live-generated electronics such as harmonic attractors and spatialization. The performer and computer are engaged in a feedback loop that explores degrees of stability, periodicity, non-periodicity, micro-tuning, and quirky, chaotic potential.
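The note does not document the actual generation or assessment algorithms, but the adaptive feedback loop it describes could be sketched roughly as follows. This is a minimal illustration only: the function names, the accuracy metric, the difficulty scale, and the 0.8 threshold are all hypothetical, not details of the piece.

```python
import random

def generate_phrase(difficulty):
    # Hypothetical generator: phrase length and pitch range grow with difficulty.
    length = 4 + difficulty * 2
    return [random.randint(60 - difficulty, 60 + difficulty * 4)
            for _ in range(length)]

def assess_accuracy(target, played):
    # Illustrative metric: fraction of notes matched position-by-position.
    matches = sum(1 for t, p in zip(target, played) if t == p)
    return matches / len(target)

def next_difficulty(difficulty, accuracy, threshold=0.8):
    # Accurate sight-reading raises the difficulty of the next phrase;
    # inaccurate reading lowers it, within assumed bounds of 1 to 10.
    if accuracy >= threshold:
        return min(difficulty + 1, 10)
    return max(difficulty - 1, 1)

# One turn of the loop: generate a phrase, compare it with what was
# played, and choose the difficulty of the next phrase.
phrase = generate_phrase(3)
accuracy = assess_accuracy(phrase, phrase)  # a perfect reading
difficulty = next_difficulty(3, accuracy)
```

In the piece itself this loop would run continuously, with the pianist's actual playing (and expressive qualities beyond note accuracy) feeding back into both the score generation and the live electronics.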