_ ENTROPY CHANT _ / 2012 (12 min)
for flute, bass clarinet, trumpet, amplified piano,
electric guitar, violin, cello, double bass
for ensemble MODELO62, Den Haag /
conductor: Ezequiel Menalled
/ www.MODELO62.com /
audio / mp3
ENTROPY CHANT is a piece of
sounding silence. The work investigates the narrative properties of
silence and the visual gestures of 'the performer on stage' within the
context of a musical piece: the 'silence within the sound' and the
'sound within the silence'. It begins with, and is carried by, the
energy of the internal mechanism of 'the intention' and of sound
production over time, gradually shaping the liquid, self-changing form
of the piece: a non-static sonic sculpture.
The breathing dynamic
gestures, the transforming melodic motives and textures, the echoing
ghost-sounds of appearing and disappearing shadow overtones, the
continuously created new points of departure, the shifted centers of
gravity, the melted perception of time: all of this invites the
listener into a weary, androgynously fragile, frozen soundscape that
moves toward or forward in time, yet never arrives.
The term 'entropy'
(from Greek εντροπία
[entropía], 'a turning toward'; εν [en-], 'in';
τροπή [tropē], 'turn, conversion') is used
indirectly here, centered on inertia and on the amount of momentarily
available working energy in the musical context.
In information theory:
/ In simple terms, entropy in information theory is
a measure of unpredictability: a measure of uncertainty about the
information content expected in a message. Meyer (1956)
postulated that meaning in music is directly related to entropy:
that high entropy (uncertainty) engenders greater subjective tension,
which correlates with more meaningful musical events.
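In Shannon's standard formulation, the entropy of a source that emits symbols x_i with probabilities p(x_i) is

H(X) = -\sum_i p(x_i) \log_2 p(x_i)

measured in bits: it is zero when one symbol is certain, and maximal when all symbols are equally likely, i.e. when the message is least predictable.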
In
physics:/ Entropy is not a qualitative but a
quantitative thing. When physicists talk about entropy, they talk about
changes in entropy, not about something absolute. It is a macroscopic
measure that describes a system consisting of many
small (microscopic) elements. Entropy is the only quantity in the
physical sciences that "picks" a particular direction for time,
sometimes called the 'arrow of time'. As one goes "forward" in time, the
second law of thermodynamics says, the entropy of an isolated system
will increase.
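Stated compactly, the second law says that for an isolated system

\Delta S \geq 0

with equality only for reversible processes; every irreversible process leaves the system's entropy strictly higher than before.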
In a closed system,
these small elements are initially all in different states and interact
with each other, producing useful work. Entropy describes the potential
of such a system to produce 'useful work'. By producing useful work,
the system changes its internal state so that differences in its
internal organization are leveled out. In the end, you are left with a
settled system in which there is no difference between the internal
states of its individual elements, so they can no longer produce any
useful work by interacting with each other. This state is called
equilibrium, and in this state the system reaches its maximum entropy.
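This drift toward equilibrium can be made concrete with a minimal simulation sketch, here an Ehrenfest-style urn model chosen purely for illustration (the model, names, and parameters below are assumptions of the example, not something specified by the piece): N particles start in the left half of a box, and at each step one randomly chosen particle hops to the other half. The mixing entropy climbs from zero toward its maximum as the two halves approach equal occupation, i.e. equilibrium.

import math
import random

# Illustrative sketch only: an Ehrenfest-style two-box mixing model.
N = 100      # total number of particles (arbitrary choice)
left = N     # all particles start in the left half: a highly ordered state

def mixing_entropy(n_left, n_total):
    # Shannon entropy (in bits) of the left/right occupation fractions.
    p = n_left / n_total
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

for step in range(1001):
    if step % 200 == 0:
        print(f"step {step:4d}  left = {left:3d}  S = {mixing_entropy(left, N):.3f} bits")
    # Pick one particle uniformly at random and move it to the other half.
    if random.randrange(N) < left:
        left -= 1
    else:
        left += 1

On a typical run the printed entropy rises quickly toward its maximum of 1 bit and then fluctuates near it; for large N these fluctuations become negligible, which is the statistical meaning of 'maximum entropy at equilibrium'.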