Sunday, December 30, 2007

resignation

. . . . the stamp of maturity?

Saturday, December 29, 2007

whist

Whist is, without question, the best of all our domestic games. The only other one which could lay claim to such a distinction is Chess; but this has the disadvantage of containing no element of chance in its composition—which renders it too severe a mental labour, and disqualifies it from being considered a game, in the proper sense of the word. Whist, on the contrary, while it is equal to chess in its demands on the intellect and skill of the player, involves so much chance as to give relief to the mental energies, and thus to promote, as every good game should, the amusement and relaxation of those engaged.

from William Pole, F.R.S., The Theory of the Modern Scientific Game of Whist, 1883

Friday, December 28, 2007

bhutto

We live in an age of disinformation, confusion, and mystery. Even within a single report, we see contradiction:

apparently, "there were no bullet marks on Bhutto's body"; nevertheless, "she was shot" . . . .

. . . . ?

Tuesday, December 25, 2007

sad clown christmas

hypocrisy

. . . on the mind each holiday season, when many set foot inside a church for the first time in 364 days. Neal Stephenson argues that the elevation of hypocrisy from minor foible to cardinal sin is a product of the rampant relativism of the 21st century. In an age where there are no moral absolutes, the only standpoint from which one can criticize another is one's own, i.e. only hypocrisy (failing by one's own lights, moral self-contradiction) remains an absolute sin.

Even if this observation is accurate, however, the question remains: pragmatically, is hypocrisy useful to society / humanity or not? Perhaps, more generally: can cognitive dissonance be an effective means to constructive ends? Is this an empirical question?

Monday, December 17, 2007

understanding the mind VII


from Jean-Pierre Changeux, The Physiology of Truth, 2002

Friday, December 14, 2007

an alternate perspective

We should contrast our reluctance to accept the accidents of history as arbiter of obscenity with the more thoroughgoing stubbornness characteristic of full-fledged Saxonism:
Saxonism is a name for the attempt to raise the proportion borne by the originally & etymologically English words in our speech to those that come from alien sources. The Saxonist forms new derivatives from English words to displace established words of similar meaning but Latin descent; revives obsolete or archaic English words for the same purpose; allows the genealogy of words to decide for him which is the better of two synonyms. . . . The truth is perhaps that conscious deliberate Saxonism is folly, that the choice or rejection of particular words should depend not on their descent but on considerations of expressiveness, intelligibility, brevity, euphony, or ease of handling, & yet that any writer who becomes aware that the Saxon or native English element in what he writes is small will do well to take the fact as a danger-signal. But the way to act on that signal is not to translate his Romance words into Saxon ones; it is to avoid abstract & roundabout & bookish phrasing whenever the nature of the thing to be said does not require it.

H. W. Fowler, A Dictionary of Modern English Usage, 1926.
There was a minor craze for [Saxon words] early in this century, giving us all manner of quaint pseudo-archaisms like skysill for horizon, but it has passed, and with it any notion of a special virtue inherent in 'native' roots. It remains broadly true that, as compared with derivatives of Latin, a decent proportion of Saxonisms in the vocabulary is a sign of a good writer, but the reader should never be allowed to suspect that this is the result of any conscious policy of choice on the writer's part. What 'a decent proportion' amounts to cannot be defined, and it seems easier and safer to approach the problem from the other end and work on the principle that a preponderance of classically derived words in what one writes, especially words denoting abstract qualities or things, especially polysyllables, especially those ending in -tion or -sion, is a bad sign. That is Rule 1.

Rule 2 annoyingly goes back a little way and says, Never choose to write one word rather than another on the sole ground that it has an Old or Middle English pedigree and its competitor comes from a Latin, French, anyway non-English root. In particular, never choose an English-descended word like forebear when a foreign one like ancestor seems more familiar and natural.


Kingsley Amis, The King's English: A Guide to Modern Usage, 1998.

So, "avoid abstract & roundabout & bookish," but prefer "familiar and natural" terms. Yet what of questions of propriety and obscenity? These seem wholly orthogonal to matters of clarity and elegance of expression.

Tuesday, December 11, 2007

probability and public policy VI: "priors" and the fair coin revisited

[part one of this series here]

A very real problem in public policy decisions is the role of the "prior," or the probability assignment before one has been presented with any evidence. Two politicians with very different priors, when presented with the same evidence, will come to very different conclusions. The beauty of subjective Bayes is that it gives us an analysis of how this phenomenon is a byproduct of rationality.

Consider, for example, the debacle over WMDs in Iraq. Many critics attribute irrationality to the policy makers who judged the probability that Iraq possessed WMDs to be high enough to justify an invasion. A disadvantage of this approach is that it rules out the prospect of dealing strategically with these policy makers (via debates, speeches, compromises, etc.), since all theories of strategic interaction presume the rationality of one's opponent. The subjective Bayes approach allows us to characterize the conclusions of these policy makers as rational given an appropriate assignment of priors.

Before discussing more realistic scenarios, let's examine a toy example. A coin has been tossed 4 times, with outcomes THTT (tails, heads, tails, tails). Now, consider 3 politicians:

Politician Q is a frequentist

Politician R is a Bayesian who believes strongly in hypothesis A, namely that P(H) = 1/2

Politician S is a Bayesian who believes strongly in hypothesis B, namely that P(H) = 1/100


When presented with the same data set, Q, R, and S will each come to different conclusions, respectively:

Politician Q will believe hypothesis C, namely that P(H) = 1/4

Politician R will continue to believe (at roughly the same strength) hypothesis A, namely that P(H) = 1/2

Politician S will continue to believe (at greatly reduced, but still more than 50% strength) hypothesis B, namely that P(H) = 1/100


[For the calculations and relevant simplifying assumptions, please see the appendix.]

Many simplifying assumptions were made here, but the essential point still stands: given suitably strong priors and suitably ambiguous evidence, rational policy makers can disagree.

What of our frequentist here? In this example, perhaps, he seems better off. However, we should not forget the conceptual problems associated with frequentism, especially the problem of one time probabilities. For example, consider a situation like "climate change": the prospects for running the relevant "experiment" repeatedly (letting industrial society evolve on earth an infinite number of times?) are nil, yet the need for some kind of conclusion is unavoidable. Returning to the case of WMDs, there is a similar situation: the relevant evidence does not allow for a "reading off" of the probability in as clear a manner as successive coin tosses.

Obviously, all relevant positions have been greatly simplified. The essential point to make here is that neither "science" nor "rationality" dictates the correct policy responses in the face of uncertainty. Furthermore, failure to acknowledge this point weakens one's position in the ensuing debate, as one is left unable to advocate strategically for one's own position (since one cannot model one's opponent as rational).

We turn next to some more realistic policy issues and the specific complications which arise in dealing with the relevant probabilities.

next: cancer

probability and public policy VI (appendix)

We present here the calculations relevant to the above argument.

Politician Q is a frequentist, so he simply reads the probability of heads directly off the data: 1/4.

Politicians R and S are Bayesians, but in order to simplify the problem, we need to distinguish their hypotheses about the underlying probability from their degree of confidence in those hypotheses. Let us say that each politician judges the probability of their favorite hypothesis to be 9/10. Call politician R's hypothesis (that the coin is fair) A and politician S's hypothesis (that the coin is heavily biased towards Tails) B. Furthermore, we simplify by assuming that A and B are mutually exclusive and jointly exhaustive, i.e. there are no other possible hypotheses under consideration for either politician (we revisit this assumption in the sequel).

A = hypothesis that P(H) = 1/2 (i.e. that the coin is fair)
B = hypothesis that P(H) = 1/100 (i.e. that the coin is biased strongly against H)
D = THTT (our data)

Then, using subscripts R and S for the beliefs of the respective politicians, we set

P_R(A) = 9/10, so
P_R(B) = 1/10, and
P_S(B) = 9/10, so
P_S(A) = 1/10

It is important to note here that the order of heads and tails is completely irrelevant; this is what allows us to apply the binomial distribution to calculate P(D|A) and P(D|B) (and these values are the same for both politicians).

[apologies: due to the apparent incompatibility of blogger with html math tags, I will use n{choose}k as shorthand for the standard notation]

Binomial distribution as a function of n = number of trials, k = number of positives (in this case Heads), p = P(H):

f(k; n, p) = (n{choose}k) p^k (1-p)^(n-k)

Where,
(n {choose} k) = n! / (k!(n-k)!)

so, for us,

(4 {choose} 1) = 4! / (1!(3)!) = 4

which gives, writing p_X for the value of P(H) under hypothesis X ∈ {A, B}:
P(D|X) = 4 p_X^1 (1-p_X)^3 = 4 p_X (1-p_X)^3

Therefore:

P(D|A) = 4(1/2)(1/2)^3 = 1/4

and

P(D|B) = 4(1/100)(99/100)^3 = 4(1/100)(970,299/1,000,000) = 3,881,196/100,000,000 ≈ (3.8×10^6)/(10^8) ≈ 4/100

In order to apply Bayes Rule, we also need the value of P(D). Unlike the conditional probabilities just calculated, P(D) will differ for each politician's probability distribution.

P_R(D) = P_R(A)P_R(D|A) + P_R(B)P_R(D|B) = (9/10)(1/4) + (1/10)(4/100) = (9/40) + (4/1000) ≈ 1/4

P_S(D) = P_S(A)P_S(D|A) + P_S(B)P_S(D|B) = (1/10)(1/4) + (9/10)(4/100) = (1/40) + (36/1000) = (25/1000) + (36/1000) ≈ 6/100

[excuse the rough rounding, but it won't change the qualitative result even if it introduces small inaccuracies]

Now we can simply apply Bayes' Rule to determine how strongly the politicians R and S will believe their respective favorite theories after presentation with the evidence D:

P_R(A|D) = P_R(D|A)P_R(A)/P_R(D) ≈ (1/4)(9/10)/(1/4) = 9/10

P_S(B|D) = P_S(D|B)P_S(B)/P_S(D) ≈ (4/100)(9/10)/(6/100) = (2/3)(9/10) = 6/10

It should be clear that we can increase politician S's certainty in his original conclusion in the face of this evidence by increasing his initial degree of belief in that conclusion. Note that even if we include more than 2 possible hypotheses, this conclusion still holds (as the additional hypotheses will be weighted so weakly for each agent as to be barely affected by such a small amount of evidence). The point here is just (to reiterate) that given sparse or ambiguous evidence and sufficiently strong priors, it is quite possible for rational agents to disagree dramatically about the pertinent conclusion to draw from the data.
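
The whole appendix can be checked with a short Python sketch. This is just the arithmetic above, using exact values rather than the rough rounding, so the posteriors come out slightly sharper than the figures quoted:

```python
from math import comb

# Data: 4 tosses, 1 head (THTT); order is irrelevant to the likelihood.
n, k = 4, 1

def likelihood(p_heads):
    """Binomial probability of the data given that P(H) = p_heads."""
    return comb(n, k) * p_heads**k * (1 - p_heads)**(n - k)

# Hypothesis A: P(H) = 1/2; hypothesis B: P(H) = 1/100.
p_D_given_A = likelihood(1/2)    # 0.25
p_D_given_B = likelihood(1/100)  # ~0.0388

def posterior(prior_A, prior_B):
    """Return (P(A|D), P(B|D)) for a politician with the given priors."""
    p_D = prior_A * p_D_given_A + prior_B * p_D_given_B
    return prior_A * p_D_given_A / p_D, prior_B * p_D_given_B / p_D

print("R:", posterior(prior_A=9/10, prior_B=1/10))  # P_R(A|D) ~ 0.98 (the rough rounding above gives 9/10)
print("S:", posterior(prior_A=1/10, prior_B=9/10))  # P_S(B|D) ~ 0.58 (~ 6/10, as above)
```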

"it's really not a game, dog"


. . . wild as the Taliban, 9 in my right, 45 in my other hand . . .
~ T.I.

I put soap in my eye
Make it red so I look raa, ra ra
So I woke up with my holy quran and found out I like Cadillacs
We shooting till the song is up
Little boys are acting up and
Baby mothers are going crazy
And the leaders all around cracking up
We goat-rich, we fry
Price of living in a shanty town just seems very high
But we still like T.I.
But we still look fly
Dancing as we're shooting up
And looting just to get by.
~ M.I.A.


I'm like the second plane that made the Towers face off,
that shit that let you know it's really not a game, dog . . .
~ Mos Def

Switchblade, grenade, rhyme flows
Fuck niggaz like wild rhinos
Up in these killing fields you bound to die slow
Your style staggers like a drunken wino
That's why there's no hope to defeat a Black Knight
That's like tryin to walk a tight rope
with no feet, mercenary team, streets of concrete
Sasquatch thump a nigga ass, so why try the
Invincible, Dr. Destructor
My lyrics bring war like Lebanon
Our troupe's a Desert Storm, it be on son
Compton is the city where I come from
Act dumb if you want to, and catch a hot one
It's that real, knuckle up, lace your boots tight,
Don't give a fuck 'cause every night is our night
~ Dr. Doom


Trash icons, smash, spit bionic poems
Fuck bygones, rely on Islam and my python
Squeeze off, long fist, when I'm pissed
Result of this, gun powder cover my wrist,
black list . . .
~Killa Sin

Sunday, December 9, 2007

understanding the mind VI


from Marvin Minsky, The Society of Mind, 1988

Thursday, December 6, 2007

obscenity or euphemism?

The events of 1066 continue to affect modern usage, in particular the practice of forbidding the use of particular words of Saxon origin in the public sphere. If the preference for Latinate roots in modern English is merely a reflection of Norman hegemony, can we find alternate grounds for choosing our words which do not depend upon the accidents of conquest?
If we seek a purely pragmatic solution, we should perhaps heed the advice of Orwell, 1946:
Foreign words and expressions such as cul de sac, ancien regime, deus ex machina, mutatis mutandis, status quo, gleichschaltung, weltanschauung, are used to give an air of culture and elegance. Except for the useful abbreviations i.e., e.g., and etc., there is no real need for any of the hundreds of foreign phrases now current in the English language. Bad writers, and especially scientific, political, and sociological writers, are nearly always haunted by the notion that Latin or Greek words are grander than Saxon ones, and unnecessary words like expedite, ameliorate, predict, extraneous, deracinated, clandestine, subaqueous, and hundreds of others constantly gain ground from their Anglo-Saxon opposite numbers. [footnote: An interesting illustration of this is the way in which English flower names which were in use till very recently are being ousted by Greek ones, Snapdragon becoming antirrhinum, forget-me-not becoming myosotis, etc. It is hard to see any practical reason for this change of fashion: it is probably due to an instinctive turning away from the more homely word and a vague feeling that the Greek word is scientific.]
. . . . . . . . .

The defense of the English language . . . has nothing to do with archaism, with the salvaging of obsolete words and turns of speech, or with the setting up of a "standard English" which must never be departed from. On the contrary, it is especially concerned with the scrapping of every word or idiom which has outworn its usefulness. It has nothing to do with correct grammar and syntax, which are of no importance so long as one makes one's meaning clear, or with the avoidance of Americanisms, or with having what is called a "good prose style." On the other hand, it is not concerned with fake simplicity and the attempt to make written English colloquial. Nor does it even imply in every case preferring the Saxon word to the Latin one, though it does imply using the fewest and shortest words that will cover one's meaning. What is above all needed is to let the meaning choose the word, and not the other way around.

Of course, if fewer and shorter words are called for, Saxonate terms will often win the day over their Latinate analogs. Nevertheless, the colonial mindset which deems these words "dirty" or "obscene" may at first hamper their use in the public sphere. To combat this prejudice, we may compare not the political history of these terms, but rather their etymological history. Such a Nietzschean "genealogy" may provide us with an alternate perspective from which to compare the relevant terms without the burden of spurious (i.e. politically-inculcated, or slave mentality) moralistic bias.

Consider, for example, shit, shite and feces, defecate: if we examine their etymologies, does one pair emerge from a more innocent, i.e. euphemistic, perspective on the intended referent than the other? [etymologies courtesy Shipley, 1984]

Although the Indo-European root of feces is unclear (perhaps *bhƒy-), its more recent history is well known:

feces, however, is from L[atin] faex, faeces: sediment, dregs. The basic sense of defecate is to clear out the dregs, cleanse, purify. Thus, Robert Burton, in The Anatomy of Melancholy (1621), states that Luther "began upon a sudden to defecate, and as another sun to drive away, those foggy mists of superstition." And fallible man is comforted by H. Macmillan in The True Vine (1870): "By the death of the body, sin is defecated."

Shit, on the other hand, can be traced back to the Indo-European sek, to cut, separate, or divide:

shite, shit, dropped from the animal; earlier skate, skite, as in blatherskite. blather: to talk nonsense loquaciously, as with verbal diarrhea. skate: shitter, originally a Scotch term of contempt, is now softened in the colloquial "He's a good skate." The Scotch song Maggie Lauder, by F. Sempill, 1650, a favorite with the American Army in the Revolution, contains the line: "Jog on your gait, ye blatherskate." Variants are bletherumskite and blatherskite. An informative Paston family letter written in 1449, relates: "I cam abord the Admirall, and bade them stryke [pull down their flag] in the Kyngys name, and they bade me skyte in the Kyngys name." (Note that sk, as still in Scandinavian tongues, was long sounded sh in English.)

Perhaps by some standards, then, shit is the more euphemistic, as it initially referenced the act of separation, not the waste itself (as with feces). Nevertheless, the much longer history of shit in English indicates its use in literal reference to bowel movements dates back at least to 1449, while defecate enjoyed a more general [metaphorical?] sense of removing waste at least as late as 1870. Here, again, however, it seems impossible to separate out the role of Latinate-bias in such choices, and the prospects for any objective account of relative "obscenity" seem dim.

[As should be expected given the inherently subjective nature of the language - world relationship.]

Wednesday, December 5, 2007

probability and public policy V: propensity theories

[part one of this series here]

As discussed before, frequentism suffers from some conceptual problems as an objective theory of probability. One problem we have not discussed, however, is that posed by one time events. Consider, for example, betting odds on a sporting event. I estimate that the probability the Aggies will beat the Longhorns in their upcoming game is 1/3, and bet accordingly. Now, this game is a one time, unrepeatable event: can we make objective sense of such a probability assignment? (If the fact that the Aggies and the Longhorns have met many times in the past is throwing you, consider this: on each meeting, the teams have had different players, different coaches, and different records for the season; thus, these are not repeats of the same event as with the tossing of a fair coin.)

Another popular example of one time probabilistic events is radioactive decay: when speaking of the probability that a lump of uranium will emit an α-particle within some time period t, we cannot be referring to the frequency of the outcome of a process (what process? something internal to the uranium? ~ but uranium emits particles "spontaneously," surely if there is such a process it is unobservable).

One solution to these worries is to interpret probability in terms of propensity: to say the Aggies only have a 1/3 chance of beating the Longhorns is to speak of something about the Aggies (the makeup of the team, the strategies they use, the quality of the coaching, etc.) which objectively determines their chances of winning this one time event. In the case of the uranium, we can say it has the propensity to decay at a certain rate. In the case of a coin, we can say a coin is "fair" if it has a propensity to come up heads with probability 1/2. Here, it makes sense to speak of the coin (or, more specifically, the mechanism of the toss) as being "fair" or not (as exhibiting a certain structure) even before the coin has been tossed a single time ~ no notion of a hypothetical infinity of trials is needed.

However, there are conceptual problems with the propensity interpretation as well. In particular, propensities cannot themselves be probabilities. The symmetry in the probability calculus which allowed us to derive Bayes' Rule is not exhibited by propensities (as pointed out in Humphreys, 1985):

The point can be illustrated by means of a simple scientific example. When light with a frequency greater than some threshold value falls on a metal plate, electrons are emitted by the photoelectric effect. Whether or not a particular electron is emitted is an indeterministic matter, and hence we can claim that there is a propensity p for an electron in the metal to be emitted, conditional upon the metal being exposed to light above the threshold frequency. Is there a corresponding propensity for the metal to be exposed to such light, conditional on an electron being emitted, and if so, what is its value? Probability theory provides an answer to this question if we identify conditional propensities with conditional probabilities. The answer is simple: calculate the inverse probability from the conditional probability. Yet it is just this answer which is incorrect for propensities and the reason is easy to see. The propensity for the metal to be exposed to radiation above the threshold frequency, conditional upon an electron being emitted, is equal to the unconditional propensity for the metal to be exposed to such radiation, because whether or not the conditioning factor occurs in this case cannot affect the propensity value for that latter event to occur. That is, with the obvious interpretation of the notation, Pr(R/¬E) = Pr(R/E) = Pr(R). However, any use of inverse probability theorems from standard probability theory will require that P(R/E) = P(E/R)P(R)/P(E) and if P(E/R) ≠ P(E), we shall have P(R/E) ≠ P(R). In this case, because of the influence of the radiation on the propensity for emission, the first inequality is true, but the lack of reverse influence makes the second inequality false for propensities.

To take another example, heavy cigarette smoking increases the propensity for lung cancer, whereas the presence of (undiscovered) lung cancer has no effect on the propensity to smoke, and a similar probability calculation would give an incorrect result. Many other examples can obviously be given.


The point here is just that probabilities are symmetric with respect to causal order; this is the trick which allowed us to derive Bayes' Rule. Yet propensities cannot be made sense of in this way; they are asymmetric with respect to causal order.
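
To put numbers on the contrast, here is the smoking example treated as ordinary probabilities and inverted with Bayes' Rule; the figures are invented purely for illustration:

```python
# Invented figures, for illustration only.
p_smoke = 0.3                    # P(S): a randomly chosen person is a heavy smoker
p_cancer_given_smoke = 0.15      # P(C|S)
p_cancer_given_nonsmoke = 0.03   # P(C|not-S)

p_cancer = (p_smoke * p_cancer_given_smoke
            + (1 - p_smoke) * p_cancer_given_nonsmoke)        # P(C) ~ 0.066

# The probability calculus happily inverts the conditional:
p_smoke_given_cancer = p_cancer_given_smoke * p_smoke / p_cancer
print(round(p_smoke_given_cancer, 2))                         # ~0.68, well above P(S) = 0.3

# Read as propensities, Humphreys' point is that undiscovered cancer exerts no
# influence on the propensity to smoke, so the "propensity for S given C"
# should simply equal the unconditional 0.3; that is exactly what the
# probability calculus refuses to allow once P(C|S) != P(C).
```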

Nevertheless, Suppes (1987, 2002) has argued that in particular cases one can derive a probability calculus from propensities. This is why we speak of propensity theories: there can be no single unified derivation of the probability calculus from a general theory of propensity.

Now that we've got some different perspectives on the table, let's see how they deal with a couple of examples.

next: "priors" and the fair coin revisited

Saturday, December 1, 2007

understanding the mind V: color vision

Why do we use words like white, black, red, yellow, etc. to describe human skin color when the related skin tones aren't "really" white, black, red, etc.?

There are four types of light sensitive cells in the retina: the rods and the S, M, and L cones (for, roughly, "short," "medium," and "long" wavelengths). These cells contain a molecule which changes shape when hit by a sufficient number of photons, but each type is sensitive to light of a different range of wavelengths.
E. Bruce Goldstein, Sensation and Perception, 7th edition, 2006


The S, M, and L cones interact to produce color vision. We can test the sensitivity of each type of cell at a variety of different wavelengths to determine its sensitivity profile.

Brian A. Wandell, Foundations of Vision, 1995


These three types of color cell are "wired" into two opponent color circuits, the red-green and the blue-yellow circuits.

E. Bruce Goldstein, Sensation and Perception, 7th edition, 2006


This process separates the highly correlated M and L cone signals in order to provide a richer color space. A consequence of the wiring from three wavelength detectors to two opponent color circuits is a circular color space, familiar to many as the color wheel. When we graph this color circle against the dimension of brightness, we get a spindle shaped space corresponding to the subjective perception of color.

Paul M. Churchland, The Engine of Reason, The Seat of the Soul, 1996
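
The wiring from three cone types into two opponent circuits plus a brightness dimension can be caricatured as a simple linear transform. A toy sketch (the weights are illustrative only, not physiologically measured):

```python
def opponent_code(S, M, L):
    """Map three cone activations onto a toy opponent code.

    Returns (red_green, blue_yellow, brightness): positive red_green reads as
    "reddish," negative as "greenish," and similarly for blue_yellow.
    The weights are illustrative, not measured values.
    """
    red_green = L - M
    blue_yellow = S - 0.5 * (L + M)
    brightness = (S + M + L) / 3.0
    return red_green, blue_yellow, brightness

# A stimulus driving L harder than M comes out "reddish":
print(opponent_code(S=0.2, M=0.4, L=0.9))
```

The angle of the (red_green, blue_yellow) pair gives the circular hue dimension, and brightness supplies the axis of the spindle.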


Peter Gärdenfors has observed that if we consider the spindle shaped subspace of this color space which corresponds to possible human skin tones and attach our basic color words to the corresponding parts of this subspace, we can retrieve the use of these terms in describing human skin tone.

Peter Gärdenfors, Conceptual Spaces: The Geometry of Thought, 2004


Here we have an example of a linguistic structure, an analogy, which is suggested, perhaps even forced, by the physiological structure of human perception. . . . and how many more such structures might there be?

probability and public policy IV: frequentism

[part one of this series here]

The subjective view of probability leaves probability entirely in our heads: it is merely a reflection of our uncertainty about the relevant details. An objective view of probability places probability "out in the world" somehow. The first such theory we will examine is frequentism.

The frequentist believes probability attributions make a hypothetical claim about an infinite number of trials. Consider again our coin toss; for a frequentist, the claim that the probability of heads on a given toss is 1/2 means that if the coin were tossed an infinite number of times, then 1/2 of those tosses would come out heads. This approach considers probability as a hypothetical property of a physical system, but is extremely problematic when we examine the details.

Suppose the underlying physical mechanism of the toss is indeed fair, i.e. the probability "really" is 1/2: what guarantees that half of the trials will come out heads? This relationship between the underlying physical mechanism and the limit of the proportion of heads as the number of tosses goes to infinity is theoretically guaranteed by the law of large numbers.

In order to further clarify this distinction between the frequency and the physical mechanism, consider another classic example, this time from Bernoulli. Suppose one fills an urn with 50 red balls and 50 green balls. Then, one uses some procedure (say shaking the urn then reaching in with one's eyes closed) in order to select from the urn at random. After each selection of a ball from the urn (a "trial") the color of the ball is noted and it is returned to the urn. Here, there is a physical fact about the ratio of red balls to total balls in the urn (50/100 = 1/2); there is also a physical process to randomly select balls. What the law of large numbers tells us is that if the process for selecting balls from the urn is indeed "random," then the ratio of red balls to total balls examined will approach 1/2 as the number of trials grows.

There are two related remarks to make here. First, the physical systems associated with most classic probability setups will deteriorate over the course of such an extended number of trials. For example, if one rolls a die several hundred thousand times, it begins to turn spherical as the corners chip away from friction. The point here is just that if the procedure of rolling the die were actually carried out an enormous number of times, the proportion of rolls returning five would only approach 1/6 for a while, until the die has deteriorated enough that the physical properties of the system change, at which point it is no longer clear that an unambiguous answer of five will even be possible.

This brings us to our second remark. The assumption behind the law of large numbers and the urn, coin, and die examples is that the underlying physical processes will produce stable probabilities. Logically, however, nothing rules out the possibility of a coin which comes up heads 1/2 the time on the first 100 tosses, but 1/4 the time on the next 100 tosses. In fact, the die example shows that there are strong empirical reasons for suspecting that many systems do not in fact demonstrate this long term stability.
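
Both remarks are easy to see in simulation. A sketch, not an argument: one coin with a stable bias of 1/2, and one whose bias shifts partway through, with the running frequency of heads tracked for each:

```python
import random

def running_frequency(bias_at, n_tosses, seed=0):
    """Toss a coin n_tosses times, where bias_at(i) gives P(heads) on toss i.
    Returns the running proportion of heads after each toss."""
    rng = random.Random(seed)
    heads, freqs = 0, []
    for i in range(n_tosses):
        heads += rng.random() < bias_at(i)
        freqs.append(heads / (i + 1))
    return freqs

stable = running_frequency(lambda i: 0.5, 100_000)
drifting = running_frequency(lambda i: 0.5 if i < 50_000 else 0.25, 100_000)

print(stable[-1])    # settles near 0.5, as the law of large numbers promises
print(drifting[-1])  # ends up near 0.375 and would keep moving: no stable limit
```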

So, frequentism succeeds in putting probability "in the world," but at the cost of plausibility. Still, there are many situations for which frequentism is an especially useful view, and it is one of the prime competitors against subjective Bayes in the realm of statistics.

next: propensity theories

Wednesday, November 28, 2007

the beginning of social theory


The achievement of human purposes is possible only because we recognise the world we live in as orderly. This order manifests itself in our ability to learn, from the (spatial or temporal) parts of the world we know, rules which enable us to form expectations about other parts. And we anticipate that these rules stand a good chance of being borne out by events. Without the knowledge of such an order of the world in which we live, purposive action would be impossible.


This applies as much to the social as to the physical environment. But while the order of the physical environment is given to us independently of human will, the order of our social environment is partly, but only partly, the result of human design. The temptation to regard it all as the intended product of human action is one of the main sources of error. The insight that not all order that results from the interplay of human actions is the result of design is indeed the beginning of social theory. Yet the anthropomorphic connotations of the term 'order' are apt to conceal the fundamental truth that all deliberate efforts to bring about a social order by arrangement or organisation (i.e. by assigning to particular elements specified functions or tasks) take place within a more comprehensive spontaneous order which is not the result of such design.


F. A. Hayek, "The Confusion of Language in Political Thought," 1967

Monday, November 26, 2007

probability and public policy III: bayes' rule

[part one of this series here]

So, I have a subjective belief about what will happen when a coin is tossed; now, how can I adjust this belief in light of evidence (say, after I've flipped the coin 100 times)?

When we discuss probability, we often do so conditionally. For example, we claim "the probability that the coin will come up heads is 1/2, given that the coin is fair." We write the probability that A is true given that B is true as P(A|B). A basic fact of the probability calculus relates the probability of A&B to the probability of B conditional on A:
P(A)P(B|A) = P(A&B)


But "and" is a symmetric relation, i.e. A&B if and only if B&A, so

P(A)P(B|A) = P(A&B) = P(B&A) = P(B)P(A|B)


and after dividing both sides by P(A), we get Bayes' Rule:

P(B|A) = P(B)P(A|B)/P(A)


OK, but what does this mean? Bayes' Rule tells us how to update our subjective belief state in light of new evidence. To see this, replace B by H for "hypothesis" and A by E for "evidence":

P(H|E) = P(H)P(E|H)/P(E)


What Bayes' Rule now tells us is that our probability that a hypothesis H is true given that we receive evidence E is just equal to our prior probability that H is true times the probability that we would receive evidence E if hypothesis H were true, divided by the probability of E (usually found by summing P(E|Hi)P(Hi) over all the hypotheses Hi under consideration).

Updating by Bayes' Rule ensures that one's probability distribution is always consistent. It is important to note, however, that one's conclusion about the probability of hypothesis H given evidence E depends upon one's prior assignment of probabilities. Of course, your belief state will eventually converge to the "actual" probability: so, if you believe the coin to be fair, but it is flipped 100 times and every toss comes up heads, your degree of belief that the coin is fair will be very low in light of this evidence.

To illustrate how belief update occurs, consider a contrived example. A stubborn, but rational, man, Smith, thinks it is extremely unlikely that cigarette smoking causes lung cancer. For Smith, say, P(cigs cause cancer) = 0.2. Instead, he licenses only one alternative hypothesis: that severe allergies cause cancer. Since these hypotheses are exhaustive, on pain of inconsistency, Smith must believe P(allergies cause cancer) = 0.8.

Now, suppose Smith's Aunt Liz dies of lung cancer. Furthermore, suppose Aunt Liz has been a heavy smoker her entire life; then P(Liz gets cancer | cigs cause cancer) = 1 (certainty). Suppose, also, that Liz has had minor allergies for most of her life; since these allergies are only minor, let's say the probability she gets cancer under the hypothesis that severe allergies cause cancer is only 0.5.

Briefly, how should we calculate P(E) here? We sum over the weighted possibilities:

P(E) = P(H1)P(E|H1) + P(H2)P(E|H2) = 0.2(1) + 0.8(0.5) = 0.6

So, now we can use Bayes' Rule to calculate Smith's (only consistent) subjective degree of belief in the hypothesis that cigarettes cause cancer given the evidence that Aunt Liz has died of cancer.

P(H=cigs cause cancer) = 0.2
P(E=Liz gets cancer | H=cigs cause cancer) = 1
P(E=Liz gets cancer) = 0.6

Plugging these values into Bayes' Rule we get:

P(H|E) = P(H)P(E|H)/P(E) = 0.2(1) / 0.6 = 1/3
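
For those who prefer code to arithmetic, the same update in a few lines of Python (nothing here beyond the toy numbers already given):

```python
p_cigs = 0.2               # P(H): cigarettes cause cancer
p_allergies = 0.8          # P(H'): severe allergies cause cancer
p_E_given_cigs = 1.0       # P(E|H): Liz, a lifelong heavy smoker, gets cancer
p_E_given_allergies = 0.5  # P(E|H'): her allergies were only minor

p_E = p_cigs * p_E_given_cigs + p_allergies * p_E_given_allergies  # 0.6
p_cigs_given_E = p_E_given_cigs * p_cigs / p_E

print(p_cigs_given_E)      # 0.333..., i.e. Smith's belief rises from 1/5 to 1/3
```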


So, in light of this evidence, Smith's belief in the hypothesis that cigarettes cause cancer has increased from 1/5 to 1/3. Two important points to note here: i) The probability calculus only tells us how to update prior beliefs consistently; it does not tell us what belief state to start from. Given sufficiently different prior probability distributions, two agents may draw dramatically different conclusions from the same data. ii) Notice that our calculation of P(E) depended upon the space of hypotheses we were considering. If an agent has failed to consider the actual cause of a piece of evidence E as a potential cause, he may perceive E as increasing the probability of a spurious hypothesis. (Suppose, for example, that it is actually a particular gene which causes both a tendency to smoke and a tendency to succumb to cancer; then cigarettes will be a decent predictor of cancer (supposing sufficiently few non-gene-carriers smoke), but not the cause of cancer. Nevertheless, the hypothesis that cigarettes cause cancer will be supported by the data in this alternate scenario if the agent does not license this additional possibility.)

Given these caveats, however, we can see how a sufficiently large number of pieces of evidence that are not adequately supported by alternate hypotheses will eventually push an agent's belief state toward the "true" conclusion: if Smith sees enough people get cancer who don't experience severe allergies, he will come to assign a high probability to the possibility that cigarettes cause cancer.
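
The convergence can also be illustrated with a sketch: feed Smith a stream of cancer cases in non-allergic smokers and update after each one (the likelihoods below are invented for the illustration, not taken from any data):

```python
# Repeated updating on a run of similar cases (invented likelihoods).
p_cigs = 0.2                    # Smith's starting point
p_case_given_cigs = 0.9         # a non-allergic heavy smoker gets cancer
p_case_given_allergies = 0.1    # hard to explain on the allergy hypothesis

for case in range(1, 11):
    p_case = p_cigs * p_case_given_cigs + (1 - p_cigs) * p_case_given_allergies
    p_cigs = p_case_given_cigs * p_cigs / p_case
    print(case, round(p_cigs, 3))
# The posterior climbs above 0.9 after only a couple of such cases.
```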

next: frequentism

Sunday, November 25, 2007

probability and public policy II: subjective bayes and the "fair coin"

[part one of this series here]

Subjective Bayes is the view that probabilistic statements reflect our state of uncertainty; as such, they are subjective, a mere reflection of a particular human's internal state of belief.

for example: Consider a hypothetical "fair coin": what does it mean to say this coin is "fair"? As it turns out, if we know the precise conditions under which a coin is flipped, we can confidently predict the outcome of the toss. The physics itself is relatively simple, and if one knows the initial conditions one can predict the outcome (i.e. if one knows the physical properties of the coin and the details of the force which set it spinning (as well as air resistance, wind velocity, etc.), one can calculate precisely which side will land face up).

Consider briefly a simplified model where we assume that air resistance and wind play no role and that the mass of the coin is uniformly distributed. We can graph the outcome of a coin toss against the upward velocity at the time of release (V, in feet per second) and the angular velocity (ω, in revolutions per second). Assuming the coin starts heads up, we can treat grey as heads and white as tails:

Persi Diaconis, "A Place for Philosophy? The Rise of Modeling in Statistical Science," 1998.

The above analysis is from work by Persi Diaconis. He discusses where in the above graph the "typical" coin toss falls:
For a typical one-foot toss, experiments show that coins go up at about 5 m.p.h. and turn over 35-40 revolutions per second. In the units of the picture the velocity is concentrated at about 0.2 on the velocity scale. This is close to zero in the picture. Fortunately, the spin is concentrated at about 40 units up on the ω axis.

For additional details see "The Probability of Heads" by Joseph Keller.
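
A crude version of this deterministic model is easy to code. The sketch below uses the standard simplifications (no air resistance, the coin caught at the height it was released, so the flight time is 2V/g, and "heads" whenever the total rotation leaves the heads face tilted upward); it is an illustration of the idea, not Diaconis's or Keller's actual calculation:

```python
G = 9.8  # m/s^2

def lands_heads(v_up, rev_per_sec):
    """Deterministic toy model of a toss that starts heads up.

    v_up: upward velocity at release (m/s); rev_per_sec: spin rate (rev/s).
    Caught at the release height, the coin is in the air for 2*v_up/G seconds.
    """
    flight_time = 2 * v_up / G
    revolutions = rev_per_sec * flight_time
    frac = revolutions % 1.0
    # Within a quarter turn of heads-up on either side, the heads face shows.
    return frac < 0.25 or frac > 0.75

# Roughly the 5 m.p.h. (~2.2 m/s), 35-40 rev/s regime mentioned in the quote:
for spin in (35.0, 35.5, 36.0, 36.5, 37.0, 37.5):
    print(spin, lands_heads(v_up=2.2, rev_per_sec=spin))
# Small changes in spin flip the outcome back and forth, which is why the
# heads/tails regions in the figure form such thin alternating stripes.
```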

The point here is just that the "probability," or fairness, in a coin toss is not a property of the physical setup, but rather of the observer's belief state about the outcome. Once we are privy to the initial conditions of the coin toss, there is no more uncertainty and thus no more "probability" or "chance" in the event.

Another way of thinking about subjective Bayes is in terms of one's willingness to bet upon an event the outcome of which is as yet undetermined. de Finetti used this thought experiment as the basis of his theory of probability and demonstrated that any set of odds which cannot be subjected to a Dutch Book is a valid probability distribution (a "Dutch Book" is a set of bets devised by the bookie such that you lose no matter what the outcome of the event is).
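
A minimal illustration of the Dutch Book idea, with made-up numbers:

```python
# An agent who announces P(rain) = 0.6 and P(no rain) = 0.6 (summing to 1.2)
# will, on the betting interpretation, pay 0.6 for a ticket worth 1 if it
# rains and 0.6 for a ticket worth 1 if it does not.
price_paid = 0.6 + 0.6          # the agent buys both tickets from the bookie
for it_rains in (True, False):
    winnings = 1.0              # exactly one ticket pays off, whatever happens
    print("rain" if it_rains else "no rain", winnings - price_paid)  # -0.2 either way
```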

If probability is just a measure of one's uncertainty, how can we use it to interact with the world? To put it another way: how can I get my subjective probability assignments to match up with the "actual" proportion of outcomes due to the underlying mechanism at work? Or, if the coin isn't "fair," if I'm being cheated, how do I figure that out? The answers to these questions come in our next installment.

next: Bayes' Rule

Saturday, November 24, 2007

probability and public policy I

Global Warming, second hand smoke, speed limits, airport security, bans on powerlines, types of food, cellphones, economic decisions, the decision to invade Iraq, etc. ~ a vast number of public policy issues depend for evidence upon the statistical analysis of data which, in turn, depends upon the philosophy of probability. Yes, statistics is that rare "science" for which the philosophical position one takes on its foundations actually affects one's conclusions. Given the same set of data, a subjective Bayesian and a frequentist may produce very different analyses. Furthermore, Kahneman and Tversky have demonstrated empirically that probability is an area which humans are very poor at analyzing. In particular, when the same probabilistic scenario is presented in different terms (say to emphasize the potential gain in one instance and the potential danger in another), people will make different (i.e. logically inconsistent) choices.

So, science which rests upon statistical analysis is qualitatively different from that which does not (in that the conclusions it returns are indexed by a particular philosophical view), and thus our treatment of its conclusions as evidence should, likewise, be qualitatively different from that of evidence from other sources. In particular, we should be careful to consider our own philosophical position on probability and how it accords with the analysis of the data presented to us. Finally, evidence suggests that our innate sense on this question is very poor and subject to manipulation through framing effects.

next: subjective Bayes and the "fair coin"

Wednesday, November 21, 2007

case study: einstürzende neubauten

Our examinations of Pierre Schaeffer and Tetsu Inoue both focussed on techniques for organizing found sounds into a theme and variations structure as a solution to the conceptual problem posed by musique concrète. Here we examine two other strategies, both implemented by Einstürzende Neubauten.

Like Pierre Schaeffer, Einstürzende Neubauten wanted to "perform" found sounds. Rather than sample them as Schaeffer did, however, Neubauten chose to modify the found objects themselves, turning them into instruments upon which the same pieces could be repeatedly performed. At the start, these modified instruments (including everything from pots and pans to jackhammers and buildings themselves) were used primarily as percussion instruments, as can be seen in this early clip:



In their present incarnation, Neubauten not only makes use of much more delicate sounds, as in their use of styrofoam peanuts as a percussion instrument, but has also found ways to perform within the Western tonal system on found objects. In several of the songs off Perpetuum Mobile, 2004, for example, Neubauten uses different lengths of piping as a pitched instrument. They also make use of pitched arrangements of metal sheets and springs and a custom-built spinning device with an array of different-sized metal brushes which can be used to perform delicate melodies.

The point here is that Neubauten looked for new timbres in found objects, but then explored ways to fit these found sounds into the Western tonal system. Here are some amusing comments on the practical difficulties of performing on such "instruments":



The second strategy we can see in the work of Einstürzende Neubauten is that of guiding composition via randomly generated constraints. This procedure is at the intersection of two distinct techniques. The first is from Western academic music where some composers of 12-tone or serial music generated the sequence of tones (or whatever) which would provide the basis for their manipulations via random methods. This procedure was developed in the latter half of the 20th century to the extent that some composers (for example, Iannis Xenakis) used chance procedures to generate entire scores. The second technique arrives via musical games in improvised music. "Free" improvisors such as John Zorn and Keith Rowe have "composed" pieces that are really a sequence of constraints defining a kind of game for the performers: a loose structure within which they can "freely" improvise.

Neubauten recently completed their "Jewels" project which involved composing a song a month for 15 consecutive months. When composing a "jewel," each member of Neubauten selected a number of cards at random from a set of 500 initially generated by Blixa Bargeld. On these cards were written phrases which might constrain the song as a whole or the individual's performance within that song in a variety of ways (which instruments to use, what style to play in, who should guide the composition of which song elements, etc.). Then, Neubauten would go into the studio and compose a song, each member constrained by the contents of his cards, though only communicating these to others when they concerned the structure of the song as a whole. This procedure left Neubauten to compose as they ordinarily would, but constrained by game-like "rules" (à la Zorn or Rowe) generated by a stochastic process (à la Xenakis).
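
The card-drawing step itself is trivial to mimic in code; a toy sketch (the card texts and hand size below are invented placeholders, not Bargeld's actual cards or rules):

```python
import random

# Invented stand-ins for the constraint cards described above.
deck = [
    "use only the metal springs",
    "no percussion in the first minute",
    "whisper the vocal",
    "you decide the overall song structure",
    "play the pipes as a pitched instrument",
    "begin from silence",
    # ...imagine several hundred of these
]

def deal(deck, members, cards_each=2, seed=None):
    """Deal each band member a private hand of constraint cards."""
    rng = random.Random(seed)
    return {member: rng.sample(deck, cards_each) for member in members}

print(deal(deck, ["Blixa", "Hacke", "Unruh", "Arbeit", "Moser"], seed=1))
```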

Of course, this procedure need not be limited to found sounds; however, it does constitute a method for constraining the arrangement of sounds which is independent of any particular tonal system, and as such is particularly suited for application in the realm of musique concrète.

Jewel #8: "Robert Fuzzo"

Sunday, November 18, 2007

understanding the mind V


[above] At the bottom, a standard 3-layer neural net; these neural nets provide a toy model of neural activity during object recognition tasks. On the left is depicted the net's progress through the space of possible weight values during training (several dimensions have been suppressed). On the right we can see how the two prototypes on which the network was trained partition the space of hidden unit activation.
(Typo in original diagram: one point should be labeled "Prototype B")

[below] Such a trained neural network can perform pattern completion on a partial input (say, when only part of a nearby animal is in view) and categorize the partially perceived object with respect to one of the learned prototypes (say, as a rat).


From Churchland and Sejnowski, The Computational Brain, 1992.
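
A toy version of such a network fits in a few dozen lines of numpy. The sketch below is not the network from the figures: it invents two 8-dimensional prototypes, trains a standard 3-layer net by plain gradient descent, and then classifies a noisy, partially hidden input:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two invented prototypes ("A" and "B") and noisy training samples of each.
proto_A = np.array([1, 1, 1, 1, 0, 0, 0, 0], dtype=float)
proto_B = np.array([0, 0, 0, 0, 1, 1, 1, 1], dtype=float)
X = np.vstack([proto_A + 0.2 * rng.standard_normal(8) for _ in range(50)]
              + [proto_B + 0.2 * rng.standard_normal(8) for _ in range(50)])
y = np.array([[1, 0]] * 50 + [[0, 1]] * 50, dtype=float)  # one-hot targets

# A standard 3-layer net: 8 input units -> 4 hidden units -> 2 output units.
W1 = 0.1 * rng.standard_normal((8, 4))
W2 = 0.1 * rng.standard_normal((4, 2))
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for _ in range(2000):                       # plain gradient descent on squared error
    hidden = sigmoid(X @ W1)
    out = sigmoid(hidden @ W2)
    d_out = (out - y) * out * (1 - out)
    d_hidden = (d_out @ W2.T) * hidden * (1 - hidden)
    W2 -= 0.5 * hidden.T @ d_out / len(X)
    W1 -= 0.5 * X.T @ d_hidden / len(X)

# "Pattern completion": a noisy A-like input with part of its pattern hidden.
partial = proto_A + 0.2 * rng.standard_normal(8)
partial[2:6] = 0.0                          # hide the middle of the pattern
print(sigmoid(sigmoid(partial @ W1) @ W2))  # the "A" output unit should dominate
```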

Saturday, November 17, 2007

case study: tetsu inoue

With the development of digital signal processing (DSP) technology, a more subtle solution to the conceptual problem posed by musique concrète was developed. One exemplar of this solution is the work of Tetsu Inoue.

[caveat: this discussion is based on analysis of Inoue's work, not on any knowledge of his actual methods or thought processes.]

Tetsu Inoue is perhaps most famous for his ambient work, especially the albums Ambient Otaku, 1994 and World Receiver, 1996. This work (in particular World Receiver) often has a musique concrète flavor as it incorporates field recordings and other found sounds. Here, however, we will focus on Inoue's "sound sculpture" period, stretching roughly from 1998 - 2001, a period characterized by Inoue's move from the analog to the digital realm. The representative albums of this period are:

Psycho-Acoustic (May, 1998)
Waterloo Terminal (October, 1998)
Fragment Dots (May, 2000)
Active / Freeze (with Taylor Deupree) (July, 2000)
Object & Organic Code (June, 2001)

During this period, Inoue focussed more on short "sculptural" constructions, almost reminiscent of recent developments in free improv, than on the long, morphing textures of his previous ambient period (later work would combine aspects of both). The palette of sounds is quite diverse, and it is often difficult to tell whether a particular sound is generated digitally or is a digital manipulation of a found sound. Inoue clearly uses many different organizational principles; for example, Waterloo Terminal used the building (i.e. "Waterloo Terminal") itself as an organizational principle, inspiring Inoue to combine field recordings from within the space with sounds generated via a digital transformation of the terminal's blueprints into audio. Here, however, we consider Inoue's technique for importing the classical "theme and variations" structure into the realm of musique concrète, a task which seemed promising but problematic in our discussion of Pierre Schaeffer.

DSP provides a tool for manipulating subtle aspects of a sound: one can i) adjust the envelope (the rate at which the sound fades in and out, a manipulation powerful enough to turn a trumpet into a violin), ii) adjust the pitch ("Alvin and the Chipmunks" being a well known example (though most likely not achieved digitally)), iii) adjust the timbre (by selectively (d)emphasizing isolated frequencies, components of the overall sound, one can change its character, turning a crowded room into a bubbling brook or a passing airplane into a pesky gnat), etc.

Of course, analog electronics could produce all these effects as well. What DSP offered was greatly increased organization, control, and memory of these parameters. At last composers could precisely specify the pitch to be filtered rather than find it by ear; they could store and "undo" manipulations, tweaking them until the desired effect was achieved.
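
A toy numpy illustration of the three kinds of manipulation just listed; real DSP work would use proper resampling and filter design, so treat this only as a sketch of the idea:

```python
import numpy as np

sr = 44100                                  # sample rate in Hz
t = np.arange(sr) / sr                      # one second of time
found_sound = np.sin(2 * np.pi * 440 * t)   # stand-in for a recorded sample

# (i) envelope: a 0.2-second fade-in and fade-out reshapes attack and decay
envelope = np.clip(np.minimum(t / 0.2, (1.0 - t) / 0.2), 0.0, 1.0)
enveloped = found_sound * envelope

# (ii) pitch: resampling plays the sample back faster and a fifth higher
ratio = 1.5
pitched = np.interp(np.arange(0, len(found_sound), ratio),
                    np.arange(len(found_sound)), found_sound)

# (iii) timbre: a crude moving-average low-pass that attenuates high frequencies
kernel = np.ones(64) / 64
filtered = np.convolve(found_sound, kernel, mode="same")

print(len(enveloped), len(pitched), len(filtered))
```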

In the Western tradition, a theme and variations consists of a melody and variations on that melody which tend to add or adjust the specific notes and rhythm played while maintaining the same basic chord progression. DSP opened the possibility of transposing this structure to the realm of musique concrète by taking a particular found sound (or sequence of sounds) as the "theme" and manipulations of its timbre as the "variations." This timbral "theme and variations" could then provide an underlying unity to the piece's organizational structure.

Consider this excerpt from "Cut and Clicks," the first track on Fragment Dots:



At 0:45-8 we can hear a sample of a running shower (presumably). This apparently unaltered sample is foreshadowed by a manipulation of the shower sound which integrates more closely with the surrounding bleeps and clicks at 0:05-0:07, and by brief spurts of a shower and a shower door preceding the main sample at 0:43-5. At 1:11-2 we briefly hear the white noise of the shower again and from 1:35-end we can hear a succession of versions of the shower sound, each filtered in a slightly different manner (other segments may originate in this same sample, but have been manipulated too much to allow certain identification). For more local examples of variation via timbral manipulation, consider the patterns at 0:38-43, where the same fast vibration occurs at several different pitches and timbres, and 1:17-26, where the same rhythm is heard with three different timbres.

Of course, we have not by any means exhausted the organizational principles at work here. DSP also allows the composer to adjust a variety of source sounds such that they share a timbre, for example, and to closely control various permutations and arrangements of a predetermined set of sounds. "Cut and Clicks" surely evinces a variety of such techniques in addition to that of timbral variation.

Our final case study will examine an entirely different approach to the conceptual problem posed by musique concrète.

Friday, November 16, 2007

dream

Travelling with a friend on towering 4 storey buses through a wasteland of sprawling labyrinthine malls and massive hotel / dormitories, we are subpoenaed to testify at a trial. The matter is so urgent, we are pulled away from the rest of our tour group before dinner arrives. We travel through the sprawling landscape via long, mostly vertical shafts, clutching onto loose harnesses as we shoot along suspension wires. Arriving at court, we find the proceedings slow and convoluted. The matter of our expertise is unclear, but seems to lie at the intersection of Kantian metaphysics and WWII biplane studies - the relevance to a legal trial of such urgency is never explained. Our hunger increases, but an attempt to order Chinese leads to naught when the restaurant inexplicably hangs up halfway through our address - their haste ultimately preventing them from finding our room (despite locating our building) within the vast hotel landscape. Upon hearing the Chinese delivery men are searching for us, we desperately flee the courtroom, sliding for miles through the vast complex clinging to our flimsy harnesses in a desperate race to find the delivery men before they give up. I awake confused, uninterested in Chinese food, but plagued with anxiety about the state of biplane metaphysics.

Wednesday, November 14, 2007

"a peaceful heart"


". . . you liars, you hypocrites, you duplicitous, mealy-mouthed, unprincipled, terrorist-sympathizing scum!"

inconsistent

If I pay you to have sex with me, it's prostitution, and thus illegal.
If I pay you to have sex with me and videotape it, it's porn, and thus legal.

Tuesday, November 13, 2007

the presumption of the academy

. . . that meaningful discourse can proceed on issues which do not admit of empirical investigation and the resolution of which would have no pragmatic consequences for the lives of human beings.

Sunday, November 11, 2007

case study: pierre schaeffer

One solution to the conceptual problem posed by musique concrète, namely, what methods or principles should be used for the organization of found sounds into music, was offered by Pierre Schaeffer in his Cinq études de bruits, 1948.

Schaeffer founded musique concrète with this effort, and his study of the sounds of trains, Étude aux chemins de fer, is widely recognized as the first piece of musique concrète.

In the '40s, Schaeffer worked as a technician for Radiodiffusion Française, and developed there a lab for using the new technology to experiment with sounds. In many ways, Schaeffer was more an engineer than an artist, and his experimentation was driven more by the possibilities opened up by the new technology than by an aesthetic vision. It is fitting that here, at its inception, the conceptual problem of musique concrète is already at the forefront of its practitioners' thoughts. As Michel Chion wrote in 1982 (as reprinted in the liner notes to the definitive CD release of Pierre Schaeffer's works):

Even at that time, [Pierre Schaeffer] was occupied with finding a basis for understanding and defining what was both an empirical and rigorous method for proceeding, even when the incongruity of that approach to music fascinated and horrified him at the same time. His own deeply felt ambivalence for the music that he invented became one of the dominant characteristics of his creativity and thought.

Schaeffer's solution to the conceptual problem in the Étude aux chemins de fer was a combination of method and principle. The method was Schaeffer's construction of the first sampler. He recorded a number of sounds produced by trains and pressed them onto records, some in "closed grooves" such that the sound would loop indefinitely. Record players were arranged such that they could be triggered by different switches or keys. The composer (or performer) could then "play" the various sounds by pressing keys in the desired pattern (holding for the desired length of time, adjusting speed, etc.). The principles were twofold: improvisation and theme and variations. The former was a necessity in this uncharted musical territory, while the latter was one of the old standbys of Western music, but one sufficiently abstract that it could be applied in the new domain.

Nevertheless, the question of how presents itself: if a variation on a theme within the traditional Western framework constitutes a modulation to a different key, an insertion of additional notes within the chord progression, or a varying of the rhythmic structure, how should one vary the parameters of sounds with no key, chord progression, or rhythmic structure in the traditional sense? Doubtless, it was this dilemma that underlay the frustration and ambivalence of Schaeffer himself. We investigate some subtle solutions to this problem in our next case study.

Pierre Schaeffer's Étude aux chemins de fer:

Friday, November 9, 2007

thought crime is here

For crimes may be more severely punished when they are crimes of "hate" ~ though surely it is the dispassionate crimes which are more fearsome?

For one can be imprisoned for one's writing (music, art, etc.) if this art "inspires hate" or "promotes terrorism" ~ but is not a state which condemns its youth for their poetry deserving of destruction?

The transmutation of the West into the image of its enemy has begun: how long until we stare into the face of fundamentalist society and see but a cloudy reflection of ourselves?

Thursday, November 8, 2007

musique concrète

Before the age of notated music, composition meant production of a melodic, harmonic, and / or lyrical pattern compelling enough that it could be transmitted in person to other musicians. Here all that is passed on is an idea, an essence of the tune, indifferent to instrument, setting, or interpretation.

Before the age of recorded sound, composition meant the production of written instructions which could be followed by anyone who understood the relevant code. In particular, a specific array of timbres (i.e. different instruments) was assumed, as well as a particular tonal system (the Western scale).

In the age of audio recording, the composer is no longer constrained by the limitations of a system of transmission: he can produce the definitive version of a piece of music himself, adjusting all relevant parameters. In particular, he is no longer constrained to one particular set of timbres or scales.

Musique Concrète is a form of composition which could only arise in the age of audio recording. The concept behind musique concrète is that the composer begins with a set of "concrete" sounds and arranges them into a piece of music. In practice, this means the composer is not limited to the timbres available via standard instrumentation; instead, he can collect any sounds which appeal to him (of a car passing, a bird chirping, the slam of a door, a cough, running water, etc.) and arrange them into a structure constituting the finished piece of music.
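One might model such an arrangement, at its barest, as found sounds placed on a timeline; the minimal sketch below (with hypothetical sound labels) shows that the data structure itself carries no musical principle at all, which is precisely the dilemma taken up next.

```python
# A toy data structure for a musique concrète "arrangement": found sounds
# placed on a timeline. The sound names are hypothetical examples.

from typing import NamedTuple, List

class Placement(NamedTuple):
    sound: str       # label of a recorded, "concrete" sound
    start: float     # onset, in seconds
    duration: float  # how long the excerpt runs

# An arrangement is just an ordered set of placements; nothing in the
# structure itself says *why* one ordering is music and another is noise.
piece: List[Placement] = [
    Placement("door_slam", start=0.0, duration=0.5),
    Placement("running_water", start=0.5, duration=4.0),
    Placement("bird_chirp", start=2.0, duration=1.0),
    Placement("cough", start=4.5, duration=0.7),
]

total_length = max(p.start + p.duration for p in piece)
print(f"{len(piece)} sounds, {total_length:.1f} seconds")
```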

Once one has decided to use "found" sounds as one's timbral palette, however, one faces a dilemma, for the theoretical structure which has developed within the Western musical tradition is no longer relevant (at least not obviously so). Since this theoretical apparatus provided a structure which constrained the arrangement of sounds, some new principles must be found by which to arrange the found sounds one has chosen to work with.

(For without some principle guiding their arrangement, what is to separate the arranged sounds from their place within a natural context? ~ what will separate "music" from "noise"?)

Next: three case studies, each examining a different solution to the conceptual problem posed by musique concrète.

1. Pierre Schaeffer
2. Tetsu Inoue
3. Einstürzende Neubauten

Wednesday, November 7, 2007

europa


. . . the gates of heaven.

~ Douglas P., 1987


European totalitarianism is an upshot of bureaucracy's preeminence in the field of education. The universities paved the way for the dictators.

~ Ludwig von Mises, 1944


Since authoritarian society reproduces itself in the individual structures of the masses with the help of the authoritarian family, it follows that political reaction has to regard and defend the authoritarian family as the basis of the "state, culture, and civilization." In this propaganda it can count on deep irrational factors in the masses.

~ Wilhelm Reich, 1946


The multitudes remained plunged in ignorance of the simplest economic facts, and their leaders, seeking their votes, did not dare to undeceive them. The newspapers, after their fashion, reflected and emphasised the prevailing opinions.

~ Winston Churchill, 1948, describing 1920

Monday, November 5, 2007

the cold war of the sexes

"I remembered Cliff Wainwright saying once that women were like the Russians - if you did exactly what they wanted all the time you were being realistic and constructive and promoting the cause of peace, and if you ever stood up to them you were resorting to cold-war tactics and pursuing imperialistic designs and interfering in their internal affairs. And by the way of course peace was more peaceful, but if you went on promoting its cause long enough you ended up Finlandized at best."

~ Kingsley Amis, Stanley and the Women, 1984

Saturday, November 3, 2007

BEEM, 1st place: Cannibal Holocaust

[Best Editing in an Exploitation Movie Award]

"For the sake of authenticity some sequences have been retained in their entirety."

In 1966, Gualtiero Jacopetti and Franco Prosperi released their follow-up to '62's trend-starting Mondo Cane, Africa Addio. Africa Addio unflinchingly documents the end of colonialism in Africa with imagery both disturbing and profound. The filmmakers travel both with poachers and game wardens as the African animal reserves lose their "white-mandated" protection, and the graphic slaughter and corpses of (yes, literally) hundreds of animals are depicted on screen. Even more shocking, the film provides the only documentation of the slaughter of 5,000 to 20,000 Arabs during the '64 Zanzibar Revolution, with footage shot from a circling helicopter. Most controversial upon the film's release, however, were several scenes of human executions, and, in particular, a scene in which a white soldier with a handgun executes a black man at point-blank range in a well-framed close-up. The filmmakers were accused of complicity, of staging this execution in order to get a "good shot" - they claim it just happened in front of them; due to the language barrier, they were totally confused and had no idea the execution was about to occur. A final controversy: the filmmakers were also accused of racism, of depicting blacks as barbaric animals unfit to govern themselves. This accusation has been widely discredited as an artifact of the bastardized versions of the film which showed in many countries. The original, intended edit is quite even-handed in its assignment of blame.

Flash forward to 1980: Ruggero Deodato releases Cannibal Holocaust, a fictional homage to / critique of Africa Addio, in particular, and exploitation ("shock") filmmaking in general. In the first half of the film, we follow an anthropologist's journey into the cannibal territory of the deepest Amazon on a search for a missing film crew. The crew is dead, but he manages to recover their footage. In the second half we see the story of the film crew revealed in their unedited footage, and the reactions of the anthropologist and the documentary's producers as they watch it. The film crew themselves are clearly modeled on (public perception of) Jacopetti and Prosperi. Before viewing the Amazon footage, the producers show the anthropologist some footage of executions from the film crew's earlier African movie, scenes clearly modeled after those in Africa Addio. When the anthropologist emits a grunt of shock, the producer says:

"Pretty powerful stuff, huh? Well, just to give you an idea how Alan and the others worked, everything you just saw was a put on. That was no enemy army approach, Alan paid those soldiers to do a bit of acting for him."

But what exactly is being criticized here ~ Jacopetti and Prosperi themselves, or some hypothetical demon created by public perception? And where does Cannibal Holocaust itself fit in? The film features numerous scenes of animal murder and mutilation - and unlike those in Africa Addio, there is no question that these acts were performed solely for the benefit of the camera. Conversely, the footage of African executions in Cannibal Holocaust is purportedly real, captured in Nigeria or East Africa. Is there a second issue here? Do we demean the memory of those who were executed even more by trivializing the circumstances of their death when we claim it is a mere "put on"? Did those who accused Jacopetti and Prosperi of complicity in the executions they captured on film efface and pervert - exploit - the memory of those who died even more than the filmmakers themselves? But if that's the case, then isn't Cannibal Holocaust, again, the most guilty party?

Ironically, we find a criticism of the exploitation of humans for the purpose of filmmaking in a film that itself commits that crime (possibly), and furthermore (definitely) exploits animals for the same purpose. Cannibal Holocaust is truly postmodern, then, in its self-referential criticism. The film accuses itself and finds itself guilty. Consider this snippet of dialogue between the anthropologist and the producer after they have watched a scene in which the documentary crew terrorizes a small village, killing a pig, in their quest for powerful footage:

"Oooo, I'm drained. You must admit, it's exceptional footage. I - I didn't expect such impact, such authenticity!"
"I don't know. I don't think exceptional is the right word."
"You don't?"
"No. I mean, what's exceptional about a primitive tribe, like the Yacumo, being ... terrorized, and forced into doing something they don't - they don't normally do."
"Come on now, professor! Let's be realistic, who knows anything about the Yacumo? . . . today people want sensationalism! The more you rape their senses, the happier they are!"
. . . .
"The Yacumo indian is a primitive and he has to be respected as such. You know, did you ever think of the ... the Yacumo point of view, that we might be the ones who are savages?"
"Well I've never thought of it that way, but it's an interesting idea."
"Yes, let's say things were reversed, right? and the Yacumo attacked your house, defiled everything that you held holy. You know that pig that was killed? That was food for those people. Now, what'd happen if somebody came to your house, when you were hungry, and took the little bit of food you had in the refrigerator, and threw it down the toilet? . . . would you behave in a civilized way? . . . would you like people to make money off your misery?"

But of course, Deodato himself is a filmmaker who has just had a pig shot on screen (although he wasn't actually terrorizing any "primitives" . . . ). In fact, Deodato and many of the participants in the film regret having made it and, in particular, the decision to kill and mutilate animals for shock value. Despite their regret, though, several of them still acknowledge the genius of the film, a genius which transcends the ethical choices involved in its filming.

Whatever retrospective view we may take of Cannibal Holocaust, the response to the director's ethical choices upon its first release was unambiguous: Deodato was arrested and charged first with obscenity, then with murder. He had convinced the actors who played the documentary crew to go into hiding for a year, but after the accusations were leveled against him, he was forced to track them down to prove his innocence. After the charge of murder was dropped, Deodato was charged with animal cruelty, and the film was banned in Italy (it is purportedly the most banned film of all time).

Cannibal Holocaust has had a profound influence on exploitation cinema and spawned many imitators, though none with the subtlety and depth of the original. In addition to at least 6 unofficial sequels, the film was being remade (animal mutilation and all) as early as 1981's Cannibal Ferox. 1999's The Blair Witch Project, though thematically very different, borrowed heavily from both the filmmaking techniques and the marketing strategy of Cannibal Holocaust.

the trailer for Cannibal Holocaust:


memorable editing moment: Rather than any particular scene, it is the overall structure of Cannibal Holocaust's editing which is to be admired. Deodato uses 35mm for the real-time narrative and 16mm for the film crew's footage, all of which is shot vérité style (i.e. hand-held, as if the filmmakers were "really" molesting primitives and interacting with cannibals). The complete consistency of this "vérité" footage is what lends the film enough authenticity and power that it could be mistaken by authorities for a snuff film. Its potency is increased by placing it at the end of the film, after its recovery by our anthropologist. We see the interaction with each tribe first through our humanist anthropologist (whose job is made more difficult by their recent encounter with the exploitative film crew), then through the film crew ~ we see the aftermath of the damage before the damage itself. Add to this the profound beauty of the cinematography and the score by Riz Ortolani (who also scored for Jacopetti and Prosperi), and the overall experience is far more powerful than the sum of its parts ~ the true mark of good editing.

"Keep rolling, we're going to get an Oscar for this!"

Friday, November 2, 2007

"irrational"

More often a codeword for disagreement than a substantive criticism of reasoning procedure.

Wednesday, October 31, 2007

icarine dreams


. . . and just when heaven seemed within our grasp, gravity in its spite did intercede, and apparent possibility did ever vanish.
. . . and hope wept.
. . . and faith did turn her back and spread her wings, abandoning her supplicants to shadows and imitations.
. . . and the night stretched on unto death, and ended thence the suffering of the innocent.

Saturday, October 27, 2007

understanding the mind IV


From Ray Jackendoff, Foundations of Language, 2002.

Friday, October 26, 2007

we are frogs

. . . . and the water is slowly coming to a boil.

I wish we were lobsters. Then, at least, we could fight our damnedest against the encroaching night, instead of letting it slowly wash over us and deplete us of our humanity.

Mankind fought for millions of years to develop the sense of self, the individual. Now that he has it, he doesn't know what to do with it. And the luddites and pessimists and politicians and any cock-sucking, shit-eating, motherfucking jackass who has a smidgeon of additional power to wield over another can't think of anything constructive to do with it but rape the poor sucker's sense of self: tell him he isn't good enough, isn't smart enough, isn't free enough, isn't capable enough, isn't worth enough without this stuck-up, pocket hitler, tin-hat dictator making him better / freer / richer / more capable.

If you think you need X (insert: "education," "money," "health care," "safety," "security," "a loan," "a car," "the fire department," "40 acres and a mule," "a girlfriend," "crack," "the environment," etc.) to be free, i.e. to be an individual, autonomous, of worth, self-determining, then you're just as much a part of the problem as the tin-hat, small-dicked, crackpot, petty Hitler-wanna-be, imaginationless, hereditary aristocracies that you keep voting into power over you: you slave.

Yes, slavery really can be voluntary: for empirical confirmation, look in the mirror, sucker.

Monday, October 22, 2007

BEEM, 2nd place: DOA (Dead Or Alive)

[Best Editing in an Exploitation Movie Award]

"I know everything about this city; I came to Japan before you were born, cooking here at this stove . . . but even after 5,000 years, each day, a different flavor. Can you understand that?"

DOA, 1999, brought together for the first time Sho Aikawa and Riki Takeuchi, the two undisputed kings of Japanese V-Cinema. Takashi Miike, the greatest director of Japanese exploitation cinema (delivering exemplary films in all exploitation categories short of straight-up pornography), has made better movies, perhaps, but none quite so excessive and exploitative. In fact, despite the thoughtful story and potent drama, it's hard to imagine DOA as anything other than a straight-to-video release. The sheer audacity of the imagery (a businessman snorting a 25-foot line of cocaine, a Sesame Street bird costume at a yakuza birthday party, a graveyard in the sand, a bonfire of bodies on a roof in the middle of Tokyo) strips the viewer of any lingering demand for realism, leaving him exposed to the emotional conflict which underlies the excess.

Thematically, the film explores the same subject as all of Miike's films: familial obligation and dynamics. In this case, the dedicated cop Aikawa's awkward and tentative relationship with his sickly daughter and untrusting wife is contrasted with the "healthy" familial relationship exemplified by (Miike and "Beat" Takeshi regular) Susumu Terajima (who, curiously, tends to play "bad cop" when the duo interacts with criminals) on the one hand, and the strained fraternity exemplified by Takeuchi's relationship with his younger brother on the other. Takeuchi embodies another Miike theme, the dispossessed: in this case, as so often, represented by 2nd generation Chinese youth in Japan, forced to bond together against both the Japanese and the 1st generation Chinese immigrants who dominate the criminal underworld. Takeuchi will stop at nothing (quite literally!) to force the other gangs out of business; Aikawa, likewise, refuses to accept the ensuing criminal rampage. The ultimate face-off between these twin stars of exploitation cinema may be disappointing from a plot standpoint, but it faithfully delivers the sheer energy and violence demanded by the preceding 90 min. of this sonnet to excess.

the trailer for DOA:


memorable editing moment: DOA begins with an absolutely mind-blowing 7 min. montage, an exemplary microcosm of plot construction. Like a shotgun blast in reverse, numerous fragmentary images (a stripper falling from a 12-storey building, police beating a schoolgirl, a businessman snorting a 25 ft. line, a gangster assassinated while fucking a rentboy in a public toilet, the blood from his severed carotid spraying the tile wall and his orgasming partner equally, a bare-chested stripper, a shotgun hidden in a clown's bicycle basket, a cascade of noodles exploding out of a Triad's stomach as it dissolves under Uzi spray) coalesce into larger and larger chunks until a coherent plot emerges: one gang attempts a takeover of another to the chagrin of the dedicated cops.

Sunday, October 21, 2007

the plot

"Consciousness is a plot to keep the parasites feasting off our bodies alive. Civilization is a conspiracy of microbes: they've tricked us into creating music, science, medicine, art, plumbing . . ."

~ Ivan Brunetti

Tuesday, October 16, 2007

doubly special relativity

Special Relativity is special because it places an upper bound on speed: speeds faster than that of light are simply impossible. What does this really amount to, though? - an upper bound on the speed of time. Why? Well, if we distinguish different moments in time by the different events which constitute them, and if every event has a cause, then temporal change is constrained to the same speed as causal change, i.e. the speed of light.

So we've constrained time, but what about space? Doubly Special Relativity posits a limit on space analogous to the limit on time. What does this amount to? - a smallest unit of length. Just as there is a fastest speed, there is a smallest size.
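To put a rough number on that smallest size: in standard formulations of Doubly Special Relativity the second invariant scale is taken to be the Planck length (equivalently, the Planck energy), built out of constants already in hand. A compilable LaTeX sketch of the two invariants, under that standard assumption:

```latex
% A minimal statement of the two invariant scales, assuming (as is standard
% in Doubly Special Relativity) that the invariant length is the Planck length.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
\[
  c = 299\,792\,458\;\mathrm{m/s}
  \quad\text{(invariant maximum speed, as in ordinary Special Relativity)}
\]
\[
  \ell_{P} = \sqrt{\frac{\hbar G}{c^{3}}} \approx 1.6 \times 10^{-35}\;\mathrm{m}
  \quad\text{(conjectured invariant minimum length)}
\]
% Equivalently, an invariant maximum energy scale:
\[
  E_{P} = \sqrt{\frac{\hbar c^{5}}{G}} \approx 1.2 \times 10^{19}\;\mathrm{GeV}.
\]
\end{document}
```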

Of course, this smallest size would be pretty damn small. However, the positing of a smallest size resolves some ancient conceptual problems about infinity. If space is a continuum, i.e. infinitely divisible, i.e. there is no smallest unit of length, then we seem to have a real infinity right under our very noses. Zeno's paradoxes all turn upon the puzzling nature of such an infinity. Of course, resolution of paradox alone isn't a good enough reason to adopt a theory, but it does increase its intuitive interest. . . .