Stereo Subversion interviews Daniel Smith — the man behind Danielson, Danielson Famile, and Tri-Danielson — about a new album, Best of Gloucester County, and Smith’s creative process:
…it’s literally sitting down and carving out time and just waiting, as I said. I’m always putting little musical ideas on a dictaphone throughout every week of every year, so there’s always these raw materials that are being archived. I don’t go back and listen to them until I’m very far away from those ideas, so there is a routine of collecting. I collect riffs and melodies and chords. I also write down favorite words and phrases from the day.
So there’s the routine, but songs aren’t being written yet. It’s just collecting things and it’s creating piles of material. For me, I really enjoy distancing myself from those pieces over time, so I’ll collect something and then leave it. When I come back to it later, I don’t have any recollection… I’m able to come back at it with a fresh approach. Then if it’s still doing something for me, then I know that I have something. So there is routine. There is a practice that’s been building for years.
I think the thing that I want to embrace more and more deals with insecurities, which is to get my mind to shut up more and more while I’m in the process of assembling and finishing these songs. My mind will say that’s no good, yet my gut will like it. For me what really does it is when that last step is coming about, whether it’s writing a song or assembling a lyric or whatever, I really try to make it happen fast and not allow those insecurities to come up and question the gut.
It’s awards season again, with the big ones — the Oscars — coming in just about a month. Already, various races are heating up and the pundits have been busy announcing the frontrunners, e.g., Natalie Portman, Christian Bale, Colin Firth. As such, it seems rather fitting that Brett McCracken raises an interesting question: why do the messed up characters get all of the accolades?
In Hollywood, truly good characters don’t get the attention and accolades that crazy, messed-up characters do. It’s much easier to win awards by playing a convincing evil (Heath Ledger in The Dark Knight) or a convincing trainwreck (Christian Bale in The Fighter) than by portraying a solid, stable, virtuous character. Even in Another Year, the marquee, award-winning performance is that of Lesley Manville — who is riveting and truthful as a single, sad, mess of a person. Why is this more riveting, truthful, and laudable to us than Ruth Sheen’s less showy, but no less truthful portrayal of a good friend? Why is Christian Bale’s wild-eyed druggie turn in The Fighter so much more acclaimed than Mark Wahlberg’s good-brother, reliable workhorse role? (a question alluded to by Bale in his Golden Globes acceptance speech).
I don’t always agree with Albert Mohler, particularly when it comes to the topics of creation and evolution… and yoga. But I appreciate his frankness and candor, as well as his willingness to shed light on various social and cultural issues that more folks ought to be aware of, e.g., gendercide in China and India, current abortion trends, and parental happiness. This time around, he looks at the deadly logic of anti-blasphemy laws:
In recent weeks, a coalition of Muslim nations has demanded (again) that the United Nations criminalize blasphemy. A considerable number of Christians might, at least at first hearing, think this as a reasonable demand. After all, we do not disagree that slander against the honor of God is a very, very dangerous sin. But anti-blasphemy laws place the power of theological coercion into the hands of the state, and this is deadly dangerous.
Kill Screen interviews David Gaider, the senior writer for BioWare, about storytelling, character interactions, and faith in Dragon Age: Origins, as well as his views on the future of video game RPGs:
The cinematic element is definitely becoming bigger. I think we’re trying to find a way to incorporate cinematics without making the fact that it’s cinematic so expensive; you just can’t have the volume of interaction that you had before, and that comes down to the art. There’s this hump that we’re getting over. Just adding in all the 3D animations and the cinematics was making the development so costly, that we were having to pare down the level and volume of interaction you were having. I think players were treating this as a bad thing, and it’s not, because there is potential there. When we watch movies, we’re seeing reactions and we’re seeing emotions. You never want to separate this element from the player’s imagination.
Since it’s a game, you want the player to have agency, and not make everything there on the page for them to see. There is a way, I think, for writing, game interactive-ness, and cinematics to get to a point where the player feels like they’re in charge. They see the writing, then see the emotional reaction on the screen in a way that’s believable, and fluid, and reactive.
The recent advances in neuroscience have provided many challenges to traditional ideas of theology, spirituality, and morality. And now, neuroscientists are claiming that they can explain why music makes us feel:
According to Meyer, it is the suspenseful tension of music (arising out of our unfulfilled expectations) that is the source of the music’s feeling. While earlier theories of music focused on the way a noise can refer to the real world of images and experiences (its “connotative” meaning), Meyer argued that the emotions we find in music come from the unfolding events of the music itself. This “embodied meaning” arises from the patterns the symphony invokes and then ignores, from the ambiguity it creates inside its own form. “For the human mind,” Meyer writes, “such states of doubt and confusion are abhorrent. When confronted with them, the mind attempts to resolve them into clarity and certainty.” And so we wait, expectantly, for the resolution of E major, for Beethoven’s established pattern to be completed. This nervous anticipation, says Meyer, “is the whole raison d’être of the passage, for its purpose is precisely to delay the cadence in the tonic.” The uncertainty makes the feeling – it is what triggers that surge of dopamine in the caudate, as we struggle to figure out what will happen next. And so our neurons search for the undulating order, trying to make sense of this flurry of pitches. We can predict some of the notes, but we can’t predict them all, and that is what keeps us listening, waiting expectantly for our reward, for the errant pattern to be completed. Music is a form whose meaning depends upon its violation.
If neuroscience can explain why music makes us feel — and I’m not so sure that it can — would that necessarily make music less sublime, affecting, and transcendent? Or would holding on to the idea that music is still all of those things simply be us deluding ourselves?
Who knew that typography could get everyone’s knickers in a twist so easily? At least, that’s what happened when I posted Farhad Manjoo’s explanation of why you should never, ever use two spaces after a period:
Every modern typographer agrees on the one-space rule. It’s one of the canonical rules of the profession, in the same way that waiters know that the salad fork goes to the left of the dinner fork and fashion designers know to put men’s shirt buttons on the right and women’s on the left. Every major style guide — including the Modern Language Association Style Manual and the Chicago Manual of Style — prescribes a single space after a period. (The Publications Manual of the American Psychological Association, used widely in the social sciences, allows for two spaces in draft manuscripts but recommends one space in published work.) Most ordinary people would know the one-space rule, too, if it weren’t for a quirk of history. In the middle of the last century, a now-outmoded technology — the manual typewriter — invaded the American workplace. To accommodate that machine’s shortcomings, everyone began to type wrong. And even though we no longer use typewriters, we all still type like we do.
I’m a one-spacer myself, but I don’t discriminate against two-spacers. Indeed, some of my best friends are two-spacers.
Walter Murch, one of the most respected editors and sound designers in the business, recently wrote a letter to Roger Ebert explaining why 3D movies don’t work and never will:
The biggest problem with 3D, though, is the “convergence/focus” issue. A couple of the other issues — darkness and “smallness” — are at least theoretically solvable. But the deeper problem is that the audience must focus their eyes at the plane of the screen — say it is 80 feet away. This is constant no matter what.
But their eyes must converge at perhaps 10 feet away, then 60 feet, then 120 feet, and so on, depending on what the illusion is. So 3D films require us to focus at one distance and converge at another. And 600 million years of evolution has never presented this problem before. All living things with eyes have always focused and converged at the same point.
The news that an upbeat attitude will not, according to recent studies, save you from illness is good news for grumpy-prone individuals such as myself who would nevertheless like to live nice, long lives:
The psychosomatic hypothesis, which was popular in the mid-20th century, held that repressed emotional conflict was at the core of many physical diseases: Hypertension was the product of the inability to deal with hostile impulses. Ulcers were caused by unresolved fear and resentment. And women with breast cancer were characterized as being sexually inhibited, masochistic and unable to deal with anger.
Although modern doctors have rejected those beliefs, in the past 20 years, the medical literature has increasingly included studies examining the possibility that positive characteristics like optimism, spirituality and being a compassionate person are associated with good health. And books on the health benefits of happiness and positive outlook continue to be best sellers.
But there’s no evidence to back up the idea that an upbeat attitude can prevent any illness or help someone recover from one more readily. On the contrary, a recently completed study of nearly 60,000 people in Finland and Sweden who were followed for almost 30 years found no significant association between personality traits and the likelihood of developing or surviving cancer. Cancer doesn’t care if we’re good or bad, virtuous or vicious, compassionate or inconsiderate. Neither does heart disease or AIDS or any other illness or injury.
Richard Clark argues that video games have killed authorial intent:
Christians love to go on about the importance of authorial intent, and I used to do the same. When it came to novels, movies, and music, the main question at hand was “What is the author trying to say?” We can go from there. Lately, I’ve started to move a little on that issue, but that’s a topic for another, longer post. With games, though, it’s clear: it’s not about the author. In fact, the authors are so numerous, and the development cycle is such a collaborative endeavor, that the author’s intent is nearly impossible to isolate most of the time. Not to mention that the best games excel at providing an experience that differs drastically from one player to the next. The developers can guide this experience, but subtle differences in how the game is played can change the meaning drastically.
Elsewhere, Clark discusses the ethics of video games in light of an upcoming ultra-violent video game called Bulletstorm:
We’ve seen these features before. Games have always been violent, and they’ve often reveled in brotastic protagonists who serve as ciphers for our own power fantasies. Games have always featured lame and offensive dialogue. The difference between those games and Bulletstorm, though, is the intentional nature of all of these things.
Bulletstorm is meant to shock and offend, but ultimately, to titillate. While games have portrayed violence and shocked the public since the beginning, rarely has such a mainstream AAA title been so blatantly unabashed about the nature of the game. In the ’70s, the arcade game Death Race, arguably the cause of the first controversy over video game violence, was conceived with an obvious naiveté.
The marketing low point? The infamous commercial in which Cliff Bleszinski sarcastically brags, “I made a video game where you can blow out another man’s ass-hole.”
And yeah, it’s possible. That commercial ends with a quick cut to a man’s anus being shot with burning bullets, and overflowing as he screams bloody murder and falls face first to the ground. In fact, it’s the sexual subtext of much of the dialogue, marketing and in-game text and actions that is most disturbing.
By encouraging players to pull off such skillshots as “Facial”, “Gang Bang” and “Bad Touch”, Bulletstorm becomes far more than just another violent videogame. Mortal Kombat’s spine-removal and explosive blown kisses seem perfectly reasonable (and very well may be) in the face of Bulletstorm’s seemingly complete lack of any social responsibility.
Andy Whitman proves once again why he’s one of my favorite bloggers, with this poignant and heart-wrenching article about, um, death panels:
Now, with the debate raging around me again, with every histrionic, strutting politician loudly weighing in with an opinion, I recall scenes that I would rather not recall.
The nurses in the hospice unit eventually taped my father’s eyes shut. The dancing eyes were purely muscle reflex, they assured me. Nevertheless, they were spooky. I think about them every time I read another sanctimonious editorial about encroaching euthanasia.
“You have no idea,” I want to respond. “There is life that mimics life, but is not life. It is nothing at all.” I suspect I want to respond this way so I can feel better about myself, to justify my decision, to assuage the nagging feeling that, in spite of all medical evidence, I live with doubts and fears.
Khoi Vinh reflects on Google’s recent CEO shake-up and wonders about the impact on Google’s design sensibility:
We tend to think that design is a function of good process, well-structured organizations, and copious time and budgetary resources. But design is just as much a function of leadership. Who’s in the top seat matters very much to whether a company can design well. If the leader cares passionately about producing amazingly well-designed products, then you can get a string of indelible successes that capture the popular imagination like we’ve seen at Apple for the past decade-plus. We haven’t seen that kind of result from Google during that same span of time, though. Beyond the iconic minimalism of the original Google home page, not one of their subsequent products has truly inspired us. I hope that Larry Page realizes that, with the resources and design talent he probably already employs, there’s no reason that has to continue to be the case.
Elsewhere: A collection of interesting links and articles that I’ve come across in the last week or so. Follow me on Twitter for more of the same.