Another tragic high school shooting spree, this time in Germany. And yet again, the media cannot cover it without having to shoehorn in a mention of the fact that the killer played videogames.

This from The Guardian 12 March 2009: “Testimonies of friends and acquaintances today portray the table tennis champion as a lonely and frustrated person who felt rejected by society. A 17-year-old who gave his name as Aki said he had been studying alongside Kretschmer at a private business school in the region and described him as a quiet and reserved boy who enjoyed playing a multiplayer video game called Counter-Strike that involves carrying out assassinations to complete missions. ‘He was good,’ he said.”

The full story in the paper is a reasonably standard piece of reporting, but this mention of Counter-Strike seems entirely without context, crudely chucked in as an aside because, for some reason, news reporters seem to feel obliged to mention videogames whenever a young man goes on a tragic killing spree or commits a particularly violent crime. It’s knee-jerk nonsense, not proper journalism. I’d expect it from a tabloid or a right-wing paper, but The Guardian?

The point is that millions of people worldwide play videogames; indeed, probably as many people play games as watch movies, or read books – or newspapers. Videogames are just another cultural medium, they’re not training grounds for killers. So yes, the perpetrators of the Columbine massacre played Doom, another young American killer played Grand Theft Auto, and now this unfortunate young man is known to have played Counter-Strike. That means that, of all the videogame players in the world, an infinitesimally tiny number are also killers, if you were to look at it in terms of a Venn diagram. Well, of all the killers in the world, some of them probably watch movies, read books and watch TV too, yet whenever a young murderer is reported in the news, it’s very unlikely their television viewing or reading habits will be mentioned.

We’ve been here before so many times – 19th century penny dreadfuls were accused of corrupting the youth, as were comics in the mid-20th century, as were so-called “video-nasties” in the 1980s. Well, I’ve read a lot of comics, enjoy horror movies, and play first person shooter videogames, but my mind hasn’t been shaped into that of a desperate killer.

I’ve written about this before. I do wish the media wouldn’t keep casually throwing in ill-considered, implied connections between violent cultural items and acts of violence. I’m not saying that culture doesn’t nurture, and it’s possible this young man’s sensibilities were in part shaped by what he consumed culturally, as well as by how he was brought up, his relationship with his parents and peers, his education, etc. What I am saying is that nurture and the formation of personality and self are incredibly complicated. Journalism that insists on mentioning “videogames” in the same story as “school killing spree”, without more in-depth, considered research and evidence, is just misleading, mob-baiting, irresponsible and plain lazy.

I consider myself a reasonably bright person, but one area I most decidedly struggle with is economics. When the newspaper goes on about “tumbleweeds blowing down your high street”, I can’t quite reconcile it with a trip to the West End of London. Piccadilly Circus at 7.30 on the evening of Sunday 26 January was entirely comparable with the same spot a year ago, or two years ago. Thousands of people in a frenzy of consumerism, clutching thousands of branded bags (all destined for the landfill), containing thousands of pounds’ worth of tat (much of that probably also destined for the landfill), and not a tumbleweed in sight. I’m really starting to feel the pinch of the recession, but witnessing these scenes – am I missing something?

Treated myself to a trip to the cinema yesterday, to watch the British horror film The Children. It was playing at the Empire Leicester Square, a cinema that boasts my favourite auditorium – the huge Empire 1, which has a fab, subtle lightshow before the programme starts.

The Children, being a lesser release, was in Empire 3, a pokey little auditorium up some stairs. The size wasn’t the problem though. Everyone shuffled in and found seats in the dark, with only the bright green glow of the Exit sign and the light from the adverts playing on the screen to light the way. This darkness was perfect for watching images on the screen, but not ideal for finding seats before the main feature had even started. Then when the feature did start, the lights suddenly went up, so much so that I could see reflections glinting on the bald head of a chap two rows in front. It was rubbish. Especially for a horror film.

A lot of others shuffled and grumbled, but being Brits no one got up to ask about it. One chap might have done, but still nothing was done about the lights, so I went and asked. I met a chap on the stairs who stuck his head out and said “No, they’re the safety lights.” Eh? Safety lights now have to be so bright that they reflect on the screen and dull the image? That defeats the object of the cinema experience, where the illumination comes from the image on the screen and any other light source is a distraction, be it a bright green Exit sign too close to the screen or the light pen of a tiresome numbskull journalist who never mastered the skill of taking notes in the dark.

I toyed with the idea of leaving, but the film was gripping. Indeed, The Children, along with Eden Lake, has held the torch high for quality British horror in 2008.

Afterwards, the same guy from my row was at the ticket counter asking what was wrong with the lights, so I joined in too. The woman there was joined by the guy I’d spoken to before, who may well have been the manager. He said the lighting should have been on similarly during the adverts, because it was required safety lighting. He explained that Westminster Council’s fire officer had recently been round and said all the lighting needed to be increased. So great, Britain’s disproportionate, nannying health and safety culture is now buggering up the cinema experience too. Westminster is particularly pedantic, one of the worst governing bodies when it comes to health and safety, so this may well mean all cinema experiences in central London are now ruined.

Now, any sensible, logical person knows that putting a sign up by a ditch saying “Beware, ditch” won’t stop people occasionally tripping into that ditch. Accidents happen, full stop. They’re freak, fluke matters of chance. Legislation cannot prevent accidents happening. In a cinema auditorium, the bright green light of the emergency exit sign is highly visible, even if you have poorer eyesight. And indeed, if you do have poorer eyesight, presumably you’ll be wearing your contacts or glasses if you’ve gone to watch a film. Insisting on bright ambient light in a cinema during the main feature – the bit you’re paying for – is only going to ruin the experience. The manager guy did say they were hoping to redevelop Empire 3, but if Westminster now insists on X candela that probably won’t help.

These days, many people have big TVs and can control the light levels in their own homes – where, thankfully, a health and safety officer can’t knock on your door and tell you to turn on your lights. If people have this option at home, more and more will stay away from cinemas that are themselves now legally required to bugger up the light levels in auditoria and ruin the fundamentals of the experience. If, like me, you love going to see movies in a cinema, where that beam of light is projected over your head onto a screen that by and large is still bigger than most TVs (and certainly our old CRT), creating that unique atmosphere that’s intoxicated punters for more than a century, this is a tragedy.

Addendum:

During the few weeks after I posted this, I went to a couple more central London cinemas, such as the Cineworld in the Trocadero and the Odeon on Shaftesbury Avenue. Neither had lighting as offputting as in that specific Empire auditorium, so either it’s not a Westminster council edict, or if it is, these other cinemas are yet to act on it.

Terminator HUD

October 9, 2008

Like many males of a certain age, I’m a fan of the Terminator franchise. Or at least, I’m a fan of some of it, particularly the intense first film (which I saw at the British equivalent of a “sleep-over” as a teen, after its 1984 release). Many argue Terminator 2 (1991) was the better film, but making Arnie cute was a mistake, and frankly, T2 is a feature film constructed around special effects. Cameron, the great technological innovator, was so keen to show off the advances he’d been making with early CGI, after the trailblazing water tentacle in The Abyss (1989), that the story – to my mind – was compromised. Indeed, the whole franchise was compromised by the very presence of the mercury T-1000, a Terminator model completely at odds with the pistons-and-plating hardware of the T-800 101 (the Arnie model), as designed by the late, great Stan Winston. If Skynet can produce T-1000s, why even bother with more traditional robotics? Maybe a true Terminator geek can explain that to me – maybe the T-1000s require impractical amounts of resources and energy to just roll off the Skynet production line.

Anyway, that’s not what I meant to write about. There’s something else that’s always really bugged me about the Terminator franchise, about the world of Terminator. I can handle (though not necessarily get my head around) the sundry paradoxes thrown up not just by the time travel theme, but also by the very fact that time was passing between each movie instalment, requiring tweaks to the timeline. I can even get my head around the divergence after T2 into the lame T3 and the frequently excellent Sarah Connor Chronicles. Heck, I’m even excited about T4, as finally it’ll get us into the world after Judgment Day. I wrote a big preview of T4, aka Salvation, over here. At this stage, my excitement about seeing armies of T-600s and newly created T-800s and what have you stomping on the bones of humanity is overriding my anxiety over it being in the hands of McG, the man who made the execrable Charlie’s Angels movies.

No, the thing that’s been bothering me is the Terminator HUD. It’s been bothering me for years, but after watching the latest Sarah Connor episode, my brow furrowed again. Look, here’s my point. A Terminator, a real Terminator, not that T-1000 liquid nonsense (sadly revived for the series – bah), is a robot with a computer brain. The data it would receive through its visual sensors, basically some sort of sci-fi camera lenses, would immediately be converted into binary and read by the computer’s central processor. So why then do we have these ridiculous Terminator POV shots where there’s an interface, with scopes and on-screen text? “TARGET IDENTIFIED: JOHN CONNOR. TERMINATE”. It’s like the head-up display of a fighter pilot or gunner, or even like the UI of a computer game. It just doesn’t make sense because, as a digital entity, all the data received by a Terminator would be integrated and processed pretty much simultaneously in the form of binary. Even if you take the justification from within the fiction that the original Terminator designers in the US military needed a UI, surely Skynet would have phased these out?

Sure, it’s fantasy, the HUD is an effective dramatic device and all that, but seriously, it’s silly. The Terminator isn’t using a screen, its lenses are gathering data to be interpreted by its CPU. Robots don’t need a HUD or a UI for crying out loud! The presence of a HUD or UI in the Terminator POV is almost as daft as having them move a cursor around with a mouse.
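
To put the point another way, here’s a toy sketch (in Python, with entirely made-up names and thresholds – nothing from the films) of how a machine’s vision pipeline goes straight from sensor data to a decision, with no human-readable overlay rendered anywhere along the way:

```python
# A toy illustration: a detection result can feed a decision directly,
# with no "TARGET IDENTIFIED" text ever being rendered for anyone to read.
from dataclasses import dataclass


@dataclass
class Detection:
    label: str          # e.g. "john_connor" (hypothetical label)
    confidence: float   # 0.0 to 1.0


def decide(detections: list[Detection], target: str) -> str:
    """Compare values and return an action; no HUD, no on-screen text."""
    for d in detections:
        if d.label == target and d.confidence > 0.9:
            return "terminate"
    return "keep_searching"


print(decide([Detection("john_connor", 0.97)], target="john_connor"))
```

The overlay exists for us in the audience, in other words, not for the machine.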

Mega-SAD

September 9, 2008

Well, Britain has just experienced its second horrendous non-summer in a row. Last year it rained pretty much non-stop for most of July and August, and this year we’ve had the “most overcast August since records began”. Great.

I remember seeing bizarre futuristic mock-ups in a newspaper a few years ago of what Britain would look like with the effects of global warming – palm trees alongside new bodies of water in Merseyside or somesuch nonsense. Er, guys, it looks like climate change actually means the usually fickle British weather is going to disintegrate further into blurred seasons and a general morass of grey, warm weather. Last winter, I wore my winter coat probably three times, as it was so mild. The pond in my local park froze for about one week in December, but that was about it for zero-ish temperatures.

I like seasonality. I like the promise of some sun in the summer and some cold crispiness in the winter. Well, tough – that seems to be getting less and less reliable now. An entire summer of rain and grey skies. Sheesh.

I used to joke Britain’s cruddy weather was one reason we were, in days of yore, such great explorers and imperialists, thrusting ever outwards to warmer climes. Now it just feels like we’re going to be stuck with a mega case of SAD, except it’ll be all year round. Summer sun is supposed to lift you. Well not this year. Mega-SAD and a recession. Good times.

Maybe we’ll get some hot sunny weather in December that’ll check the SAD…

Making wedding cakes

August 26, 2008

In terms of immediate, tangible responsibility, this is one of the most demanding jobs I’ve ever done. I’ve been making cakes all my life, but I’m not a professional baker, even less a patissiere, so this was quite a challenge – especially when you consider the end result has to not just please the couple getting married (in this case, two friends), but also their one hundred guests. Yikes!

I’m confident when it comes to sponge cakes and variations thereon, but our friends Sara and Mike – I suspect at the suggestion of my missus, Fran – primarily wanted a croquembouche. This is a French wedding cake that involves piling choux buns filled with a crème pâtissière and whipped cream mix up into a tall cone, all glued together with hard crack sugar. The name means “crunch in the mouth.” I don’t think I’d made choux pastry for about a decade, never mind the fact that I’m not that au fait with sugarcraft. Double yikes!

We experimented, making a few mini-croquembouches. One can be found over at Cake-Off. It’s a bit wonky, but was otherwise fairly successful. The challenge then was doing the same but about 10 times the size. Oh, and making the back-up cake too. I suggested this for lack of confidence over the croquembouche. I would also make a tiered cardamom cake, covered in chocolate icing. The cardamom cake is an old fave from a Mollie Katzen cookbook that I’ve been making for years.

Anyway. So, here’s the making of the three-tiered version of the cardamom cake. Never made quite such a vast cake mixture – all that creaming of sugar and butter was hard work (about 1.5kg of each, plus the same of flour, 1.5l of sour cream, a dozen eggs, etc). No wonder Victorian cooks had such beefy arms…

I made a ganache for the filling, then an icing with melted chocolate, butter, water and icing sugar to cover the whole thing. It’s very rich and pretty dense. Here’s a slightly rubbish pic of the finished cake:

The main event, however, was the croquembouche. I started making choux buns at 8am on the day of the wedding, but managed to screw up two double-quantity batches initially, which didn’t help. They weren’t quite crisp enough. I left them in a warm oven to crisp up, but they weren’t ideal. So I just kept making batches. Must have made a hundred or more choux buns that morning, then dipped them all in sugar boiled to 160C (hard crack stage), and filled them with the creme/cream mix. This pic shows just some of the choux buns, as well as the nougatine base Fran made:

I was loosely following the Roux brothers’ croquembouche recipe, though I found Delia Smith’s choux paste recipe more reliable. Different croquembouche recipes suggest different means of making the cone of buns, but we got a stainless steel mould, 480mm (19 inches) tall. Some recipes suggest you just make the cone one layer thick, but we thought, 100 people, why not fill the whole thing – especially as we’d made so many choux buns. I was hoping it would give the finished cone better structural integrity too…

Here’s the finished thing, turned out, relieved of its greaseproof paper (which prevented it from sticking inside the mould), and decorated with a few crystallised violets:

It’s not quite as refined as the pics I’ve seen of ones made by the professionals, like those crafty Roux chaps, but it wasn’t bad. After a slightly worrying taxi ride across London, both cakes reached the venue, with only slight damage to the top of the croquembouche (minor squashing). Mike and Sara seemed pleased. And how many times have you been to a wedding with one of those nasty traditional frou-frou fruit cakes, where the bride and groom end up with most of it sitting in their cupboard for months afterwards? These results speak for themselves:

[Apologies for the somewhat messy pic spacing. Can never get the pic and text integration right…]

PC gaming = eco doom?

May 20, 2008

I’ve been PC gaming on and off for the past five years. Getting into writing about videogames, I replaced my quaint old iMac with a then fairly robust PC.

Without any shame I can say that the short list of my best ever cultural experiences includes playing Sony’s now slowly dying Planetside, a game that really merited considerably more acclaim than it ever got. Planetside is a massively multiplayer first-person shooter, and pretty much remains one of a kind. When it was launched in 2003, it opened up to me the wonders of massively multiplayer online gaming, a source of epic experiences (adventures, battles) and deep camaraderie unparalleled in console gaming. That segued into playing World of Warcraft, another masterpiece of a game, but one with considerably more dubious qualities (notably as a time-sink: at least with Planetside, a game based on persistent combat played out between three armies, you could jump in and out pretty much at will; in World of Warcrack or other similar MMO role-playing games, you frequently get locked into activities that take several hours and are hard to extricate yourself from due to responsibilities to other players).

Anyways, I kicked World of Warcrack a few times, as I find its pleasures tend to work in cycles – you level a character, and it’s easy-going and fun; you hit the level cap, and it all turns into gaming-as-accountancy, with a preponderance of more dedicated players obsessing over statistics and gear, and suggesting – rightly enough – that you can’t “progress” without a similar obsession. For me, the cycle this time round has coincided with my PC really reaching the end of its life as a viable games machine.

A lot of friends are moving from WoW to Age of Conan, a new-fangled MMORPG that requires some pretty hefty graphics processing abilities and whatnot to run at its more impressive graphics settings. My PC has been giving up on me a fair bit recently when it comes to gaming – the most recent chapter of Valve’s masterpiece Half-Life 2 hung on load screens for way too long to really qualify as much fun any more, and the expansion pack to the superb WWII-themed real-time strategy game Company of Heroes just resulted in things grinding to a halt if more than a few dozen little men and tanks were going about their business on my screen.

So I started looking around at replacement PCs. Ones that I could work on (I work from home, but that generally only requires basic Word, Outlook and Firefox) but that could also handle the latest games with the graphics on their higher settings, with enough future-proofing to handle upcoming games too. Weeks of discussions about components left me slightly better informed (I’m very averagely techie when it comes to PCs; I can help my parents, but need help from my younger, or more experienced, peers), but a letter in Personal Computer World magazine got me thinking… these bastard machines really are hideously environmentally unsound.

As with so much technology (TVs, for example) it’s all about more, bigger, faster, brighter – which equates with more powerful, which of course equates with higher power consumption. If a basic, older laptop can work on 50-100W, a big gaming desktop PC rig seems to push things up to 500W plus. One machine that was suggested as suitable for my budget by one online retailer of bespoke computers had a power supply that could handle up to 1000W. That doesn’t mean it’ll constantly be drawing 1000W, but with the latest, top-of-the-line graphics card, when gaming, the power draw is pretty hefty. Combine that with CPU, hard drive and fan/cooling hardware activity and it’s scary.
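
For a rough sense of scale, here’s a little back-of-the-envelope sketch – all the wattages, hours and the carbon factor below are illustrative assumptions rather than measurements:

```python
# Rough, illustrative comparison of annual gaming energy use.
# Assumed figures: a ~75W laptop vs a ~500W gaming desktop under load,
# three hours of play a day, and ~0.5 kg CO2 per kWh of grid electricity
# (a rough late-2000s UK figure, assumption only).

HOURS_PER_DAY = 3
DAYS_PER_YEAR = 365
KG_CO2_PER_KWH = 0.5


def annual_kwh(watts: float) -> float:
    """Convert a steady power draw in watts into kilowatt-hours per year."""
    return watts * HOURS_PER_DAY * DAYS_PER_YEAR / 1000


for label, watts in [("basic laptop", 75), ("gaming desktop under load", 500)]:
    kwh = annual_kwh(watts)
    print(f"{label}: ~{kwh:.0f} kWh/year, ~{kwh * KG_CO2_PER_KWH:.0f} kg CO2")
```

On those made-up numbers, the desktop comes out at several hundred kilowatt-hours a year more than the laptop – which is roughly the gap I’m fretting about.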

So are PC gaming and environmental awareness mutually exclusive? Well, I could get a lower-spec machine, but it’d be less future-proofed, resulting in more financial outlay sooner rather than later; I could get a high-spec laptop, which would probably halve the power consumption in comparison to a loosely similar level of desktop PC, but they’re expensive (about a third more again), and highly restrictive in terms of upgrading components and extending their lives. And laptops are less comfortable for prolonged writing; I know you can plug in peripherals like more comfortable separate keyboards and bigger screens, but that starts to defeat the object of having a laptop, even a high-spec desktop-replacement gaming laptop, which will be big but still kinda portable.

There is some new technology on the way from processor manufacturers – Nvidia’s new Hybrid SLI, in the context of desktops, includes a function to switch which GPU is doing the work depending on whether the PC is being used for demanding activities like gaming (or graphics rendering or whatever) or for basic functions (email, word processing, browsing), thus reducing noise and power consumption. Nvidia call this HybridPower – check it out. It’s hardly going to save the world, though.
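
Conceptually the switching idea is simple enough – something like this toy sketch, where the names, labels and workload list are all invented for illustration rather than being Nvidia’s actual scheme:

```python
# A purely illustrative sketch of switchable graphics: route light workloads
# to a low-power GPU and only wake the power-hungry card for demanding ones.

DEMANDING_WORKLOADS = {"gaming", "3d_rendering", "video_encoding"}


def choose_gpu(workload: str) -> str:
    """Return which GPU (hypothetical labels) should handle the workload."""
    return "discrete" if workload in DEMANDING_WORKLOADS else "integrated"


for task in ["email", "word_processing", "browsing", "gaming"]:
    print(f"{task} -> {choose_gpu(task)} GPU")
```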

This really is a bind. I’m a fairly green person – I don’t own a car, and we try to be considerate with our domestic energy usage (switch things off at the wall, replace light bulbs with low-energy ones, insulate the roof, and all that) – but this is an interesting challenge to my principles. For me, PC gaming means hanging out with friends – some men are in sports teams, or spend a lot of time in the pub, but I play games. As well as loving the culture of videogames (something I’ve written about professionally for six years or so, but do less of now after the BBC killed off dear old Collective, their excellent online culture magazine), for me they’re also about socialising. But now that seems to go hand-in-hand with yet another way of making a disproportionate contribution to the slow apocalypse of our age.

My cultural life has generally been about books, comic books, films and games. I don’t do theatre or football. Do I really have to give up a major part of my cultural and social life because it comes with a seemingly ever-increasing eco price tag?