Copyright and the crowdsourcing of promotional materials

There was a rumour that Facebook was allowing third-party advertisers to use users' photos in promotional materials. I imagine a photo of me and my friends enjoying Bacardi responsibly being scraped for use in an advert.

It turns out that rumour is false. But it's not totally groundless - it's based on the fact that some third-party Facebook applications were behaving in this way. Facebook put a stop to it, and are to be commended for that. In the general case, though, it's entirely believable that this sort of thing may still happen, on Facebook or elsewhere, and it's fascinating for me to think about why I personally have an objection to it.

I mean, if some company does use your photo in their promotions, then of course some people would feel this is a privacy issue, simply because they are shy about having their personal photos widely disseminated outside their immediate social circle - pasted up on billboards or whatever. They have a negative emotional reaction to that, and that's understandable.

Many people, however, would instead have a positive emotional reaction. If my photo of me and friends enjoying Bacardi was chosen to be used in an ad, it would actually be kinda cool and funny. However, I *still* would object to them doing it. Even though I'm not losing anything on this deal, and in fact I'm gaining an amount of amusement and notoriety, I still don't want Bacardi to reap the benefit. Why is that?

For me, it's this: Bacardi - who, incidentally, have done nothing wrong; they are just my hypothetical example. In fact, let's call them BigCorp instead. BigCorp expect to be able to use my photo, without reciprocating in kind. If I had tried to use BigCorp's images or music or logo for my own purposes, and it had come to their attention, then they would have sued me into the ground.

It may be that they had little choice in the matter. That they were merely acting in a way that they perceived the prevailing legal and commercial environment obliged them to do. Regardless, the upshot from my perspective is that they would have sued me into the ground.

And that isn't cool. There is a groundswell of resentment in me at the way our cultural heritage - all the music and TV and movies and images and adverts and logos that surround us, the things that have come to form our whole cultural milieu - has been prised from our fingers by gradually expanding legal boundaries, such that none of us, as individuals, owns it any more.

Copyright used to be of limited scope. It was designed to prevent the wholesale appropriation of creative works such as books, which were being copied and republished and sold by third parties unconnected with the author. This is a fairly obnoxious behaviour, and one I'm prepared to condemn, and I accept that introducing copyright to prevent it seems like a good idea.

However, since the 1970s the scope of what copyright applies to has expanded and expanded, way beyond its original remit. Instead of being triggered by the commercial activity of a large book (re)publishing entity, it is now triggered and invoked by the tiniest of non-commercial personal behaviours: not just publishing books, but giving a song to a friend, or drawing a comic for your blog in your bedroom that incorporates the distinctive likeness of a character you like.

Things that used to be common and socially acceptable (eg. giving an album you like to a friend) are now clearly illegal and can get you sued for your life savings plus draconian life-changing conditions. We can no longer give the songs we like to our friends. We can't post parts of our culture on YouTube. We can't put it on our blogs, or use it in any way we feel like. We have been unwillingly converted into pure consumers of culture, instead of participants. We have been robbed of something we used to have.

I strongly agree with Banksy, who feels that if any corporate image or logo or advert is shoved into my face in a public space (e.g. billboards, but also tv or on the web), then it is, from that moment on, mine to do with as I wish. If I wish to appropriate the image, or augment it with daubings of my own, then why should anyone have the right to stop me? (*huge separate discussion reqd here to justify this, obviously. Maybe later.)

But this, obviously, has been utterly repudiated by the copyright industry, which, acting in concert with BigCorp lobbyists, has used legal strong-arm tactics to deny me the ability to so much as sing Happy Birthday in my own restaurant without paying royalties, never mind ripping off BigCorp advertising materials to form my own pastiche. I resent that. I've been putting up with it for years, and in return BigCorp has earned my ill-will on this topic.

So in answer to the question "Can Facebook's partners use your photos for promotional purposes? It'll be funny and cool!" I have to reply "No you can't. I agree that it would be kinda funny and cool, but on this topic you have seriously annoyed me. We are not friends. You cannot use my photos. Now piss off."

Home media center

I just bought a Netgear ReadyNAS Duo to connect hard drives to my home network, to stream movies and the like to our fabulous Xbox Classic media center. In the process of researching, I was wondering whether the kind of hard drive connection matters. I mean, if you plug USB hard drives into a device like that, does it run fast enough to stream one or more movies simultaneously? How many simultaneous movie or audio streams would your average home ethernet carry? My first stab at answering these questions is below.

On the left are various network and hard drive connection technologies. On the right are various uses to which I might want to put them. You can't use a slower connection (eg. bluetooth) to drive a faster usage (eg. blu-ray quality movies). Centre column is the data rate in megabits per second (Mb/s):

EDGE mobile phone    0.23
                     0.3  cd audio
bluetooth1           0.7
                     1.3  minimal video
bluetooth2           2.1
wifi 802.11b         4.5
                     5.0  dvd mpeg-2 quality
ADSL1                8.0
ethernet 10baseT    10
USB1                12
                    15  hdtv video (from 8 to 15)
ADSL2+              24*
cable modem         30
                    40 blu-ray disc
wifi 802.11g        54
firewire800 act     65
ethernet 100baseT  100*
PCI                133
USB2 actual        240
firewire 400 theo  400
USB2 theoretical   480
wifi 802.11n       600
firewire 800 theo  800
Seagate Barracuda  960*
ethernet gigabit 1,000
SATA-150 theo    1,500
SATA-300 theo    3,000

* = my setup

I'm assuming that I don't have gigabit ethernet, because I've never paid it any attention in the past. Judging from the above, my 100BaseT should be more than adequate, but will be the weakest link. So that'll be the first thing I look at if streaming seems sub-par. Coolio!
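
As a sanity check on that conclusion, here's some throwaway Python estimating how many simultaneous streams each of my links could carry. The figures are just the ballpark Mb/s numbers from the table above, nothing more scientific than that:

# Ballpark streams-per-link arithmetic, using the Mb/s figures from the table.
links = {'wifi 802.11g': 54, 'ethernet 100baseT': 100, 'ethernet gigabit': 1000}
streams = {'cd audio': 0.3, 'dvd mpeg-2': 5.0, 'hdtv': 15.0, 'blu-ray': 40.0}

for link_name, link_rate in sorted(links.items()):
    for stream_name, stream_rate in sorted(streams.items()):
        count = int(link_rate // stream_rate)
        print('%s: roughly %d simultaneous %s streams' %
              (link_name, count, stream_name))

So 100baseT gives me something like twenty DVD-quality streams in theory - plenty, even allowing for generous real-world overheads.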

Update: Everything works swimmingly. I've had no problem with streaming speeds. Problems *have* occurred with some .avi files which appeared to have invalid interleave cross-stream differential parity (or something) and efforts to reverse their polarity were to no avail (transcoding software generally wouldn't even read the files!) A quick visit or two to MiniNova fixed all that.

A Pythonic 'switch' statement

I've recently had the pleasure of providing some assistance to my lovely wife through her first serious Python coding, and one of many things she expressed surprise at was the lack of a 'switch' statement. At the time, I advised her that such a statement is superfluous, and that she should simply use an if...elif...else instead. I then forgot all about it.

Until today, when I found myself refactoring a Pythonic kind of switch into my own code.

I started with this ugly little lump:

def convert(self, params):
    action = params[0]
    if action == 'M':
        x, y = self.get_point(params)
        current_path = [(x, y)]
    elif action == 'L':
        x, y = self.get_point(params)
        current_path.append((x, y))
    elif action in 'zZ':
        if current_path[0] == current_path[-1]:
            current_path = current_path[:-1]
        if len(current_path) < 3:
            raise ParseError('loop needs 3 or more verts')
        loops.append(current_path)
        current_path = None
    else:
        msg = 'unsupported svg path command: %s' % (action,)
        raise ParseError(msg)

This is from the guts of an SVG parsing module I was hacking up, but what it actually does isn't important. Its only salient feature for today is that it consists of a big switch-like if...elif...else statement. I was going to be adding plenty more cases to this logic, and it was sure going to get ugly. How can we make it better?

First, I extract the logic from each branch of the if into functions. In this case, I chose to make them methods of the current class. Standalone functions (outside the class, without a 'self' parameter) would also work, if they didn't need access to shared state.

def onMove(self, params):
    x, y = self.get_point(params)
    self.current_path = [(x, y)]

def onLine(self, params):
    x, y = self.get_point(params)
    self.current_path.append((x, y))

def onClose(self, params):
    if self.current_path[0] == self.current_path[-1]:
        self.current_path = self.current_path[:-1]
    if len(self.current_path) < 3:
        raise ParseError('loop needs 3 or more verts')
    self.loops.append(self.current_path)
    self.current_path = None

def onBadCommand(self, action):
    msg = 'unsupported svg path command: %s' % (action,)
    raise ParseError(msg)

Again, don't worry too much about what these functions actually do. Just note that I've pulled the logic out of each branch of the if...elif...else statement into separate handler functions.

Second, I define a dictionary which maps action characters to one of the new handler functions:

def convert(self, params):
    lookup = {
        'M': self.onMove,
        'L': self.onLine,
        'Z': self.onClose,
        'z': self.onClose,
    }

Notice how the methods are bound to self, so they operate on the current object as you would expect. If you used stand-alone functions instead, they would not need any 'self.' qualifier here.
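
For illustration, here's a hypothetical sketch of that stand-alone variant - assuming onMove, onLine and onClose had been written as module-level functions rather than methods, the dict would simply be:

# Hypothetical sketch: the same lookup built from module-level functions.
# No 'self.' is needed, but the handlers must then be passed any shared
# state explicitly instead of finding it on the instance.
lookup = {
    'M': onMove,
    'L': onLine,
    'Z': onClose,
    'z': onClose,
}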

Third, use the dictionary to look up the function we want to call, and then call the returned function:

handler = lookup[action]
handler(params)

These two lines can be tidily combined into one:

lookup[action](params)

Note that this is pleasantly succinct, but still very explicit about what's going on. We're looking up a value in a dictionary, using the d[key] syntax. The returned value is a function, and we are calling it, passing 'params', using the f() syntax.

Python tries very hard to always clearly expose the details of what is happening to the reader like this. Nothing magically happens behind the scenes. And yet, by the good judgement of Guido and the healthy process that surrounds the language's evolving design, the code remains concise, without becoming verbose or cluttered with obfuscating punctuation.

We haven't yet handled the final 'else' clause from the original code. It can't simply become another entry in our lookup dictionary, since it's unclear what key (left-hand value) would go into the lookup to correspond to this case. We're really talking about the case when the 'action' character can't be found in the lookup dictionary at all. The most explicit and readable way to handle this case is to modify the above line of code:

if action in lookup:
    lookup[action](params)
else:
    self.onBadCommand(action)

Saving these changes, running the tests shows it behaves identically to the original version. (Hint: Tests don't make code harder to change. Quite the opposite - tests enable more frequent and more intrusive change, by giving you the freedom to dabble with refactoring while remaining dead certain you aren't introducing new bugs.)

Let's take a look at the final code all together:

def convert(self, params):
    lookup = {
        'M': self.onMove,
        'L': self.onLine,
        'Z': self.onClose,
        'z': self.onClose,
    }
    action = params[0]
    if action in lookup:
        lookup[action](params)
    else:
        self.onBadCommand(action)

Including the new handler functions, this is considerably longer than the original version (19 vs 32 lines). However, it qualifies as preferable and 'more Pythonic' for a few reasons:

  1. It's much clearer in intent. Greater readability comes from separating the code which chooses what to do (the lookup dict) from the code which actually does it (the new handler functions.) The naming of the new handler functions brings enormous clarity at a stroke. Of course, this could also be done with a switch statement, and frequently should be.
  2. It's easily extensible. The if...elif...else construct of the first version would soon have bloated to over a screen-full of garbled code as we added a few more cases. The new version could add 100 or so new cases without really becoming much less comprehensible.
  3. It's data-driven. The lookup structure could be generated by means other than simply hard-coding it locally like this. We could merge several dictionaries depending on context, or create it on the fly from application configuration (see the sketch after this list.)
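
Here's a hypothetical sketch of that last point - the config dict and helper below are invented for illustration, not lifted from my actual module. The idea is to configure command characters against handler names, resolve the names to bound methods with getattr, and merge in any context-specific overrides:

# Hypothetical sketch: build the lookup from configuration data rather than
# hard-coding it. 'command_handlers' maps command characters to method names.
command_handlers = {'M': 'onMove', 'L': 'onLine', 'Z': 'onClose', 'z': 'onClose'}

def build_lookup(self, overrides=None):
    # Resolve each configured name to a method bound to this instance.
    lookup = dict((char, getattr(self, name))
                  for char, name in command_handlers.items())
    if overrides:
        lookup.update(overrides)   # merge in any context-specific handlers
    return lookup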

This isn't very Earth-shattering, and of course the idea that I should be preferring polymorphism over switch statements is tickling the back of my mind, but hopefully someone finds it useful.

Update: Wow! What a flurry of completely brilliant comments - every single one contains something of real merit. I feel compelled to rummage through for a sort-of best of breed conclusion based on all of them...

def __init__(self):
    self.lookup = {
        'M': self.on_move,
        'L': self.on_line,
        'Z': self.on_close,
        'z': self.on_close,
    }

def convert(self, params):
    action = params[0]
    handler = self.lookup.get(action, self.on_bad_command)
    handler(params)

# or alternatively
def convert(self, params):
    self.lookup.get(params[0], self.on_bad_command)(params)

I marginally prefer the first version - the second alternative is a smidgeon too compact for my taste. I respect the idea of using exceptions too; that makes a lot of sense, and I imagine it would look something like the sketch below. Thanks for all the great ideas, everyone.
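
(My own sketch of the exception-based suggestion, not any commenter's actual code - let the dict lookup fail, and translate the KeyError into our own ParseError:)

def convert(self, params):
    action = params[0]
    try:
        handler = self.lookup[action]
    except KeyError:
        raise ParseError('unsupported svg path command: %s' % (action,))
    handler(params)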

IronPython in Action

IronPython in Action cover

by Michael Foord and Christian Muirhead

Disclaimer: I'm friends with both the authors and was sent a freebie review copy by the publisher, so I'm bound to be breathlessly gushing in this review. Fortunately, that's easy to do, because the book really is great. (Except for Christian's chapters... Joke!)

Having spent some years working with .NET, and with a series of intriguing personal experiments in Python under my belt, I originally approached IronPython some years ago with a modicum of trepidation. I feared that the weld between the two would be intrusively visible, forming distracting differences from regular Python. I feared for the execution environment, the data types, and perhaps even the syntax itself.

Experience with IronPython showed these worries were needless. I have found IronPython to be a remarkably pleasant marriage - the same elegant language we know and love, given first-class status in the .NET runtime. Gifted with seamless interoperability with other .NET languages, the dowry from such an alliance turns out to be all the .NET libraries in the world, including the substantial and highly capable .NET standard libraries themselves.

IronPython is, to some extent, a niche implementation of a niche language. However, its position seems to be one of potential importance and strength. Not only does it allow Python programmers to use .NET libraries - and does so admirably - but it also allows the existing legions of .NET programmers to be introduced to the joys of Python. They will fall in love with it, and will be able to introduce it into their workplaces in a way that is politically acceptable. After all, it is now simply another .NET language. Since .NET is orders of magnitude more popular than Python, this could turn out to be an important source of future Python adoption.

This book aims to satisfy programmers coming from both the Python and the .NET worlds, and in this it seems to succeed. It starts with quick overviews of concepts from each: 30 pages about Python as a language, and 17 pages about .NET as an environment (data types, events, delegates, Windows Forms, etc.) - just enough to get everyone up to speed regardless of background, but without being so verbose as to turn anyone off with a surfeit of material they already know. Despite being brief, these sections are packed with detail and very pragmatic, focusing on real-world use such as inheriting from existing .NET types, and solving common problems like creating Windows Forms applications from IronPython.
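
To give a flavour of what that looks like in practice, here's a minimal sketch of my own (not an example from the book): IronPython inheriting from a .NET type and showing a Windows Forms window.

# A minimal IronPython sketch (mine, not the book's): using a .NET library -
# Windows Forms - directly from Python code.
import clr
clr.AddReference('System.Windows.Forms')
from System.Windows.Forms import Application, Form

class HelloForm(Form):          # inherit from a .NET type
    def __init__(self):
        self.Text = 'Hello from IronPython'

Application.Run(HelloForm())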

This style of practical and dense informative content is continued throughout. Straight after the opening sections, we dive right in with another rapid-fire chapter, demonstrating common IronPython techniques by writing a non-trivial application. Woven around this ongoing example, the chapter discusses many immediately important topics, including duck typing, Python protocols, MVC, using Windows Forms to build a GUI, tab pages, dialogs, menus, toolbars, images, saving text files, .NET Streams, text file encodings, Python exceptions and lambda functions. These diverse topics are covered rapidly but thoroughly, giving the reader enough information about each to be able to use them together from IronPython to create a useful project.

Having covered these foundations, the book then moves on to address some specific areas in more detail. The following chapter headings give you some idea of the topics which are explored in depth:

  • First-class functions in action with XML - demonstrates pragmatic use of functions as first-class objects, higher-order functions (functions that take other functions as arguments and return modified versions), and of course decorators. These are shown in use, appropriately paired up with the .NET XmlWriter and XmlReader classes, demonstrating event-driven parsing of XML.
  • Properties, dialogs and Visual Studio - Python properties, .NET dialogs, and using IronPython in Visual Studio. This sounds like a straightforward chapter, but as you might guess, the book gets deep into the topics and is jammed full of information. By the end of the chapter you'll have added to the example application to create document observers, used BinaryFormatter to serialise objects, and touched on Python's pickle equivalent.
  • Agile Testing: where dynamic typing shines - from the unittest module and creating tests, through mock objects, listeners, monkey patching, dependency injection and functional testing. This is a dense chapter in a dense book, touching along the way on Python attribute lookup rules, bound and unbound methods, and asynchronous execution for functional testing. My only criticism is that it's clearly hard for developers to 'get' testing until they have hands-on experience of it, so this single chapter, while very thorough in explaining how to test, has an ambitious remit, and doesn't have enough space to explain much of why we test. I guess this is partially my own bias shining through here - I regard testing as quite literally the most important thing to happen in computer science since the invention of the compiler, and would encourage anyone interested to go and read as much as they can about it.
  • Metaprogramming, protocols and more - more Python protocols, dynamic attribute access, and metaclasses. The sorts of things that in a static language would be deep black magic, or else completely impossible, but here they are just the right sort of crazy. Read, enjoy, and unlearn. We see how to create a profiling decorator that modifies the functions you pass to it, wrapping them in stopwatch timing calls (a rough sketch of the idea follows this list.) We also learn about some of the more advanced integration of IronPython with the .NET CLR, including static compilation of IronPython code into assemblies, and one of the very few additions to Python syntax that IronPython has been obliged to provide - the typing of .NET arrays and generics. You'll never need to use generics yourself (in Python, everything is a generic), and you'll never want to go back to typed containers if you can avoid it. However, you may need to deal with some from an existing C# API, and this is how you do it.
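
That profiling decorator, roughly - a bare-bones sketch of my own, not the book's version:

import time

# A bare-bones sketch (mine, not the book's): a decorator that wraps the
# function it is given in stopwatch timing calls.
def timed(fn):
    def wrapper(*args, **kwargs):
        start = time.time()
        result = fn(*args, **kwargs)
        print('%s took %.3fs' % (fn.__name__, time.time() - start))
        return result
    return wrapper

@timed
def slow_sum(n):
    return sum(range(n))

slow_sum(1000000)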

Whew! We're only halfway through! The remaining chapters are equally detailed, but I'm going to start skimming through them somewhat. They cover the interactions of IronPython with more advanced .NET topics such as:

  • Windows Presentation Foundation (WPF) and IronPython - WPF is the DirectX user interface library that is a successor to Windows Forms. This includes XAML, an XML dialect for describing user interfaces, decoupling their implementation from application logic.
  • Windows System Administration with IronPython - using IronPython as a scripting language for sysadmin automation tasks, from the simple, such as copying files, to the complex, such as Windows Management Instrumentation (WMI), administration of remote machines, and a substantial discussion on the uses of PowerShell with IronPython.
  • IronPython and ASP.NET - building a web-based front end to the sample application developed earlier. Reusable controls.
  • Databases and Web Services - using ADO.NET to work with databases, and using SOAP and REST.
  • Silverlight: IronPython in the browser - creating Silverlight applications, and accessing the browser DOM from them.
  • Extending IronPython with C#/.NET - all about creating C# class libraries for use in IronPython, calling unmanaged code from IronPython, and creating interfaces on your C# classes to provide dynamic, Pythonic behaviour. It also includes dynamic compilation of assemblies at runtime, which opens the door to advanced code-generation techniques.
  • Embedding the IronPython Engine - many developers might want to provide IronPython as a scripting language within their own application, and this chapter shows you how.

Alright, that's it! There are appendices:

  • A whirlwind tour of C# - in case anyone wants more guidance while looking at some of the C# code or concepts that are discussed throughout the book.
  • Python magic methods - a description of all the Python magic double-underscore methods, which is a fabulous resource, one which I haven't seen collected anywhere else, and have been referring back to ever since I read the book.

So there you have it. If you haven't inferred already, I learned absolutely heaps from this book, even though it's about a language and environment I've been using every day for years. I think I can say without any equivocation that this is the best IronPython book in the world. If you're a .NET person who is curious about Python (and believe me, you should be), or if you're a Python person who fancies .NET - maybe for DirectX or Silverlight or any number of other wonderful things, then you should absolutely go directly to the IronPython in Action book website right this second and buy it.

What are you still doing here?

Update: Good catch Carl, I forgot the all-important rating!

10/10 if you already use, or are curious about using, IronPython - then you need this book.

0/10 if dynamic languages make you break out in hives, or if .NET makes you think of Darth Vader, then you shouldn't touch this book with a barge pole.

On Estimates

There is a lot of room for miscommunication about estimates, as people have a startling tendency to think wishfully that the sentence:

I estimate that, if I really understand the problem, it is about 50% likely that we will be done in five weeks (if no one bothers us during that time).

really means:

I promise to have it all done five weeks from now.

from How to be a Programmer, by Robert L. Read.

Do a bit of helping out at EuroPython

Are you a Python geek?

For starters - you should totally be going to EuroPython at the end of June. Python conferences like this attract brilliant presentations from some real community and industry heavyweights. This year we've got Professor Sir Tony Hoare (that's the creator of quicksort to you, amongst a venerable lifetime's worth of other things); Cory Doctorow (everyone's favourite science-fiction author, blogger and all-round geek activist); Bruce Eckel (author and renowned technical communicator); plus Dr Sue Black from code-breaking hothouse Bletchley Park. The veritable horde of over 100 exciting and interesting talks makes up probably the strongest line-up EuroPython has ever had.

Best of all, Python conferences like this one are organised and run at a grass-roots level. By enthusiasts, for enthusiasts, making them quite the most fun, educational and interestingly social conferences I've ever been to. Personally, I love the resultant absence of overriding commercial agendas - everything is done purely for the benefit of the delegates, and is pervaded by the community values that made you love Python in the first place.

One upside of this is that the conference is cheap - only £190 to attend. Interested enthusiasts can easily pay their own way. As a result, this is one of the few technical conferences that still has robust attendance this year - from both presenters and delegates. Many others have been decimated or even cancelled altogether.

There is a downside though, and here's the rub:

The good folks organising EuroPython in their spare time are desperately short of volunteers to be session chairs.

If you're going to EuroPython, you could help out! Yes, YOU! You could sign up for the sessions you want to watch anyway, so you shouldn't miss anything. Just imagine the warm and fuzzies! The KUDOS of a roomful of eyes. The POWER of cutting off over-running speakers in mid-flow*. The WARMTH of a deftly-cupped microphone.

(*anyone cutting the power on Sir Tony will be duly ejected from the premises)

Responsibilities are described in loving detail here: http://wiki.europython.eu/SessionTeam

Please think about it, and if you fancy it, sign up soon (on the wiki page above) because we're currently all a-flutter wondering how the heck we're going to manage this. :-)

Thanks!

OpenGL Shading Language

OpenGL Shading Language cover

by Randi J. Rost.

I've had a passing interest in computer graphics for years, but had avoided the technology of shaders these last few years, thinking that they were just another layer of complexity which I didn't need to take on while I was still getting to grips with the standard OpenGL API.

With hindsight, I was wrong. I was recently cajoled into getting on board after talking to Mike Fletcher (creator of PyOpenGL) after his talk at PyCon, and now I feel as if I should have read this book years ago. Shaders solve many of the problems I've been happily messing with for ages, in ways that are easier to implement, more powerful, and more performant.

I whined about the Red Book, but this "Orange" OpenGL Shading Language book is brill - just what I needed. Incisive without being overly terse, practical, and once it got into the chapters about applications of multidimensional Perlin noise it got me all hot'n'bothered about computer graphics again. Yay my inner geek!

Update: I started this book fascinated by using vertex shaders to transform geometry on the fly, with little interest in the superficial fragment shaders used to decorate the rendered surfaces with pretty images or lighting effects. Since finishing it, this has reversed: I've become obsessed with noise and Fourier transforms and all the paraphernalia of fragment shaders, imagining a relatively simple fragment shader that could, I believe, provide a surface with infinite levels of detail. I dreamed about my old university 'Signals & Systems' type lectures. Uncanny.

Update 2: Oh dear. Once I started trying to write anything more than the most trivial of my own shaders, I ran into an unexpected problem. My shaders just wouldn't link. I couldn't figure out why. The book was no help. Google was no help. The error messages certainly weren't any help (thanks, ATI.) Eventually I realised that the 'built-in' noise functions which are part of the OpenGL Shading Language are simply not implemented by the vast majority of graphics card manufacturers - you have to roll your own. Which is not a major deal-breaker, but what is disappointing is that the book makes absolutely no mention of this in any of the several chapters in which it lovingly describes the built-in noise functions, along with their characteristics and uses. Perhaps I spoke too soon when praising the book. Maybe it is another case of idealistic OpenGL theory that has something of a disconnect with real-world development. Maybe the book was written before this situation came to pass - regardless, it's no bloody use to me.
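
For anyone hitting the same wall, it's worth pulling out the compile and link logs yourself. Here's a rough sketch of mine using PyOpenGL (it assumes a current GL context already exists, and it's my code, not the book's):

from OpenGL.GL import (
    GL_COMPILE_STATUS, GL_FRAGMENT_SHADER, GL_LINK_STATUS, glAttachShader,
    glCompileShader, glCreateProgram, glCreateShader, glGetProgramInfoLog,
    glGetProgramiv, glGetShaderInfoLog, glGetShaderiv, glLinkProgram,
    glShaderSource)

# Rough sketch (mine, not the book's): compile and link a fragment shader,
# raising the driver's own log text on failure. Assumes a current GL context.
def build_program(fragment_src):
    shader = glCreateShader(GL_FRAGMENT_SHADER)
    glShaderSource(shader, fragment_src)
    glCompileShader(shader)
    if not glGetShaderiv(shader, GL_COMPILE_STATUS):
        raise RuntimeError(glGetShaderInfoLog(shader))
    program = glCreateProgram()
    glAttachShader(program, shader)
    glLinkProgram(program)
    if not glGetProgramiv(program, GL_LINK_STATUS):
        raise RuntimeError(glGetProgramInfoLog(program))
    return program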

Rating (oh, how I love my new rating system. Check this one out:)

10/10 if you want to learn the theory of how to use the OpenGL shader language.

0/10 if you don't.

Cloverfield (2008)

Against my better judgement I couldn't help but snag a torrent of this. Sure enough, as the opening credits kicked in, my housemates assured us it was rubbish.

But then I proceeded to love it. Clearly it polarises. The whole thing is shot on a shaky handicam held by one of the characters - imagine Blair Witch meets 9/11, only it isn't terrorists, it's a giant, evil whatthefuckisthat stalking the streets of NYC. It reminds me of Primer, in which the script is so realistic and lacking in over-ripe gravitas that, unusually, the actors don't even look like they are acting! What a concept. Well, this is similar - albeit a lot dumber - except the strength is not in the script, which isn't especially strong, but in the novel method of presentation.

There's clearly a limited number of movies that could be made like this, but for me, it was a welcome respite from the staged set-pieces of Hollywood's more conventional output. The sense of panic and confusion was beautifully heightened by the total lack of exposition - viewers only get to see what this small group of characters gets to see, and even that is in blurry and imperfect fragmentary snatches. The monster, when it is even visible, is only glimpsed from afar. It was the closest a movie has ever come to creating the kind of tense, terrifying immersion that really great computer games can create.

I was amused to note that a bridge they take shelter under at the end looks exactly like the bridge they took shelter under at the end of The Day The Earth Stood Still (2008) - do all the bridges in Central Park look the same, or does this one have some special meaning? Anyhow, the final scene is saddening, and telegraphed quite plainly from the opening shot (Camera retrieved at incident site US 447. Area formerly known as "Central Park"), and it makes me weep with relief that a movie could so willingly try to break the mould.

Rating:

0/10 if you're not into monster flicks, or if handycam footage makes you vomit.

10/10 If you fancy being scared silly by a giant alien monster.

Testwatcher

Sometimes when programming I like to leave unit tests running repeatedly in one window while editing the code and tests in another. The Unix command watch is useful for this, and can highlight the differences between one invocation and the next in inverse video.

I wanted a version of watch for use on Windows, so I whipped up a quick Python script, testwatcher, which produces output very similar to watch, but is cross-platform, and features not just inverse text, but yellow inverse text. Woo-hoo!

$ python example_test.py
F.F
======================================================================
FAIL: testThat (__main__.TestWatcherTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "example_test.py", line 12, in testThat
    self.assertEquals(0, randint(0, 10))
AssertionError: 0 != 4

======================================================================
FAIL: testThis (__main__.TestWatcherTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "example_test.py", line 9, in testThis
    self.assertEquals('one', object())
AssertionError: 'one' != <object object at 0xd24460>

----------------------------------------------------------------------
Ran 3 tests in 0.001s

FAILED (failures=2)
_

Incidentally, the above test makes it very clear that Python objects in successive processes get new addresses on Linux, but interestingly on Windows the same addresses seem to get re-used across different processes.
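
You can see the effect with a throwaway loop (nothing to do with testwatcher itself):

import subprocess

# Print the address (id) of a fresh object in several successive child
# processes, to compare across runs and across platforms.
for _ in range(3):
    subprocess.call(['python', '-c', 'print(hex(id(object())))'])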

I can't help but suspect this is a dumb script to have written - it should only be a:

while True:
    command

but in order to shoehorn the inverse text and colors in, it's grown to 300 lines - a hideous bloat for a minor superficial thrill. Plus the Windows version flickers terribly - I'm currently using system('cls') to clear the screen and then redraw it every second. I'll do some searching for better ways to do it.
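
Stripped of the colour handling, the core of it really is no more than this sort of loop (a simplified sketch, not the actual testwatcher code):

import os
import sys
import time

# Simplified sketch of the core loop (not the real testwatcher, which adds
# the colour handling): clear the screen, run the command, wait, repeat.
def watch(command, interval=1.0):
    clear = 'cls' if sys.platform == 'win32' else 'clear'
    while True:
        os.system(clear)
        os.system(command)
        time.sleep(interval)

# e.g. watch('python example_test.py')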

However - I've long wanted a Python interface to perform simple terminal actions like colors and animation on different platforms (the standard library 'curses' module that would otherwise do the job is simply not implemented on Windows.) So maybe it's time I used this script as an excuse to figure this out. Suggestions welcome.

Update: This idea may have now reached a viable fruition, documented here.

The Day The Earth Stood Still

The Earth obligingly stood still for us twice this week, on back-to-back nights. In each, a lone alien man arrives in a spaceship with his giant robot buddy Gort, to tell humans that they must mend their destructive ways or be destroyed.

The 1951 version was very Fifties - intrusively hopeless special effects, and to my eyes riddled with outlandish social etiquette and hopelessly naive politics. I suppose in the years following World War II any platform for preaching pacifism seemed worth a shot. If only more people considered it worth preaching today. I completely missed the Christian allegory that permeates the movie until it was pointed out to me: the alien comes from the heavens, and lives amongst common people, taking the name 'Carpenter' to blend in. He preaches peace to humankind, or else warns we will suffer a fiery apocalypse. He is our intermediary to 'Gort' (in fact the servant of Gort, in the original script), who later resurrects him from the dead, so that he may deliver his final message before being taken back up into the skies. Cute if you're into that, I guess.

Equally predictably, the 2008 version was very Noughties. Intrusively overblown production values string together a mediocre script. The pacifism and Christian message of the original has been replaced with a more timely environmental message - the writers perhaps intuiting that modern Americans are not so receptive to anti-war talk. Otherwise the scope and potential of the ideas at play are completely wasted - lost amidst the creative wasteland of a budget that could no doubt have fed countries. Once provoked, Gort unleashes self-replicating insectile microbots, which swarm and consume Philly, spreading fast. At the last moment, Keanu / Klaatu sees some humans hugging and crying, and has a big change of heart - the Earth deserves to be spared, after all. What a crock.

So there you have it. Watch this space for more reviews from me - wasting nights of my life, so that you don't have to. Final ratings:

10/10 if you are a stump-sucking mealy-mouthed pig-dog with googly eyes.

0/10 if you have any vestigial glimmers of taste or discernment.