Where you gonna be?

[Google Earth and whereyougonnabe from Peter Batty on Vimeo]

Peter Batty has a funky screencast of his incipient social networking slash geolocation application, Where You Gonna Be? which allows you to see when your future itinerary crosses paths with other people you know. Initially implemented as a Facebook app, it now features Google Earth integration. It's currently restricted to beta testers, but should be opening up to all comers soon.

One day, functionality like this will be part of the ubiquitous and unobtrusive infrastructure of society. For now, there's WYGB.

Lions for Lambs

Lions for Lambs

It got a right drubbing critically, and was a box-office failure, but I found it powerful and compelling. It may not be crafted with the greatest finesse, but it nevertheless raises interesting and powerful questions: about the competence of our leaders in war, about the culpability of a media toeing the line, and about the personal responsibility that falls to each of us.

Most Americans seem unaware that their profligate spending on 'defence', coupled with regular invasions of other countries, causes them to be regarded by much of the rest of the world as the single biggest threat to world peace.

In such an environment, this movie asks us to have the bravery to let our point of view be known, and to take action when our beliefs contradict our Government's decision to go to war. For anyone with questions about the justification for the recent wars that America has started (and my own country, England, has supported), this movie raises many issues, and it spoke powerfully to my conscience.

Rating 8/10.

PyCon 2008, Chicago, USA

PyCon USA 2008

PyCon 2008 has been absolutely amazing this week. The talks I've seen and the people I've met have been a real inspiration, and my head is a-whirl with ideas. Above all, as always, I'm impressed by the Python community's genuine warmth. Maybe it's just a function of Python being relatively small compared to some other technologies, but there's something pleasantly hippy (in a good way) about its practitioners. Guido, for example, unsurprisingly turns out to be eminently reasonable and affable, and the IronPython team, to whom I and Resolver owe so much, are a thoroughly pleasant and interesting bunch.

To the people who have been asking for my humble talk slides, you can grab them here in a couple of formats:

If anyone has any outstanding thoughts or questions about the talk, I always love to chat, so please feel free to email me, tartley at the domain tartley.com (gosh I love love love http://spamarrest.com.) (Update: email obfuscated - spamarrest is no more.)

Update: There are photos!

The Long Overdue LinkedIn Backlash


We've all known this for a long time, but I don't remember anyone actually saying it out loud: Have you ever noticed how it's always the most industrious LinkedIn users who are the very last people you'd actually recommend in a genuine network of trust?

The most ruthless networkers, climbing every last tenuous connection like an insidious vine. Grasping for unearned advantage at every node. Can you introduce me to so-and-so? If you link to me, I'll have over 500 connections, think how popular that will make me look. My preciousss connectivity. Somehow, this kind of behaviour also correlates well with other repugnant activities, like spamming. Some people just have no conception of how their actions change the world around them for better or worse, and hence act as though they had no responsibilities to the community around them.

One final spam this morning was the straw that broke the camel's back. I've asked him not to do it before, both personally and via the captcha screen he disingenuously filled in to get onto my whitelist. So it's time for LinkedIn to start pulling its weight. I notice there is no mechanism to add a negative recommendation via LinkedIn, so I've been forced to add him as a connection and then leave a suitably-worded positive one. Hopefully it still conveys my intended meaning.

It may be that publicly dissing a recruiter is not the wisest of career moves, and maybe James will retaliate. I weighed that up for a while, and decided I can risk it. I don't operate on appearance. I operate on substance. I can take it. Some things are right, and some things are wrong, and it's time to take a stand.

Unfortunately, even this is a half-hearted measure - I think James needs to approve it for it to be attached to his profile, and that seems unlikely from my perspective. Who knows, though? Maybe he's actually a stickler for accurate representation who relishes feedback.

So this got me thinking. Forget LinkedIn - it's clearly designed to appeal to vacuous self-congratulators. What attributes would we want from a *real* network of trust? Clearly some mechanism for leaving public negative feedback would be one of them. Can that be done in a way that can't be abused? What else does it need?

Studencki Festiwal Informatyczny 2008

I had the good fortune to attend the Academic IT Festival over the weekend, in Cracow, Poland. Pictured below is the audience listening to Chad Fowler, Ruby and Rails guru, who I had the pleasure to discover is also a really nice and interesting guy.

Chad Fowler speaks - people

The festival is organised by a large group, including former Resolver legends Jan and Konrad, and covers a diverse array of IT and related topics. It is the largest event of its type in Central Europe. Some talks were in English, some in Polish, by an absolutely stellar line-up of luminary speakers, including Joe Armstrong, creator of Erlang; Gilad Bracha, co-creator of Java; plus yours truly. See 'one of these things is not like the others.'

There are some photos up here, and I'll publish my own updated Test-Driven Development talk here later today.

Update: There are photos, and the talk slides are finally online, over here somewhere...

Why Python?

A friend who is a doctor is considering learning Python as his first programming language, to do some processing on some research data. He asked me to give him the 30 second elevator pitch for Python, to evaluate whether it's a wise choice. I enjoyed constructing the reply so much that I decided to post it here, just in case it helps anyone else in a similar situation.

Why Python?

Python is very accessible and intuitive. You should be able to produce simple, useful programs in your first day of experimentation. The syntax is clean and concise, without too much cryptic punctuation (Perl, I'm looking at you), redundancy or unnecessary verbosity.

This accessibility isn't just a superficial convenience. Because of it, writing a program in Python will take noticeably less time than many other programming languages. The resulting program will be shorter and more comprehensible, and will be easy to modify or extend in the future.

The simplicity of Python is not because it is in any way cut-down or incapable. In fact, it is one of the most limber languages available, including a carefully chosen cross-section of advanced language design features, which enable it to adapt gracefully to many different situations and programming styles. Its beauty lies in its ability to provide the aforementioned simplicity regardless of the complexity of the task to which you choose to put it.

Of those language design features, a couple are worthy of special mention.

Python is one of a number of dynamic languages, which are in vogue at the moment. Proponents would say that the entire history of programming has been a gradual migration towards progressively more dynamic languages. Dynamic languages, amongst other things, allow you to write programs that modify themselves when they run. Instead of simply writing a function yourself, you can instead write a function which creates a second function, and then call this second function, which will do the thing you want done. This, and other sorts of brain-bending meta-programming, seem a little abstract at first, but sometimes allow some tremendous conceptual ju-jitsu, allowing very small amounts of code to achieve enormous things.
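As a minimal sketch of that idea (all the names here are invented for illustration), a function can build and return a brand-new function at runtime:

```python
def make_multiplier(factor):
    # Build a new function on the fly; it 'remembers' factor
    # even after make_multiplier has returned.
    def multiply(x):
        return x * factor
    return multiply

# Create two different functions from the same template.
double = make_multiplier(2)
triple = make_multiplier(3)

print(double(10))  # 20
print(triple(10))  # 30
```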

Secondly, Python's dynamism facilitates a programming style known as test driven development, of which I am big fan. The idea is that for every bit of code you write, you also write a test, which verifies that your code is doing the right thing. It isn't immediately obvious that this is necessarily a very useful thing to do, but in practice it reaps tremendous benefits. I evangelise about it often, because I feel it is the single most important thing that most programmers could do in order to be more productive and write better code.
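Here is a small sketch of the idea, using Python's built-in unittest module (the `mean` function and the test names are invented for illustration):

```python
import unittest

def mean(values):
    # The code under test: the average of a non-empty list of numbers.
    return sum(values) / len(values)

class TestMean(unittest.TestCase):
    # Each test method verifies one small behaviour of 'mean'.
    def test_mean_of_several_integers(self):
        self.assertEqual(mean([1, 2, 3]), 2.0)

    def test_mean_of_a_single_value(self):
        self.assertEqual(mean([7]), 7.0)

# Run the tests programmatically, without exiting the interpreter.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestMean)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

In test-driven style, each test is written (and seen to fail) before the behaviour it checks is implemented.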

As well as the language itself, Python comes bundled with a comprehensive set of pragmatic built-in standard libraries, which your program can lean on to help you get things done with a minimum of hassle. These libraries are augmented by a vibrant community of authors producing third-party modules you can download and use as well.

As any good language should be, Python is cross-platform, so with a minimum of tweaking, most Python programs should run on Windows or Macs or Linux.

Why not Python?

A notable alternative to Python is Ruby, which looks like a delightful environment and community to be in. As a general-purpose tool, Ruby is just as good as Python, and it excels in certain areas such as website development. But Ruby is not compellingly better than Python. They are more similar than they are different, and form healthy rivals.

There are other languages that are better than Python at particular things, but none, in my opinion, are better than it for most things.

Something like C++ is better for sheer speed of program execution, or for addressing the low-level bits and bytes that make up the electronics of your computer. But it takes years to master C++. It's a hard-core programmer's language. I spent seven years living and breathing it, and feel qualified to say that its practitioners can be slightly masochistic about its inaccessible superiority. Even once mastered, it is still a lot of work to write C++ programs.

Java and C# are both very popular indeed - orders of magnitude more so than Python, and are ubiquitous in conservative corporate enterprise consulting shops. Both are slightly frowned-upon by computer science academics (C#, for example, for being ostensibly tied to Windows), but nevertheless, these languages are not bad choices for many people.

Programs written in Python usually run slower than those written in most other mainstream programming languages. This could be an issue if you intend to intensively crunch large amounts of data in CPU-intensive ways, for example running a finite element analysis.

There are many Python libraries you can call which are, under the covers, written in C. A prominent example is NumPy, for doing numerical processing. Libraries like this might circumvent the performance issue if one of them happens to handle your particular problem.
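As a minimal sketch of what leaning on such a library looks like, assuming NumPy is installed (this example is mine, not from the original post):

```python
import numpy as np

# One vectorised call replaces an explicit Python loop; the heavy
# lifting happens in compiled C inside NumPy, not in the interpreter.
data = np.arange(1_000_000, dtype=np.float64)
total = data.sum()
average = data.mean()

print(total, average)
```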

Even if there is no appropriate library available, slow performance isn't as serious a drawback as it sounds. 99% of programs don't need to do this sort of CPU-intensive task, so Python's slowness makes no discernible difference. Even in cases where performance is a factor, Python makes it easy to modify and optimise your code to make it run faster, which often alleviates the problem entirely.

Python uses indentation to define blocks of code, instead of 'begin/end' or '{}' delimiters like other languages. This caused no small amount of controversy when it was introduced, with many veteran programmers recoiling in horror, imagining nightmare scenarios in which simply changing the whitespace in a program (eg adding more spaces or tab characters) would unexpectedly change a program's behaviour. In practice, however, this almost never causes problems, and it actually eliminates an entire class of errors, wherein a program appears to behave strangely because the programmer has failed to keep the indentation (which is useful to human readers of the code) in sync with the delimiters (which are used by the computer.)
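A small illustrative sketch (names invented) of how indentation alone defines the blocks:

```python
def classify(n):
    # The indented lines under each condition form its block;
    # no braces or 'begin/end' keywords are needed.
    if n < 0:
        return "negative"
    elif n == 0:
        return "zero"
    else:
        return "positive"

for value in (-3, 0, 7):
    # This print is inside the loop purely because it is indented.
    print(value, classify(value))
```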

Multi-threading is an advanced technique in which a program casts off new versions of itself, all running around simultaneously helping each other out, sorcerer's apprentice style. Python does not handle this well, only utilising a single CPU on dual or quad core machines, and often requiring careful crafting of finicky constructs to get it working reliably. However, this is equally awkward in almost every other language, and has had programmers tearing their hair out for decades, no matter what language they use. There are exciting new approaches to this in the language Erlang, but this is still too fringe to recommend as a first language.

Python lacks some of the delightful academic brilliance of hardcore functional languages such as Lisp and its derivatives, which are based on the mathematics of the lambda calculus. In the right hands, these tools can be devastatingly elegant and highly productive. However, many of them lack a degree of day-to-day practicality in terms of available libraries, and most people feel that they are initially unintuitive to learn. Such languages will no doubt remain highly influential in computer science circles, and are having something of a renaissance these days, but they are sufficiently unorthodox for me not to recommend as someone's first (and possibly only) programming language.

Free Culture

Free Culture

by Lawrence Lessig (2004)

Accessible and entertaining, rich with historical perspective, it cuts incisively to the core of modern society's conflict over intellectual property. To my mind, it is one of the most important books of the decade. I'd highly recommend it. And it's available for free, from your iPhone's built-in ebooks installer or as a PDF.

Rating: 10/10. Blimey.

pyglet week 3 : Some Pretty Flowers

This is just a refinement of last week's script. No massively significant changes, just a bunch of minor tweaks.

  • The fans are replaced with slightly prettier flowers, with separate vertex and color arrays for each one.
  • Running with the python -O flag means we can render 800 flowers at 30fps, no problem.
  • The camera can now be moved and rotated and zoomed. I just use this to subtly zoom in after the screen has filled with flowers
  • I discovered that if I fail to clear the screen before rendering, then I inherit the appearance of the desktop as a backdrop. No doubt this cannot be relied upon, and presumably doesn't happen on some other graphics hardware or operating systems, so YMMV (Update: yep - it looks CRAZY on Windows unless you uncomment the glClear() call). I won't be relying on this trick in future, but for now, on my machine at least, it looks like this:

Actual vs Perceived threats (aka People are Crazy)

Actual vs Perceived threats

Susanna Hertrich has an art / thesis project to artificially stimulate people's threat perceptions (by giving them goosebumps, or making hairs stand on end) in response to actual threats, as opposed to perceived ones. It's a topic that I'm unnaturally preoccupied with, since the most egregious examples of the disparity between the two seem to intrude on my life every day. My opinions about whether any given threat is real or illusory seem to differ from almost everyone else's, but I'm going to stubbornly cling to the idea that everyone else is crazy. Take the entry on 'terrorist attack' as an example (see diagram.) Public reactions to the topic remind me of nothing so much as a stirred-up ants' nest: a psychotic, ineffectual frenzy.