The Long Overdue LinkedIn Backlash

UnRecommend

We've all known this for a long time, but I don't remember anyone actually saying it out loud: Have you ever noticed how it's always the most industrious LinkedIn users who are the very last people you'd actually recommend in a genuine network of trust?

The most ruthless networkers, climbing every last tenuous connection like an insidious vine. Grasping for unearned advantage at every node. Can you introduce me to so-and-so? If you link to me, I'll have over 500 connections, think how popular that will make me look. My preciousss connectivity. Somehow, this kind of behaviour also correlates well with other repugnant activities, like spamming. Some people just have no conception of how their actions change the world around them for better or worse, and hence act as though they had no responsibilities to the community around them.

One final spam this morning was the straw that broke the camel's back. I've asked him not to do it before, both personally, and via the captcha screen he disingenuously filled in to get onto my whitelist. So it's time for LinkedIn to start pulling its weight. I notice there is no mechanism to add a negative recommendation via LinkedIn, so I've been forced to add him as a connection and then leave a suitably-worded positive one. Hopefully it still conveys my intended semantics.

It may be that publicly dissing a recruiter is not the wisest of career moves, and maybe James will retaliate. I weighed that up for a while, and decided I can risk that. I don't operate on appearance. I operate on substance. I can take it. Some things are right, and some things are wrong, and it's time to take a stand.

Unfortunately, even this is a half-hearted measure - I think James needs to approve it for it to be attached to his profile, and that seems unlikely from my perspective. Who knows, though? Maybe he's actually a stickler for accurate representation who relishes feedback.

So this got me thinking. Forget LinkedIn - it's clearly designed to appeal to vacuous self-congratulators. What attributes would we want from a *real* network of trust? Clearly some mechanism for leaving public negative feedback would be one of them. Can that be done in a way that can't be abused? What else does it need?

Studencki Festiwal Informatyczny 2008

I had the good fortune to attend the Academic IT Festival over the weekend, in Cracow, Poland. Pictured below is the audience listening to Chad Fowler, Ruby and Rails guru, who I had the pleasure to discover is also a really nice and interesting guy.

Chad Fowler speaks - people listen.

The festival is organised by a large group, including former Resolver legends Jan and Konrad, and covers a diverse array of IT and related topics. It is the largest event of its type in Central Europe. Some talks were in English, some in Polish, by an absolutely stellar line-up of luminary speakers, including Joe Armstrong, creator of Erlang; Gilad Bracha, co-author of the Java Language Specification; plus yours truly. See 'one of these things is not like the others.'

There are some photos up here, and I'll publish my own updated Test-Driven Development talk here later today.

Update: the photos and the talk slides are finally online, over here somewhere...

Why Python?

Python

A friend who is a doctor is considering learning Python as his first programming language, to do some processing on some research data. He asked me to give him the 30 second elevator pitch for Python, to evaluate whether it's a wise choice. I enjoyed constructing the reply so much that I decided to post it here, just in case it helps anyone else in a similar situation.

Why Python?

Python is very accessible and intuitive. You should be able to produce simple, useful programs in your first day of experimentation. The syntax is clean and concise, without too much cryptic punctuation (Perl, I'm looking at you), redundancy or unnecessary verbosity.

This accessibility isn't just a superficial convenience. Because of it, writing a program in Python will take noticeably less time than many other programming languages. The resulting program will be shorter and more comprehensible, and will be easy to modify or extend in the future.

The simplicity of Python is not because it is in any way cut-down or incapable. In fact, it is one of the most limber languages available, including a carefully chosen cross-section of advanced language design features, which enable it to adapt gracefully to many different situations and programming styles. Its beauty lies in its ability to provide the aforementioned simplicity regardless of the complexity of the task to which you choose to put it.

Of those language design features, a couple are worthy of special mention.

Python is one of a number of dynamic languages, which are in vogue at the moment. Proponents would say that the entire history of programming has been a gradual migration towards progressively more dynamic languages. Dynamic languages, amongst other things, allow you to write programs that modify themselves when they run. Instead of simply writing a function yourself, you can instead write a function which creates a second function, and then call this second function, which will do the thing you want done. This, and other sorts of brain-bending meta-programming, seem a little abstract at first, but sometimes allow some tremendous conceptual ju-jitsu, allowing very small amounts of code to achieve enormous things.
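The 'function which creates a second function' idea is easier to see than to describe. Here is a minimal sketch in Python (the names and numbers are invented for illustration):

```python
# Instead of writing each multiplier function by hand, write one
# function that builds and returns new functions on demand.
def make_multiplier(factor):
    """Create and return a new function that multiplies by `factor`."""
    def multiply(x):
        return x * factor
    return multiply

# These two functions did not exist until the program ran.
double = make_multiplier(2)
triple = make_multiplier(3)

print(double(21))  # 42
print(triple(5))   # 15
```

Each returned function remembers the `factor` it was built with, so one small factory can stamp out an entire family of behaviours.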

Secondly, Python's dynamism facilitates a programming style known as test-driven development, of which I am a big fan. The idea is that for every bit of code you write, you also write a test, which verifies that your code is doing the right thing. It isn't immediately obvious that this is necessarily a very useful thing to do, but in practice it reaps tremendous benefits. I evangelise about it often, because I feel it is the single most important thing that most programmers could do in order to be more productive and write better code.
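For a tiny flavour of the style, here is a sketch using Python's built-in unittest module. The `bmi` function is an invented, medical-flavoured example, not anything from a real project; in test-driven development the tests would be written first, and the function written to make them pass:

```python
import unittest

def bmi(weight_kg, height_m):
    """The code under test: body mass index."""
    return weight_kg / height_m ** 2

class TestBmi(unittest.TestCase):
    # Each test states one fact the code must live up to.
    def test_typical_adult(self):
        self.assertAlmostEqual(bmi(70, 1.75), 22.86, places=2)

    def test_zero_height_is_an_error(self):
        with self.assertRaises(ZeroDivisionError):
            bmi(70, 0)

if __name__ == '__main__':
    unittest.main(exit=False, verbosity=0)
```

Running the file reports whether every test still passes, which makes later changes to the code much less nerve-wracking.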

As well as the language itself, Python comes bundled with a comprehensive set of pragmatic built-in standard libraries, which your program can lean on to help you get things done with a minimum of hassle. These libraries are augmented by a vibrant community of authors producing third-party modules you can download and use as well.
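As a taste of those batteries-included libraries, here is a sketch using only standard modules (`csv`, `statistics` and `io` are real standard-library modules; the data itself is invented):

```python
import csv
import io
import statistics

# Simulate a small CSV file of research readings; in real use this
# would be open('readings.csv') instead of an in-memory string.
raw = io.StringIO("patient,reading\nA,4.1\nB,5.0\nC,4.6\n")

# csv.DictReader yields one dict per row, keyed by the header line.
readings = [float(row["reading"]) for row in csv.DictReader(raw)]

print(statistics.mean(readings))
```

Parsing a data file and summarising it takes a handful of lines, with nothing to install.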

As any good language should be, Python is cross-platform, so with a minimum of tweaking, most Python programs should run on Windows, Mac or Linux.

Why not Python?

A notable alternative to Python is Ruby, which looks like a delightful environment and community to be in. As a general-purpose tool, Ruby is just as good as Python, and it excels in certain areas such as website development. But Ruby is not compellingly better than Python. They are more similar than they are different, and form healthy rivals.

There are other languages that are better than Python at particular things, but none, in my opinion, are better than it for most things.

Something like C++ is better for sheer speed of program execution, or for addressing the low-level bits and bytes that make up the electronics of your computer. But it takes years to master C++. It's a hard-core programmer's language. I spent seven years living and breathing it, and feel qualified to say that its practitioners can be slightly masochistic about its inaccessible superiority. Even once mastered, it is still a lot of work to write C++ programs.

Java and C# are both very popular indeed - orders of magnitude more so than Python, and are ubiquitous in conservative corporate enterprise consulting shops. Both are slightly frowned-upon by computer science academics (C#, for example, for being ostensibly tied to Windows), but nevertheless, these languages are not bad choices for many people.

Programs written in Python usually run slower than equivalents in most other mainstream programming languages. This could be an issue if you intend to intensively crunch large amounts of data in CPU-intensive ways, for example running a finite element analysis.

There are many Python libraries you can call which are, under the covers, written in C. A prominent example is NumPy, for doing numerical processing. Libraries like this might circumvent the performance issue if one of them happens to handle your particular problem.
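A minimal sketch of what that looks like in practice, assuming NumPy is installed (the array contents are invented):

```python
import numpy as np

# A million floating-point values in a C-backed array.
data = np.arange(1_000_000, dtype=np.float64)

# One vectorised expression applied to every element; the loop
# happens in compiled C, not in the Python interpreter.
result = data * 2.0 + 1.0

print(result[0], result[-1])  # 1.0 1999999.0
```

The Python code merely describes the computation; the heavy lifting runs at native speed.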

Even if there is no appropriate library available, slow performance isn't as serious a drawback as it sounds. 99% of programs don't need to do this sort of CPU-intensive task, so Python's slowness makes no discernible difference. Even in cases where performance is a factor, Python makes it easy to modify and optimise your code to make it run faster, which often alleviates the problem entirely.

Python uses indentation to define blocks of code instead of 'begin/end' or '{}' delimiters like other languages. This caused no small amount of controversy when it was introduced, with many veteran programmers recoiling in horror, imagining nightmare scenarios in which simply changing the whitespace in a program (eg adding more spaces or tab characters) would unexpectedly change a program's behaviour. In practice, however, this does not ever cause problems, and actually eliminates an entire class of errors, wherein programs appear to behave strangely because the programmer has failed to keep the indentation (which is useful to human readers of the code) in sync with the delimiters (which are used by the computer).
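A quick sketch of how that looks (the function is invented for illustration):

```python
# Indentation alone defines the blocks: no braces, no begin/end.
def classify(n):
    if n < 0:
        return "negative"     # indented: belongs to the `if` block
    return "non-negative"     # dedented: the `if` block has ended

for value in (-1, 3):
    print(classify(value))
```

The layout the eye uses to read the code is the same structure the interpreter executes, so the two can never drift out of sync.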

Multi-threading is an advanced technique in which a program casts off new versions of itself, all running around simultaneously helping each other out, sorcerer's apprentice style. Python does not handle this well, only utilising a single CPU on dual or quad core machines, and often requiring careful crafting of finicky constructs to get it working reliably. However, this is equally awkward in almost every other language, and has had programmers tearing their hair out for decades, no matter what language they use. There are exciting new approaches to this in the language Erlang, but this is still too fringe to recommend as a first language.
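One common workaround for the single-CPU limit, sketched here under the assumption that the work is CPU-bound, is the standard-library multiprocessing module, which runs separate interpreter processes instead of threads (the `square` function is an invented stand-in for real work):

```python
from multiprocessing import Pool

def square(n):
    # Stand-in for some CPU-intensive piece of work.
    return n * n

if __name__ == '__main__':
    # Each worker is a separate process with its own interpreter,
    # so the work can genuinely occupy more than one CPU at once.
    with Pool(processes=2) as pool:
        results = pool.map(square, range(5))
    print(results)  # [0, 1, 4, 9, 16]
```

It is clumsier than threads, since the processes share no memory, but for embarrassingly parallel number-crunching it works well.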

Python lacks some of the delightful academic brilliance of hardcore functional languages such as Lisp and its derivatives, which are based on the mathematics of the lambda calculus. In the right hands, these tools can be devastatingly elegant and highly productive. However, many of them lack a degree of day-to-day practicality in terms of available libraries, and most people feel that they are initially unintuitive to learn. Such languages will no doubt remain highly influential in computer science circles, and are having something of a renaissance these days, but they are sufficiently unorthodox for me not to recommend as someone's first (and possibly only) programming language.

Free Culture

Free Culture

by Lawrence Lessig, 2004.

Accessible, entertaining, rich with historical perspective, and cuts incisively to the core of modern society's conflict over intellectual property. To my mind, one of the most important books of the decade. I'd highly recommend it. And it's available for free from your iPhone's built-in ebooks installer, or as a pdf.

Rating: 10/10. Blimey.

pyglet week 3 : Some Pretty Flowers

This is just a refinement of last week's script. No massively significant changes, just a bunch of minor tweaks.

  • The fans are replaced with slightly prettier flowers, with separate vertex and color arrays for each one.
  • Running with the python -O flag means we can render 800 flowers at 30fps, no problem.
  • The camera can now be moved and rotated and zoomed. I just use this to subtly zoom in after the screen has filled with flowers.
  • I discovered that if I fail to clear the screen before rendering, then I inherit the appearance of the desktop as a backdrop. No doubt this cannot be relied upon, and presumably doesn't happen on some other graphics hardware or operating systems, so YMMV (Update: yep - it looks CRAZY on Windows unless you uncomment the glClear() call). I won't be relying on this trick in future, but for now, on my machine at least, it looks like this:

flowers1

flowers2

flowers3

flowers4

flowers5

flowers6

gameloop2c-flowers.py (Python file)

Actual vs Perceived threats (aka People are Crazy)

Actual vs Perceived threats

Susanna Hertrich has an art / thesis project to artificially stimulate people's threat perceptions (by giving them goosebumps, or making hairs stand on end) in response to actual threats, as opposed to perceived ones. It's a topic that I'm unnaturally preoccupied with, since the most egregious examples of the disparity between the two seem to intrude on my life every day. My opinions about whether any given threat is real or illusory seem to differ from almost everyone, but I'm going to stubbornly cling to the idea that everyone else is crazy. Take the entry on 'terrorist attack' as an example (see diagram). Public reactions to the topic remind me of nothing so much as a stirred-up ants' nest, a psychotic, ineffectual frenzy.

Beautiful Code : Leading Programmers Explain How They Think

Beautiful Code

33 essays, edited by Andy Oram

I was wondering how to explain why I didn't like this book as much as I thought I would - possibly as much as I thought I ought to. Thirty-odd genius bitwranglers delving into their favourite bits of code - surely it must be packed with fabulous insights into the zen of programming. But something about it left me flat, and I was having trouble putting my finger on why. Fortunately Jeff Atwood's ever brilliant Coding Horror has saved me the bother, by explaining it perfectly.

Programmers love code. It is the culmination of their labours. It's not just a document or a diagram, but is a dynamic, living record of their conceptual jujitsu. But to focus on the code itself, written in a variety of languages that few readers will all be familiar with, is a relatively superficial theme, and comes at the expense of the abstract designs that the code embodies, and of the features and capabilities of the languages themselves.

Rating 6/10: Contains some gems, but generally falls between two stools.

If it's good enough for Benny...

These folks will store samples of your pet's DNA, so that in the event of their death, you can replace them with a newer, younger clone.

Preserving your pets

Apart from the fact it's a gimmick ripoff ($1,500 to store my pet's DNA? That's quite a freezer they must have there, and the hypothetical cloning service 'may be offered by some other company at a future time') it seems to totally miss the point.

If you love your pet, it's surely for their personality. Your shared experiences and memories, having grown to know one another. I'm talking about their soul, or the state vector of their consciousness, whatever you want to call it. Growing a clone in their image is like creating a twin - it bears no relation to the animal you loved but for its superficial appearance, and even seems to me to be an insult to the memory of your former pet, that they could be replaced by the next one off the shelf. Certainly I wouldn't like to think my loved ones would be content to replace me with a look-alike.

Sometimes I wonder why we make it so hard on ourselves trying to make an honest buck when the rest of the world is one big scam, preying on the vulnerable and the unwary.

History of Western Philosophy

History of Western Philosophy

by Bertrand Russell, 1946.

I've clearly been putting off this post for months, no doubt intimidated by my assumption that my review would have to be as weighty and well-considered as the tome itself. Screw that, a crap post is better than no post at all.

Overall, I enjoyed it greatly. It is a triumph in illustrating the breadth of the domain to a philosophical layman. It was as fascinating for the context in which different movements have arisen as for the ideas themselves: for example, the operation and mores of ancient societies which differed greatly from our own, such as the Spartans.

In truth, my original acquisition of the book was partly motivated by a desire to justify my longstanding dismissal of philosophy as a meaningful discipline, and this narrow-minded and self-fulfilling expectation was indeed confirmed to some extent.

The opening portion of the book, about ancient philosophy, made it all too easy to infer that the gems of ancient philosophy are very much rare pinnacles of achievement, set amidst a babble of incoherent theories and proclamations. The valuable ideas must be carefully searched out, extracted and refined, much as one might pan for gold on expansive shores of mud. However, this gratifying exercise in belittlement also brought with it an increment of my understanding as to why I feel this way.

Plato's finest hour

It may be all very well for me to sit here nitpicking from atop the cumulative results of thousands of years of hindsight, but my shrill, gauche layman objections seem to be rarely well presented, and even less often indulged with a sensible rebuttal, so I'm glad that I read the book and gained some context in which to articulate them.

The raising of the intellect above the concerns of mere empiricism, as typified by the schools of Descartes and Plato, seems to be completely undermined by the apparently overlooked fact that there is no objective way of appraising the workings of 'pure reason.' The human mind, of course, is not capable of 'pure reason', if there even is such a thing, and is equally incapable of recognising its presence or absence in any given supposition. As a result, all exercises in deduction are extremely fallible, and this should come as no surprise to anyone who has ever forgotten or misjudged something. Just because the mind is not well understood, that does not make its machinations any less empirical than the evidence of the senses - what seems to us like logic is entirely subject to the vagaries of our implementation, regardless of whether or not our mind's implementation is embodied in our apparent physical brains or not.

To elevate the conclusions reached by this process above the concerns of mere empiricism is to put them beyond the reach of any form of feedback or discernment, a state of affairs which leads to entirely nonsensical stances standing uncontested for hundreds of years.

Because of this, I would argue that the teachings of thinkers such as Plato have in fact hindered the development of modern thought, not just by blazing trails down several blind alleys, but more critically, by then systematically putting their own conclusions beyond the reach of criticism and refinement, effectively removing the intellectual tools that might have enabled pupils to retrace their steps, and map out the geography of thought more thoroughly.

The second portion of the book is the weakest, dealing with the only form of educated rumination occurring during the middle ages, which was religious in nature. It is perhaps unsurprising that a period such as this is notable for its lack of original thought, barring individuals every few centuries such as Aquinas. As a result, large parts of this section degenerate into a litany of power struggles, assassinations and coronations, notable only for their tedium.

The final portion deals with modern philosophers, which regains the interest of the early sections, by virtue of describing the works of less religiously restricted protagonists, nurtured by the widening availability of education and communication, resulting in a glorious, chaotic competition of ideas. It was only during this section that I feel I gained an appreciation for the process itself, as opposed to the teachings of individual practitioners.

We, as humans, are rubbish at the act of thinking - we leap to conclusions, exhibit enormous biases, forget things, overlook things, make faulty correlations, are inconsistent, are influenced by personal gain and traumatic past experience, and worst of all: we are incapable of seeing these weaknesses - completely blind to our own incompetence. The means by which we evaluate our own conclusions are tied inextricably to the mechanism which derived them in the first place - we always think we are right.

Given this, it is no surprise that we can never agree on anything. We can never see each other's point of view. Whether tackling the problems of day-to-day life, or pondering the riddles of existence, a million different people will jump in a million different directions, no two quite the same. Most of those ideas turn out to be worthless. Idiocy that is detrimental or ineffectual. Only a few of them stick - and there is no way of knowing the merit or otherwise of any idea unless you have some criteria to judge them by. In day-to-day living, some ideas will allow the individual or group to prosper and be happy. In fields where objective measurement is possible - the sciences - then cumulative progress can be made, as successful ideas are shared and built upon.

However, outside of these domains, we may be captivated and entranced by the enthralling pictures philosophy presents to us, and our lives may be enriched as a result. But without any means of discerning the good ideas from the bad, we are forever doomed to explore a multitude of alternative, competing, overlapping intellectual explorations, without any knowledge of which ones are meaningful, and which ones are purely fanciful. It may be great fun, but it's masturbation.

Rating: 8/10: The box with which to call me philistine is below: