Saturday, December 31, 2005

my web site facelift

I've made the welcome page of my website look better, inspired by the example set by Keith Richardson and Laurie Savage on the Victorian teachers IPM list.

New features include centering the whole page, background images, a transparency rollover effect on the menu and alternate style sheets (to see the alternate style sheets in Firefox, do View > Page Style).
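For anyone curious how the rollover and alternate style sheets work, here is a minimal sketch. The file names, style sheet titles and menu class name are placeholders of mine, not taken from the actual site:

```html
<head>
  <!-- The default style sheet (placeholder file names and titles) -->
  <link rel="stylesheet" type="text/css" href="default.css" title="Default" />
  <!-- An alternate style sheet: Firefox lists each title under View > Page Style -->
  <link rel="alternate stylesheet" type="text/css" href="plain.css" title="Plain" />
  <style type="text/css">
    /* Transparency rollover on the menu: links sit at 60% opacity
       until the mouse moves over them */
    .menu a { opacity: 0.6; }
    .menu a:hover { opacity: 1.0; }
  </style>
</head>
```

Firefox supports the opacity property directly; Internet Explorer of this era needed its proprietary filter property instead.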

Friday, December 30, 2005

web2.0 redefinition

Paul Graham says that the real meaning of web 2.0 is:
  1. Ajax - JavaScript works, eg. Google Maps, web based apps are getting better
  2. Democracy - amateurs are more often surpassing professionals (wikipedia, reddit, digg, delicious)
  3. Don't maltreat users - avoid heavy handed branding, signing up procedures, offer free services where possible
Web 2.0 just means using the web the way it was originally intended. He doesn't like the term because it originated as a business slogan, but concedes it now does mean something.

I have written about web 2.0 previously: web2mememap, web 2.0

centering a web site using css

This has puzzled me for quite a while.

Keith Richardson posted a draft of his 3 in 6 web page to the Victorian list and by examining his style sheet I finally figured out how he did it.

Then I found a more complete explanation on the web at blue robot which shows two ways to centre a web page. I've tested them both and they work.
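As a sketch, the two techniques generally look like this (the id name and the 600px width are placeholders of mine, not taken from blue robot):

```html
<style type="text/css">
  /* Method 1: auto left/right margins. text-align: center on the body is a
     workaround for IE5, which ignored auto margins; the container then
     resets text-align for its own contents. */
  body { text-align: center; }
  #content { width: 600px; margin: 0 auto; text-align: left; }

  /* Method 2 (commented out here): absolute positioning. Push the container's
     left edge to the middle of the window, then pull it back by half its own
     width with a negative margin. */
  /*
  #content { position: absolute; left: 50%; width: 600px; margin-left: -300px; }
  */
</style>
<div id="content">The whole page goes in here.</div>
```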

Wednesday, December 28, 2005

firefox css tools

Victorian teacher Laurie Savage has web design lessons with an alternate style sheet exemplar. You need to view it in Firefox: do View > Page Style > Tree to see Laurie's beautiful alternate style sheet - awesome!!

Also found a firefox extension called EditCSS which enables you to view, edit and save CSS in a sidebar!

Update: Through the Victorian IT list, Laurie Savage has put me onto web developer which has a very comprehensive suite of tools built into the browser (better than EditCSS). When you install it you get a tool bar with drop down menus for CSS, Forms, Images and other really useful stuff. Check out the links to Documentation and Forums, from Chris Pederick's web developer site.


Tuesday, December 27, 2005

finding those who link to me

How to find out who has linked to you from another blog (called backlinks).

I enabled the "Links to this post button" by following the procedure in blogger help. These links end up at the bottom of Comments but I am not notified of them.

To obtain notification through an RSS feed, first go to

Then type in:

This searches my whole blog and spits out who has linked to me.

I found out that plakboek had linked to me 4 times, wara once and steve in the UK once.

I then scrolled to the bottom of the page and subscribed to an RSS feed so that I'll be aware of new, future links.

I notice that ecmanaut has tweaked the interface to make the Pages linking to his posts appear on the same page as the original blog. Not sure how he has done this.

on line courses: howto

Australian educator, Leigh Blackall has slides here about how to set up a collaborative on line course using:
  • email - initial contact
  • google groups - group communication
  • blogger - blogging
  • bloglines - RSS aggregation
  • hello and flickr - images
  • geocities - storage (since replaced with
  • open office - ease of conversion to pdf and swf
I have felt daunted by the time and organisation required for running an online course, but Leigh's slides illustrate clearly how the "small pieces" would fit together. As well as that, the use of google groups was new to me.

Leigh has blogged about this here.


Monday, December 26, 2005

stupid, not owned and a HUGE success

Two factors account for the huge success of the internet: nobody owns it, and it’s a stupid, simple network whose only job is to move bits.

I’ve been trying to understand the internet and world wide web. I want to understand the how and why of the amazing things that have evolved – things like google search engine, amazon stores, open source software, sites like blogdex that aggregate the most popular links found on blogs and sites like wikipedia that attract thousands of volunteers to work on a free encyclopaedia.

The web seems to be some sort of evolving and emerging intelligence; new exciting things are happening all the time. New software is developed continually by Open Source enthusiasts on the Web – applications such as the Firefox browser, programming languages such as python, different operating systems like Linux. Also the ability to track interesting new information and to collaborate with others continues to improve rapidly, with new web applications such as blogdex, bloglines, delicious, flickr to name just a few. It seems important to deepen understanding of what forces are driving such a rich medium.

So I’ve been using the web to research the web and the internet. Not surprisingly there is some very good material about the nature of the internet and www on the beast itself. The internet has developed its own researchers and philosophers.

The internet is a network of networks. Nearly all the networks, most of which are owned by someone or something, have by now joined the internet, which is owned by nobody. All the internet does is link all the other networks together.

The internet is a stupid, simple network. If that’s a new idea for you, as it was recently for me, then initially the implications won’t be clear, so I need to explain more.

I’m talking about the underlying architecture, which makes all the other stuff possible. Another way of saying it is that all the intelligence and value of the internet is located at the Ends, there is no central intelligence or control. Some authors call the internet a World of Ends, another called it “a hollow sphere comprised entirely of ends”.

A different way of saying something similar is that all the internet does is move bits; it doesn’t do anything else. This makes it cheap. "The best network is the hardest one to make money running." (The Paradox of the Best Network)

Well, what’s the big deal about a stupid network? It might sound counter-intuitive to say that a stupid, simple network has achieved as much as the internet has. The answer becomes more obvious when we look at the alternative, an intelligent network, and the problems and difficulties that it creates.

A good example of an intelligent network is the telephone network (other examples not discussed here are TV and radio), interesting since Telstra in Australia has become such a political hot potato. The telephone network has centralised features added to it such as call waiting, message bank, voice look up and providing the caller with choices before the call is completed (“press one for this, two for that”, etc.).

Another relevant feature is that the telephone network is designed for a single application, voice. That was fine back in the days when voice generated all of the traffic. But these days all sorts of data goes down the phone line. The design features that are good for voice may not be good for transport of other forms of data.

An interesting dilemma here is that as we learn more and as customers’ needs become more sophisticated, good design in the present becomes poor design in the future. Everyone knows how hard it is to change something in a big organisation: the suggestion has to be approved by various committees, time lines are worked out and so on. Big changes are often not implemented because the cost of the change might outweigh the perceived potential benefits to the company running the network, and so a decision is made to put up with an inferior system.

David Isenberg who used to work for AT & T relates such a case where a technical team called True Voice was unable to improve voice quality as much as they could have because they became “tangled up in cobwebs of legacy assumptions” (ROTSN). This experience from an expert inside a communications company led to Isenberg writing his paper Rise of the Stupid Network after which he felt compelled to leave AT & T and set up his own company.

The centralised features of the phone companies’ networks mean that they are in control of how the network functions, and also make the cost of the network higher. Of course this suits the phone company. They are in control, being the experts, deciding what the customer needs are and making lots of money. The sometimes despised Telstra is a good example of this sort of business model.

If voice could be delivered over a stupid network like the internet then it would end up being cheaper and probably with even more features than offered by Telstra. Such a system is being developed (VoIP or Voice over Internet Protocol), which has the potential to make Telstra obsolete as a plain old telephone service (POTS).

Nobody owns the internet. It doesn’t have a central administration and the Internet protocols are non-proprietary. Moreover, any communications network that can carry two-way digital data can carry Internet traffic: wired networks like copper wire, coaxial cable and fiber optic, and wireless networks like Wi-Fi.

Nobody owns the internet! Since it is a network of networks then although companies or government might own parts of it, no one owns it in the overall sense. Another phrase to describe it would be "distributed ownership."
... the old maxim of ‘the Internet interprets censorship as damage and routes around it’ applies: meaning that if one pipe imposed filters upon content or pulled out altogether, information would simply do what the Internet does best and find another route to travel.
- Internet ownership
So, the internet is a stupid network, a World of Ends, the middle is transport and nothing else. This combined with the fact that nobody owns it accounts for the success of the internet.

Humans are clearly a collaborative species who crave connection and recognition. How else could we account for the extraordinary energy of millions of people adding all sorts of value to the internet daily – whether it be a blog entry, an update to the wikipedia encyclopedia, a book review for amazon or a contribution to the development of a new open source application like Firefox. All of this and much more is value being added to the Ends of the internet, transforming the stupid network into one of the most valuable possessions of humanity to this point in our evolution.


Isenberg, David. Rise of the Stupid Network

Isenberg, David. The End of the Middle
(broken link)

Isenberg, David and Weinberger, David. The Paradox of the Best Network

Searls, Doc and Weinberger, David. World of Ends

Jerome H. Saltzer, David P. Reed, David D. Clark
End-To-End Arguments In System Design

network neutrality

When Vint Cerf warns that the neutrality of the internet is under threat then I have to take notice:
The remarkable social impact and economic success of the Internet is in many ways directly attributable to the architectural characteristics that were part of its design. The Internet was designed with no gatekeepers over new content or services. The Internet is based on a layered, end-to-end model that allows people at each level of the network to innovate free of any central control. By placing intelligence at the edges rather than control in the middle of the network, the Internet has created a platform for innovation. This has led to an explosion of offerings – from VOIP to 802.11x wi-fi to blogging – that might never have evolved had central control of the network been required by design.

My fear is that, as written, this bill would do great damage to the Internet as we know it. Enshrining a rule that broadly permits network operators to discriminate in favor of certain kinds of services and to potentially interfere with others would place broadband operators in control of online activity. Allowing broadband providers to segment their IP offerings and reserve huge amounts of bandwidth for their own services will not give consumers the broadband Internet our country and economy need. Many people will have little or no choice among broadband operators for the foreseeable future, implying that such operators will have the power to exercise a great deal of control over any applications placed on the network.

As we move to a broadband environment and eliminate century-old non-discrimination requirements, a lightweight but enforceable neutrality rule is needed to ensure that the Internet continues to thrive. Telephone companies cannot tell consumers who they can call; network operators should not dictate what people can do online.

I am confident that we can build a broadband system that allows users to decide what websites they want to see and what applications they want to use – and that also guarantees high quality service and network security. That network model has and can continue to provide economic benefits to innovators and consumers -- and to the broadband operators who will reap the rewards for providing access to such a valued network.
This has generated a lot of discussion. I am studying the links at the end of Vint's statement and will post on this topic again.

"stupid copyright friction"

Chuck took some video footage of one of his favourite bands, Soundtrack of our lives, at a club.

He doesn't want to make money out of it but was bailed up by the management for illegal use.

He argues that this is what fans do, capture magic moments of their favourite groups.

Why fight technology?
Why fight human nature?

He has presented his message in a short movie (11 MB) available at his vlog.

britannica in decline

I quoted a study from Nature earlier which compared wikipedia with Britannica and found that wikipedia did quite well.

However, the decline of Britannica due to competition from digital sources, such as Encarta, started well before the success of wikipedia:

A particularly poignant example of the rapidity with which the digital revolution has undermined a hitherto financially and culturally valuable business is the story of the latest (and, possibly, the last) decade of Encyclopaedia Britannica (EB).

In 1991, the company sold about 400,000 printed sets, and in 1997 about 10,000. (Tellingly, my source for this information is a quotation from the Managing Director of EB International, only available to subscribers to a for-fee service, E-Commerce Today). The collapse was triggered by the success of Microsoft Encarta and other CD-ROM versions of lower-quality but approximately equivalent collections sold in a convenient and inexpensive form. Since then, web-based information services have mushroomed. Despite its brand reputation, and the apparent quality and presumed value of the content the company owned, and even after scrambling to survive, revenue has halved, losses have accumulated, the company has changed hands several times, and survival remains uncertain
- Roger Clarke. Freedom of Information? The Internet as Harbinger of the New Dark Ages

freedom to explore

These are annotated links to positive stories about students being given the freedom to explore. I plan to come back to it and add to it as I discover more. There is too much fear around of things going wrong when kids are allowed to explore. Of course things will sometimes go wrong!

No Two Swimmers Float Alike by Guy Bensusan.

This is his journey through two different learning styles. As a swimming coach he preferred exploration, playfulness and freedom as a style. As a University student he was initially persuaded that a serious, structured approach was necessary. But as he became older and wiser he realised that the first style could be applied to most situations. His story is beautifully written with very interesting anecdotes and detail, which make it stand out. It is also about taking responsibility for overcoming our fear of the deep water through empathic individualised instruction. It is obvious that he is an inspirational teacher.

Personal Use of the Internet by Doug Johnson

Libraries ought to be places where students can explore their own interests not just look up what others want them to learn. It follows that this should be extended to exploring things on the internet. Some of the benefits are: practice skills, gives internet ban some teeth(?), it's fun!

Sunday, December 25, 2005

My talented daughter

Showing off some work completed by my beautiful and talented daughter, Alannah, for her Visual Arts course.

This was done in photoshop and the theme was "a meaningful glance" or something like that.

Click on the images for a larger view at my flickr account.

Friday, December 23, 2005

tagging posts in blogger

I'm trying to implement the instructions at Fresh Blog so that I can add tags to my posts here using blogger.

I had to grasp some new things that I hadn't appreciated before.

In delicious you can use the + operator to search for bookmarks with multiple tags. For example, will search my delicious account for items tagged with both javascript and ecmanaut.
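Assuming del.icio.us still uses the URL scheme of username followed by tags joined with +, the combined-tag search is just an address you can link to from anywhere:

```html
<!-- Assumed URL pattern: /username/tag1+tag2 returns only the bookmarks
     carrying both tags -->
<a href="http://del.icio.us/billkerr/javascript+ecmanaut">
  bookmarks tagged with both javascript and ecmanaut
</a>
```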

So, by introducing a new tag into delicious (I've used billkerr) I can distinguish between my blog items in delicious if I tag all of them with billkerr as well as other tags. Once specified, the billkerr tag is added on automatically by Johan's script, so it doesn't clutter up my tag line in blogger. I've opted not to add the delicious icons at this stage.

Freshblog links to Johan's own account of his script as well as how to alter the styling so that the tags appear on one line. It was useful for me to read these to confirm that I was doing it correctly.

I still have one problem to solve:
9. In a great feature, Johan has also used this script to add a "post to" link to the publish notification page on blogger, which will pop-up a window auto-populated with permalink, title, timestamp & tags. Just hit "bookmark" over there & you're all set!!
- step 9, Freshblog

My URL for this link has "undefined" in it and the link produces a 404. The URL is something like this:

Can't see any reference to this problem in the comments on the various posts, so I wonder what I'm missing?

At any rate, thanks for the great script, Johan, much appreciated.

update 24/12/2005
all fixed :-)

With the popup boxes you get after clicking on "Tags:", I had to alter the path and then re-enter the value I wanted before the 'Link to Delicious' link would work.

A programming friend here thought that perhaps there was a bug related to caching.

Similarly, typing kerrblog, the tag word for all my blogs, into the second popup box was not sufficient - I needed to alter it in the third popup too.

I'm not a very good programmer, but someone who is can figure out the work-arounds; others in my boat might get stuck and frustrated. Thanks Paul for the help.

filtering the internet

In South Australia a filtering tool has been imposed onto all government schools as part of the eduCONNECT service. It's based on the N2H2 filtering system; however, by examining the interface it appears that some functionality of that system has been removed.

There was a lengthy discussion about the eduCONNECT system on the South Australian IT teachers list in May 2005. I'm presenting here an overview of my objections to this filtering system from a number of perspectives. If necessary I'll elaborate on some of the points that follow in more detail later.

1. This particular filtering service and its implementation:

It does not distinguish between adults and children or between students of different ages or enable different filtering regimes to operate on different machines. Whatever is blocked for students is also blocked for adults and is blocked on all machines on a site. There is no way around this.

These categories are blocked by default:
  • Message/Bulletin Boards
  • Web Mail
  • Web Page Hosting
  • Games
There are 39 filtering categories. Of these, 30 are blocked. I am lobbying for the unblocking of the above 4 categories because they are educationally beneficial.

At the time of its implementation in my school eduCONNECT blocked a range of sites which were a part of my curriculum. They were: (game making forum) (photo sharing) (blogging) (my personal blog) (my web mail)

Schools have the ability to unblock individual sites and keywords but as stated above it is all or nothing. So, for example, if a school has a policy for blocking webmail for students then webmail cannot be enabled for the teachers without breaking that policy.

The rationale here I gather is to maintain the ability to track all communications within a school site. The privacy issues here need to be explored.

Another point I would make is that defaults are important because they set expectations and in practice many sites never transcend those expectations.

In this case the expectation being set is that there is something wrong about wanting to write to the web. All ability to write to the web has been blocked by default.

Another expectation being set is that there is something wrong with playing computer games at any time during the school day (including lunchtimes). I would argue that many computer games are beneficial.

2. Understanding the web, what it is, what it is becoming:

The web is for writing, conversation and collaboration, not just searching and reading. This was the original purpose of the web as envisaged by Tim Berners-Lee as described in his book, Weaving the Web, but this purpose has only come to fruition recently.

This future is here now; search the web using these words and you will find it: web 2.0, web2MemeMap, Web as Platform, the architecture of participation, the software paradigm shift, small pieces loosely joined, web applications, the other road ahead.

The future is using the web as platform with web applications such as delicious, flickr, blogger, bloglines, gmail and the hundreds of other applications that are flooding into web space.

We are currently witnessing an irreversible cultural change to more unrestricted conversation and collaboration through the web, including mobile communication with wireless, wifi, mobile phones, etc.

The default settings of eduCONNECT block all of this and send completely the wrong message about the way to go.

It makes it much harder for innovative teachers to introduce innovation into their curriculum.

Will the citizens of the future thank us for being cautious now? We should listen to futurists like Alan Kay:
Another problem is that we don't have a very good concept of the future itself. McLuhan's line--one of my favorites--is, "We're driving faster and faster into the future, trying to steer by using only the rear-view mirror." ....

But McLuhan was saying something else, that when change changes, you can't predict the future in the same way anymore; you have some second order or third order effects. So the biggest thing we need to invent ... is the invention of the future itself. In other words, to think of the concept of future not as a thing that comes from the past--although it has come from the past in a way--but to realize that the forces that are bringing about change right now are so great that it's very difficult to sit down and make simple extrapolations.
For students it makes school seem more irrelevant, out of touch and restrictive. A PEW Report in the USA in 2002 has already found that there was "... a widening gap between internet savvy students and their schools."

3. Child access and safety:

My response to the "it's better to be safe than sorry" or "proceed with caution" argument

With exciting new web applications coming on line every day, the general stance of "proceed with caution" means in practice that schools will lag behind the cutting edge of innovation. This will disadvantage our students in a world where innovation and keeping up with trends in technology has become more important. To play safe, to be cautious may not be always in the best interest of students.

Equity considerations: Quite a few students do not have access to this web based software at home. If we block it at school we are doing them a disservice compared with wealthier students who generally do have internet at home.

Concerns about child safety are important. By using web based social software we can proactively train students in safe usage. By not using it we play safe but do not protect students in what they might do outside of school hours. By not using it schools may be minimising their chance of being sued by a parent when something goes wrong but this may still not be in the longer term best interests of children, especially those who are naive internet users and need advice from teachers about safe practice.

Resilience is better than avoidance. It is just crazy to think that, with the incredible growth of mobile peer to peer communications, we have the option of locking young people out of exposure to illicit or dangerous material distributed through the internet. Irrespective of the merits of the blocking strategy, with each day it is becoming less realistic.

Does blocking material to children which parent or teachers perceive as undesirable do more harm than good? The arguments against censorship need to be considered. Read Why we do this? by Peacefire, an organisation devoted to Open Access for the Net Generation.

Wednesday, December 21, 2005

the architecture of participation

(photo: Tim O'Reilly, originally uploaded by Bill Kerr)
The traditional wisdom that Open Source is mainly about a new licensing model is wrong.

O’Reilly draws a parallel with the paradigm shift that IBM started in 1981-2 when they released the specs for the PC and started the separation of hardware from software. This led to the rise of companies such as Compaq, Dell (hardware) and Microsoft (software).

The paradigm shift happening today is from the PC to the network

The O’Reilly book Google Hacks was the no. 1 best seller for over a month, illustrating the interest in web applications that run over the internet

The backend can be described by the acronym LAMP (Linux, Apache, MySQL, PHP (or Python or Perl)).

Open Source licensing (eg, Stallman’s GPL license) is an important issue but we need to realise that the most interesting software these days is not even distributed, it is just performed – google, amazon software – and these companies offer a total service based on massive amounts of data / information and a critical mass of users.

MapQuest is another example of a killer application (only applies to the USA). Later in the talk he qualifies this because they haven’t figured out the architecture of participation

The most important thing to watch for these days is network-enabled collaboration, which results in an adhocracy (a term attributed to Alvin Toffler and Cory Doctorow, the sci-fi writer).

Power is shifting from the company to the individual. Individuals like Linus Torvalds and Larry Wall are more important than the companies they work for. The same thing has already happened in Hollywood, where the individuals are more important than the companies.

“Architecture of participation”.
“Small pieces loosely joined”
Some software is built to encourage participation, eg. Linux has a small kernel, whereas other software is not, eg. Windows.

Other examples of the architecture of participation:
  • amazon software enables user reviews (over 10,000,000 on amazon) and list mania (the writers of lists on particular topics receive a kickback)
  • the google page rank algorithm
This is my summary of Tim O'Reilly's talk, Rethinking the Boundaries.

hardware, software, infoware

(photo: Tim O'Reilly, originally uploaded by Bill Kerr)
What is the software paradigm shift?

The term paradigm shift is taken from Kuhn’s Structure of Scientific Revolutions, eg. the shift from an earth centred view of the solar system to a sun centred view

When audiences are asked, “How many of you use Linux?” then for some audiences only a small percentage put their hand up.

But when asked, “How many of you use Google?” then 100% put their hands up

If you are using Google then you are using Linux, since Google runs on Linux. What you use may not be mainly in your PC on your desktop. Increasingly we use internet applications such as google, amazon and eBay.

The paradigm shift is towards the predominant use of applications which reside somewhere on network servers. The network, not the PC, has moved to the centre of the universe.

It is software that stands above the level of a single device or operating system.

Another aspect of the shift is peer to peer. Client – server thinking is old thinking. Napster was created from the insight that you don’t need to have all the songs on your PC, you just need to be able to access songs by using collaborative software that can search the internet.

“Google is a collaborative work.” Ditto for amazon and eBay.

Users help to build these companies just by using their services for their own ends. For example, if a website is not found in google then a blogger can put it into google by publishing the URL in their blog.

Users build amazon by writing reader reviews of items they buy, such as books, through amazon and publishing those reviews on amazon. This feature is one of the most useful things about amazon.

There is hardware, software and infoware. What is infoware? It is the combination of software and information as illustrated by companies such as google, amazon and eBay. They have good software but their dominance is based just as much on their information and the critical mass of users they have developed.

This is my summary of Tim O'Reilly's interview The Software Paradigm Shift. You can obtain the full voice interview from it-conversations


(photo originally uploaded by Bill Kerr)
I became aware of the open source really slick screensavers a year ago but did not take the opportunity then to download them. I thought I was too grown up for fancy screensavers.

This time I decided to download and install. They are beautiful. I've been running Euphoria for the past week. It was so nice that I shortened the timer so that it cuts in every 5 minutes! Every time I walk back into my room after a short break I see a beautiful pattern on the screen.

I've just switched over to Flocks, which is equally beautiful.

From the really slick screensavers site:
These savers are free software. Feel free to give them to anyone and everyone. Some of them demand a lot of performance from your CPU and graphics card, so I hope you have a fast computer. These savers use OpenGL for graphics and will perform poorly without OpenGL hardware acceleration. All modern PCs being sold have decent OpenGL support, but the high-end gaming PCs will work the best.

Monday, December 19, 2005

tell yer mum

This goes to show that if you are into internet romance, it's best to tell your mum about it!!
MARSEILLES, France -- Skirt-chasing playboy Daniel Anceneaux spent weeks talking with a sensual woman on the Internet before arranging a romantic rendezvous at a remote beach -- and discovering that his on-line sweetie of six months was his own mother!
- full story


student communication rights

Plunkers writes in his blog:

At this time I suddenly had access to all the student blogs. This really opened my eyes. The students really want to express themselves in this environment. Students want to spell things correctly, they want to be considered important enough that people read their blogs and they want to be interesting. (My blog has already been classified as REALLY boring, guess I've got some work to do ;-) )

Of most interest however was that I had access to pictures of students hanging around at school (taken on mobile phones) and a complete set of camp photos from Adelaide.

The traditional school is one where teachers carefully monitor all student behaviour and intervene quickly when students move outside prescribed boundaries - no bad language, no harassment of others, no body contact, no smoking etc.

One of a teacher's primary roles - duty of care - is to protect students from any harm, to "first do no harm"

Teachers and schools have to be far more careful than parents have to be - in theory students have no privacy whatsoever at school - someone is always watching them (yard duty). The only privacy they have is when they go to the toilet, and they have to ask permission for that.

At recess, lunch, before and after school many students break these rules. For example, in their friendship groups they may use bad language, tease each other, have body contact of some type etc. - this is usually done in a low level way which doesn't upset anyone.

The advent of blogging, mobile phones (with cameras), the internet, flickr photo sharing, podcasting and other social software introduces opportunities for increased communication and collaboration between students

One thing is certain - students will keep on using this technology more and more - we can block it in schools in the name of protecting them from harmful behaviours but that won't stop them using it outside of school

Shouldn't schools be training their students in safe behaviours on the internet, rather than attempting to block all possible avenues of unsafe behaviour?

I know of a recent case where a school blocked all google images because a handful of students searched for and found images of bestiality in a class. Other students saw this and reported it to their parents, who made complaints to the school. The response in this case has been to block anything that might lead to something like this occurring again.

My frustrated feeling about this is: Why do the blockers have all the rights? Why does a whole school population have to be deprived of a useful resource because it can be abused?

What would be the response to a parent who phoned in complaining about the blocking of all google images because a handful of students broke the rules?

It seems that a parent could launch legal proceedings against a school if their child is inadvertently exposed to pornography, but that a parent could not launch legal proceedings against a school for depriving their child of a useful educational resource.

Most schools have an internet agreement form in place about what students can and can't do on the internet. Doesn't that form mean anything?

I really would like to see some clarification of the legal issues.

We have here a situation where parents' rights to protect their children (backed by an implied legal threat) are being given more weight than other considerations.

Personally, I'm mainly concerned about teachers' right to teach an innovative curriculum. For example, flickr and blogging are marvellous educational resources which I already use. I don't want to lose them.

Another consideration is student rights to express themselves. Student rights seem to be at the bottom of the list in schools. I was surprised recently to read that the EFF (Electronic Frontier Foundation) has published an FAQ about student blogging rights. They refer to legal cases in the USA about student rights to free speech, including their right to criticise their teachers and schools.

America has the First Amendment, the right to free speech. Student rights to free speech - what a novel idea! What have we got?


Added this feature to the sidebar today. I'm looking forward to discovering where my visitors are from.

If you want one just click on the map and that will lead you to ClustrMaps.

There is a good analysis of the ClustrMap by ecmanaut here.

ecmanaut has somehow integrated his delicious tags with his blog to achieve tagging for his blogger items, something I've been wanting to do.

Sunday, December 18, 2005

the vector is mightier than the bitmap

Nice slogan from inkscape, wish I had thought of it myself.

Scalable Vector Graphics (SVG) have come of age; it is time to incorporate them into the school curriculum. I can't wait for 2006, back to school - well, I exaggerate slightly.

Up until now there has been a problem with the viewer not being embedded in the browser, meaning that many users would not see the SVG graphics on your webpage unless they had gone to the trouble of downloading the Adobe viewer. It is still worthwhile for users / schools to do that so that SVG graphics can be viewed in IE, which, no surprise, is lagging behind the times yet again.

But the viewer problem has been solved for those who use Firefox. SVG is now native to Mozilla firefox, no plugin required.

SVG offers a relevant, exciting, rich learning environment for both students and teachers. Graphics can be styled, animated and scripted. This practice can be combined with the exploration of:
  • XML - SVG graphics are text based (eXtensible Markup Language)
  • CSS - they can be styled (Cascading Style Sheets)
  • SMIL - they can be animated (Synchronized Multimedia Integration Language)
  • JavaScript and DOM - they can be scripted (Document Object Model)
Also raster graphics (JPEG, PNG) can be embedded into vector graphics, so you can have the best of both worlds.

It all comes together in a really nice way.

For an editor I would recommend Inkscape, if you want a free one. Xara is in the process of releasing an open source Linux editor but you will still have to pay for their Windows version.

Here are some examples to illustrate the range of what can be done:
  • xml file, svg graphics, style in the header, JavaScript (functions, buttons, event handlers)
  • sophisticated svg drawing of a lion
  • too good!! tetris game programmed in SVG and JavaScript (inspirational example)

Don't forget, you'll need a recent version of firefox or the Adobe plugin to view these.
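Since SVG is just XML text, a script can build it up as a string before it ever reaches the browser. Here is a rough JavaScript sketch of that idea - the element names come from the SVG spec, but the `circleRow` helper itself is my own invention, not part of any of the examples above:

```javascript
// Sketch: generate a row of SVG circles, shrinking from left to right.
// Paste the output into an .svg file (or inline into an XHTML page)
// and Firefox 1.5+ will render it natively.
function circleRow(count, spacing, maxRadius) {
  const circles = [];
  for (let i = 0; i < count; i++) {
    // each circle is a little smaller than the one before it
    const r = maxRadius - i * (maxRadius / count);
    circles.push(
      `<circle cx="${i * spacing + maxRadius}" cy="${maxRadius}" ` +
      `r="${r.toFixed(1)}" fill="steelblue"/>`
    );
  }
  return `<svg xmlns="http://www.w3.org/2000/svg" ` +
         `width="${count * spacing + maxRadius}" height="${2 * maxRadius}">\n` +
         circles.join("\n") + "\n</svg>";
}

console.log(circleRow(5, 50, 20));
```

In a live page the same elements could be created through the DOM and then styled, animated or wired to event handlers, which is the combination the examples above show off.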

outrageous attack on wikipedia

wikipediaclassaction is trying to force wikipedia to alter the way it does things through legal action.


Meanwhile, here is the link and a quote from the original Nature article about the wikipedia versus Britannica comparison, mentioned earlier:
... Michael Twidale, an information scientist at the University of Illinois at Urbana-Champaign, says that Wikipedia's strongest suit is the speed at which it can be updated, a factor not considered by Nature's reviewers.

"People will find it shocking to see how many errors there are in Britannica," Twidale adds. "Print encyclopaedias are often set up as the gold standards of information quality against which the failings of faster or cheaper resources can be compared. These findings remind us that we have an 18-carat standard, not a 24-carat one."

Furthermore, there is a great commentary on the whole Seigenthaler issue and implications by Danah Boyd, who amongst other things calls on academics to contribute more:

I am worried about how academics are treating Wikipedia and i think that it comes from a point of naivety. Wikipedia should never be the sole source for information. It will never have the depth of original sources. It will also always contain bias because society is inherently biased, although its efforts towards neutrality are commendable. These are just realizations we must acknowledge and support. But what it does have is a huge repository of information that is the most accessible for most people. Most of the information is more accurate than found in a typical encyclopedia and yet, we value encyclopedias as an initial point of information gathering. It is also more updated, more inclusive and more in-depth. Plus, it's searchable and in the hands of everyone with digital access (a much larger population than those with encyclopedias in their homes). It also exists in hundreds of languages and is available to populations who can't even imagine what a library looks like. Yes, it is open. This means that people can contribute what they do know and that others who know something about that area will try to improve it. Over time, articles with a lot of attention begin to be inclusive and approximating neutral. The more people who contribute, the stronger and more valuable the resource. Boycotting Wikipedia doesn't make it go away, but it doesn't make it any better either.

Saturday, December 17, 2005

Blogger Web Comments for Firefox

I've installed this extension to Firefox tonight. When you visit a page a window pops up with links to the blogs that have made comments about that page. It has already been useful in enabling me to track comments from a page of interest, an FAQ by the EFF about student blogging rights.


Thursday, December 15, 2005

free software, free knowledge group

Received mail from Christopher Harvey who has helped create a Free Software and Free Knowledge in Education EdNA group.

It's a portal into the free software community for educators.

One thing that caught my eye was the discussion forum, which consists mainly of some interesting discussion starters at this stage.

Another was the free software CD/DVD for Windows compiled by Christopher.

wikipedia rivals britannica

This is a timely study given the recent hate articles directed at wikipedia. Wikipedia works despite being not perfect. But Britannica is far from perfect too.
A report published in the British journal Nature said it gave independent reviewers 42 pairs of articles from both encyclopaedias, covering subjects that ranged from Archimedes' Principle and Dolly the Sheep to field-effect transistors and Creutzfeldt-Jakob disease.

The reviewers were not told which article came from where, and were asked to check the entries for accuracy.

"Only eight serious errors, such as misinterpretations of important concepts, were detected in the pairs of articles reviewed, four from each encyclopaedia," Nature reports.

"But reviewers also found many factual errors, omissions or misleading statements: 162 and 123 in Wikipedia and Britannica, respectively."

Nature says "Britannica's advantage (over Wikipedia) may not be great" when it comes to science, and comments that this result is "surprising" given the eclectic way that Wikipedia's articles are written.

Wednesday, December 14, 2005

those who hate wikipedia

Wikipedia has come under fire recently after publishing seriously false information about John Seigenthaler: that he was involved in the murder of Robert Kennedy.

The information was published anonymously but then it took a long time to remove it (132 days) after Seigenthaler complained. Now Seigenthaler has an axe to grind and so does The Register.

In response to the (Seigenthaler) incident, Wales instituted a new policy preventing unregistered users from creating new articles on the English Wikipedia.

Some people use mistakes like this to condemn the whole wikipedia experiment.

One issue is understanding the strengths and limitations of wikipedia as a peer to peer, free online source of information which calls itself an encyclopedia. IMO its strengths outweigh its limitations; I think it does very well.

Another issue is understanding why some people hate wikipedia and want to shut it down / destroy it in its present form. My guess would be that it is disruptive technology: it is changing the way we do things, and it is a threat to established interests - those who think knowledge ought to be filtered by experts before being consumed by the "ignorant" masses.

But that is not the internet way. Filter then publish has been replaced with publish then filter. That is a big shift, one for the better, but one that takes some getting used to, and one that initially requires mastering new technologies that do the necessary filtering for us.

Wikipedia is an imperfect filter but in the scheme of things - internet, knowledge explosion - it is a huge step forward.

There was a good discussion on slashdot about this, here are some abbreviated quotes:
I think that responsibility is the heart of this issue, and is why so many people get worked up about it. It's about who is to be assigned blame if wikipedia is inaccurate.

The author of the register article obviously wants the administrators of wikipedia to be held responsible, as if it was a top-down hierarchy. But it's not: it's more of a sort of p2p encyclopedia. It's not useful to blame wikipedia for being irresponsible any more than it is to blame gnutella for having illegal media on its network.

And the problem with attacking wikipedia and saying its not only useless, but it is harmful, is that it is not only attacking those people who spread disinformation. It is also attacking smart people who have a lot of worthwhile knowledge, and have carefully attempted to transfer this knowledge to an online medium that they knew people would use.
  • In fact, Wikipedia actually provides more (and more accessible) information on the revision history and editorial decisions leading to the present state of an article than any print encyclopedia I've ever heard of.
  • Wikipedia may not provide a strong or prominent enough disclaimer to suit you, but the obvious question would be: what does? TV news? The New York Times? Can you name a single "authoritative" source of information that either 1) Prominently disclaims their status as authoritative or 2) provides some substantive guarantee of the accuracy of the information?
  • link
Would we somehow be better off if Wikipedia didn't exist at all?
Despite some inaccuracies the Wikipedia is a veritable goldmine of useful information. What do the people who complain about it expect? An editor to peer review every single article? Wikipedia is probably the best model for a free encyclopedia that anyone has come up with and it's an amazing use of technology almost undreamt of a couple of decades ago. As long as we bear in mind how the entries are created (and it's not exactly a tough concept to grasp) how can it not be providing great benefit for people? The nay-sayers would put us back into the dark ages where we have to pay money for out-of-date information when there are people out there with the up-to-date facts who want to share them now for nothing. By all means don't keep the inaccuracies a secret (because, among other things, that'll help to get them fixed), but there's no need for moral lectures unless you have a better alternative to propose. So I think your question is the right one to ask.
Unlike the Register itself Wikipedia is subject to a thousand-year-old form of analysis: Peer Review. If peer review is good enough for the scientific community (they put a man on the moon, the register has yet to accomplish that) and the medical community (they have done heart transplants, the Register has not) and the Linux Kernel, as any open source project, is subject to peer review (they have a very good operating system, the Register has yet to boot a machine) why would we not subject our historical data to such a process? Why not subject our media to such processes? Sadly it seems that the Register has the disease many younger Internet-generation kids have, a lack of patience. Peer review is slower, but as history moves on, faster.
The real problem here is that the Wikipedia purports to be peer-reviewed, but each article has its subscribers, and it isn't clear whether an article has been tacitly approved by innumerable readers, or quietly corrupted out of salutary neglect. This ambiguity is the real failing of the Wikipedia, but it should be easily corrected by applying something similar to Slashdot Karma--just to show whether any editorial attention has affected any given article or not. ...

In the end, I think the Wikipedians are right. "The price of liberty is vigilance." The Register is also right. This is one thing that will happen if we're asleep at the wheel. However fiery the iconoclasty makes you feel, do we throw the baby out with the bathwater? No. We take what we have and make it better.

stereotypes refuted

The worst stereotyping of video games is that they are meaningless time-wasting; that they desensitise, are socially isolating and induce violent behaviour; and that they are sexist and appeal only to immature male youth.

Every one of these propositions is refuted in grand style by Henry Jenkins (MIT Professor), here.

He argues that the historical trend is now moving in the direction of girls becoming more involved with video games:
Historically, the video game market has been predominantly male. However, the percentage of women playing games has steadily increased over the past decade. Women now slightly outnumber men playing Web-based games. Spurred by the belief that games were an important gateway into other kinds of digital literacy, efforts were made in the mid-90s to build games that appealed to girls. More recent games such as The Sims were huge crossover successes that attracted many women who had never played games before. Given the historic imbalance in the game market (and among people working inside the game industry), the presence of sexist stereotyping in games is hardly surprising. Yet it's also important to note that female game characters are often portrayed as powerful and independent. In his book Killing Monsters, Gerard Jones argues that young girls often build upon these representations of strong women warriors as a means of building up their self confidence in confronting challenges in their everyday lives.
And I like the way he responds to the position that video games are too violent:
... Judge Richard Posner noted: "Violence has always been and remains a central interest of humankind and a recurrent, even obsessive theme of culture both high and low. It engages the interest of children from an early age, as anyone familiar with the classic fairy tales collected by Grimm, Andersen, and Perrault are aware." Posner adds, "To shield children right up to the age of 18 from exposure to violent descriptions and images would not only be quixotic, but deforming; it would leave them unequipped to cope with the world as we know it." Many early games were little more than shooting galleries where players were encouraged to blast everything that moved. Many current games are designed to be ethical testing grounds. They allow players to navigate an expansive and open-ended world, make their own choices and witness their consequences. The Sims designer Will Wright argues that games are perhaps the only medium that allows us to experience guilt over the actions of fictional characters. In a movie, one can always pull back and condemn the character or the artist when they cross certain social boundaries. But in playing a game, we choose what happens to the characters. In the right circumstances, we can be encouraged to examine our own values by seeing how we behave within virtual space.

Tuesday, December 13, 2005

The quick pink lamb fell ...

Originally uploaded by Bill Kerr.
Last week I was screaming at my computer.

My new challenge went more smoothly. I can report being ecstatic and joyfully exclaiming "yes, yes, yes, yes ..." once or twice.

Starting with the basic sentence structure:
"The quick brown fox jumped over the lazy dog",
I wanted the user to be able to input a variety of adjectives, nouns and verbs and then by pressing a button create new sentences such as:
"The quick pink lamb fell over the indolent horse"

I even managed to colour code the adjectives, nouns and verbs by constructing a sentence array which dragged in parts of speech from the adjective, noun and verb arrays. I have published the file at my website. You'll need Game Maker 6.1 to run it.

It needs more work because so far it only works for that particular sentence structure. But it's a good start.
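The actual demo is written in Game Maker 6.1, but the underlying idea - word arrays dragged into a fixed sentence frame - can be sketched in a few lines of JavaScript. The word lists below are illustrative stand-ins, not the ones in the published file:

```javascript
// Sketch of the array-based sentence generator (the real demo is GML).
// Each part of speech lives in its own array, mirroring the adjective,
// noun and verb arrays described in the post.
const adjectives = ["quick", "pink", "lazy", "indolent"];
const nouns = ["fox", "lamb", "dog", "horse"];
const verbs = ["jumped", "fell", "rolled"];

// pick a random element from a word list
function pick(list) {
  return list[Math.floor(Math.random() * list.length)];
}

// Fill the one fixed sentence structure with random parts of speech.
function makeSentence() {
  return `The ${pick(adjectives)} ${pick(nouns)} ${pick(verbs)} ` +
         `over the ${pick(adjectives)} ${pick(nouns)}`;
}

console.log(makeSentence());
// e.g. "The pink lamb fell over the indolent horse"
```

As in the Game Maker version, the limitation is obvious: the frame is hard-coded, so only that one sentence structure can ever come out.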

Friday, December 09, 2005

imitate a pattern

Make a pattern in a room. Ask the game player to push some objects to imitate this pattern. Doing this will destroy a wall and enable the player to progress to the next room.

A couple of students in the Murdoch Uni Game Making competition asked me for assistance with this problem.

I have written a demo about how to do this.

I have blogged about how I screamed at my computer due to poor understanding / documentation about how the position_meeting() function works in Game Maker.
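The demo itself is a Game Maker file, but the core check - do the pushed objects now occupy the same grid cells as the target pattern? - can be sketched in JavaScript. The `samePattern` helper and the coordinates are made up for illustration:

```javascript
// Sketch (not the actual Game Maker demo): compare the grid cells
// occupied by the pushed objects against the target pattern. When
// every cell matches, the wall can be destroyed.
function samePattern(target, pushed) {
  const key = ([x, y]) => `${x},${y}`;
  const wanted = new Set(target.map(key));
  // same number of cells, and every pushed cell is a wanted cell
  return pushed.length === target.length &&
         pushed.every(p => wanted.has(key(p)));
}

const target = [[2, 3], [3, 3], [3, 4]];
console.log(samePattern(target, [[3, 4], [2, 3], [3, 3]])); // order doesn't matter
```

In Game Maker the same test would be done with position_meeting() calls at each target cell, which is exactly where the coordinate trouble described below came from.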

Wednesday, December 07, 2005

screaming again

Last Friday night I was screaming at my computer again because I worked for hours on a game maker problem which I couldn't solve.

I'll call it the position_meeting() problem, after the name of the recalcitrant function

I worked on it again tonight and solved it. But it still took me at least another couple of hours.

I feel a deep sense of satisfaction that comes from solving problems such as these, combined with ongoing annoyance that the game maker interface in this case was not intuitive and that the documentation in the manual was quite inadequate.

When the ghost pushes the rocks onto the placeholders then the wall disappears. That's the big picture of what I was trying to achieve. It's part of a pattern matching exercise that a couple of my students have come up with in the Murdoch game making competition

The problem was that when you determine a position in a room (x, y co-ordinates) by moving the mouse to the desired position and reading the co-ordinates, then that position does not work in the function: position_meeting(x, y, obj).

By default, the position co-ordinates provided by the mouse feedback are snapped to grid, and also by default the origin of an object is at the top left hand corner of that object. Anyway, if you do the natural thing and just type in the co-ordinates provided by the mouse feedback then the function does not work.

I solved the problem by adding half the grid width and height to the co-ordinates, moving the position detection to the centre of the object rather than the outer perimeter. For a 24*24 grid, add 12 to both the x and y co-ordinates, producing 108, 156 rather than the 96, 144 shown in the diagram.
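The arithmetic of that fix is tiny once you see it. Here it is as a JavaScript sketch - in Game Maker itself this would just be inline GML, and the `cellCentre` helper name is my own:

```javascript
// position_meeting(x, y, obj) tests a single point. The mouse readout
// gives the snapped top-left corner of a grid cell, so that point sits
// on the object's outer edge instead of inside it. Shifting by half
// the grid size moves the test point to the centre of the cell.
function cellCentre(x, y, gridSize) {
  return [x + gridSize / 2, y + gridSize / 2];
}

console.log(cellCentre(96, 144, 24)); // the 108, 156 from the post
```

The same offset applies to any grid size: half the cell width on x, half the cell height on y.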

Anyway, the screaming last Friday didn't help much.

Tuesday, December 06, 2005


I posted the following to oz-teachers in response to criticisms of wikipedia and the internet as a reliable information source

wikipedia has been criticised here:,,1599116,00.html

My reading of that article is that the wikipedia entries are not perfect but are in most cases fairly reliable

Wikipedia is free and convenient - how many of us actually are ready to buy the latest Britannica, how many of us have access to a comprehensive library 24 hours a day?

There is a broader issue involved here too - the explosion of knowledge, cost and the future of the categorisation of information

Wikipedia is a very worthwhile experiment in developing a free peer to peer electronic encyclopedia - many said it was impossible but it is successful and becoming more so every day

Professional librarians have superior knowledge retrieval and categorisation skills - they are an essential but expensive resource

But because of the cost factors and knowledge explosion the professional librarian approach is not going to work on the internet

The comments in this thread along the lines that the internet is a pile of uninformed and opinionated junk are of course partly true, but that is only relevant if there are no good ways available on the internet to search for reliable information of significance to the searcher

The truth of the matter is that internet technology (search, RSS feeds, bottom up taxonomies, group blogs of experts etc.) has dramatically improved the ability of the informed user to extract quality information from the internet. It really doesn't matter if it's 99% junk if you know how to find the 1% gold - and the technology to achieve this is improving all the time

It's a good idea to support really promising experiments like wikipedia because things like that are showing us one way forward about how to categorise reliable information in the new digital media which is in the process of replacing the old print media

Sunday, December 04, 2005

student blogs

I've just finished marking my year 10 student blogs for the last time this year and want to write down impressions while they are still fresh. Unfortunately I can't provide links because I didn't get around to asking permission.

Ground rules. I did ask students to blog about the space invaders games they were making but also allowed them to blog off topic if they liked. I modelled the blogging process for them, even though I did run out of time and didn't follow that through to the end. That didn't matter.

My expectation was one blog post per lesson, or five a week. Some found it hard to keep up with this rate, but in my view it did create a healthy expectation that writing and reflection should be an ongoing process. We blogged for five weeks but some asked if they could keep going beyond that.

I also set expectations about technical competence and asked them to post pictures, links, lists, to use formatting and to enable word verification to eliminate comment spam.

Nearly all of the students, except for about 4 out of a class of 23, obviously enjoyed it and found it rewarding. A handful of boys did not write much at all and some others were reluctant writers but did write when pushed. Nearly all of the girls wrote without prompting.

Some loved it and a couple have declared their intention to keep on blogging, perhaps on a new blog.

From my perspective the best thing was that it opened a new channel for me to get to know my students. In class my time to interact with each student is quite limited and is focused pretty much entirely on the IT subject content. The student blogs opened a new door through which students could communicate to me and with each other. Some students who are quiet in class were talkative on their blogs. Other students who are not quiet in class talked about other things on their blogs apart from the narrow subject content. I got to know them in a much broader way. Students revealed their feelings and interests via their blogs. This was very valuable and rewarding for me as a teacher trying to build positive relationships with my students.

The off topic blogging took on a life of its own, some students got into that in a big way. I was surprised to find a couple of girls in my class obsessed with wrestling and the Hardy boys :-)

I published the blog addresses of all students within the class and asked them to leave comments on each other's blogs. Posting to each other's blogs became popular too, for some.

Other students were much more task oriented and rarely strayed off topic. They posted their design ideas and problems / solutions to do with their games as well. This was the original purpose of the exercise from my point of view. But the blogging communication took on a life of its own far beyond that.

There is a significant marking load involved in the process for the teacher. Over the five weeks I marked their blogs three times, leaving comments on their blog and recording marks in a grid I developed.

update 6th December: I'm adding in the feature of my marks grid in response to a request from wara (see comments)
  • Game reviews 2 × /10 (as well as building space invaders I asked for two game reviews of editable game maker games that I placed in a folder)
  • Quality of writing /5
  • Design ideas about space invaders (open ended mark)
  • Problems and solutions
  • Feelings
  • Other
  • Off topic
  • Quantity (number of blog entries, expectation was 24)

I also had technical competence marks:
  • word verification enabled
  • links
  • pictures
  • lists
  • formatting
From the grid I can pick out:
  • the quality writers
  • those who focused on design and problems solving
  • those who expressed feelings a lot
  • the off topic champions (and proud of it!)
  • technical competence (I had some who tweaked the HTML too)
As I said, it's a fair marking load, going into this detail. But rewarding.

It was all pretty subjective and maybe unsatisfactory to tasky purists. I'm a bit inclined that way myself. Next time I'll set some limits on off topic stuff (but still allow some). Also I'll be better organised with space invaders extensions since I've now developed the materials, so I can have a higher expectation there.

Saturday, December 03, 2005

more dylan

"I've stepped in the middle of seven sad forests,
I've been out in front of a dozen dead oceans. ...
I saw a black branch with blood that kept drippin',
I saw a room full of men with their hammers a-bleedin'"
- a Hard Rain's A-Gonna Fall
How good is that?
As Dylan demonstrated, a good protest song is not simply political, nor is it narrowly confined to the issue that it's protesting. The best protest songs provide historical and artistic context for an alternative worldview and, in doing so, give legitimacy and a powerful sense of inevitability to the protest; even if the target of the protest never hears the actual song, he's ultimately unable to ignore its message and the followers that message inspires. "A Hard Rain's A-Gonna Fall"--which Dylan wrote during the Cuban missile crisis--never specifically mentions war. Instead, it uses apocalyptic imagery--"I've stepped in the middle of seven sad forests, I've been out in front of a dozen dead oceans. ... I saw a black branch with blood that kept drippin', I saw a room full of men with their hammers a-bleedin'"--to convey the horrors of war and, in the process, transcends its topic. As David Hajdu wrote in his book Positively Fourth Street, the song "provoke[s] feeling and thought as well as action."
Trite Eyes by Jason Zengerle

conversation in game maker



I've written a game maker demo illustrating how to have a conversation with some limited variation.

There are separate arrays of optimistic and pessimistic phrases which are selected randomly in turn. These could be further expanded; I've only done a prototype.

Press spacebar to make the conversation alternate.

It's a small start to the more sophisticated grammars devised by Paul Goldenberg and Wallace Feurzeig in Exploring Language with Logo (MIT, 1987).
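The demo itself is a Game Maker file, but the mechanism - two phrase arrays sampled at random, with the speakers strictly alternating - can be sketched in JavaScript. The phrases here are made-up placeholders, not the ones in the demo:

```javascript
// Sketch of the conversation demo's idea (the real version is GML).
const optimistic = ["Things are looking up!", "It will all work out."];
const pessimistic = ["It will never work.", "We're doomed."];

let turn = 0; // even = optimist speaks, odd = pessimist

// Each call plays the role of one spacebar press: the next speaker
// picks a random phrase from their own array.
function nextLine() {
  const pool = turn % 2 === 0 ? optimistic : pessimistic;
  turn += 1;
  return pool[Math.floor(Math.random() * pool.length)];
}

console.log(nextLine()); // an optimistic phrase
console.log(nextLine()); // then a pessimistic reply
```

Expanding it towards the Goldenberg and Feurzeig grammars would mean replacing the flat arrays with rules that generate the phrases themselves.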

Friday, December 02, 2005

video game course, USA

For making my game maker course details available, I received a nice thank-you email from Jeff Helfand, a professor in the Computer Science and Engineering department at the University of Nevada, Reno in the United States.

Jeff's new course, Introduction to Video Game Development is here.

By my reading he's using Game Maker as a bridge to C++, Java and Flash.