Wednesday, 14 March 2012

Lessons Learned

No discussion of how system design might be transformed is possible at the moment without acknowledging the severity of the research funding environment; the recession has currently replaced necessity as the mother of invention. Given this, what practical change could take place?

System development could take a more open, public approach, and usability techniques could prove extremely useful here as they enable interaction with users in multiple ways, from surveys and interviews to blog postings of results. This would create more information which a project could publish throughout its lifecycle, rather than aggregating it and (to a certain extent) generalising it in a final report.

Publishing the project plan and outcomes provides a useful starting point; however, the language used in them is likely to come directly from the original funding application, which will be highly specialised and reliant to some degree on jargon and technical understanding. Far preferable would be to ensure that all public project communications are accessible to as wide an audience as possible (including, of course, budget holders within the project's own organisation).

To further exploit the work undertaken, an open library of first-hand evidence could be created from each project. Put simply, if a project surveys, say, 100 researchers in a specific academic discipline, then that survey must be made available for re-use. The same concept could be applied to interviews, focus groups or transect walks, provided that data protection legislation is adequately followed. This would build a knowledge base on which other projects could instantly draw to inform their own usability strategies. Furthermore, with so much research effort being ploughed into text mining, these first-hand evidence materials would be excellent candidates for approaches such as sentiment analysis.
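
As a rough illustration of how such re-use might work in practice, the sketch below scores hypothetical published interview transcripts with NLTK's VADER sentiment analyser. The directory layout and file names are assumptions made purely for illustration; they do not describe any existing evidence library.

    # A minimal sketch, assuming transcripts are published as plain-text
    # files; the 'evidence/interviews' path is hypothetical.
    from pathlib import Path

    from nltk.sentiment import SentimentIntensityAnalyzer  # needs nltk.download('vader_lexicon')

    analyser = SentimentIntensityAnalyzer()

    # Each published transcript becomes one document in the evidence library.
    for transcript in sorted(Path("evidence/interviews").glob("*.txt")):
        text = transcript.read_text(encoding="utf-8")
        # polarity_scores returns neg/neu/pos proportions plus a
        # compound score in [-1, 1].
        scores = analyser.polarity_scores(text)
        print(f"{transcript.name}: compound sentiment {scores['compound']:+.2f}")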

By applying research practice, this field could be transformed from one predominantly made up of practitioners' narrative reflections into a data store (perhaps accessible via an API) on which projects could draw to model the evolution of user behaviour, to corroborate or challenge results, and even to construct funding applications (for instance, where particular sections of the academic community have not been adequately served or covered by prior research).
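
A data store of this kind might be exposed through a simple read-only HTTP API. The endpoint, query parameters and JSON fields in the following client sketch are entirely hypothetical; they show only the shape such a service could take.

    # Hypothetical client for an evidence-library API; the URL,
    # parameters and response fields are illustrative assumptions.
    import json
    from urllib.parse import urlencode
    from urllib.request import urlopen

    BASE_URL = "https://example.org/evidence/api/v1/studies"  # assumed endpoint

    def find_studies(discipline: str, method: str) -> list[dict]:
        """Return prior studies matching an academic discipline and a
        data-gathering method (e.g. 'survey', 'interview', 'transect walk')."""
        query = urlencode({"discipline": discipline, "method": method})
        with urlopen(f"{BASE_URL}?{query}") as response:
            return json.load(response)["studies"]

    # For example: has this discipline already been surveyed before we
    # write that activity into a new funding application?
    for study in find_studies("history", "survey"):
        print(study["title"], study["year"], study["participants"])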

Monday, 12 March 2012

How successful have we been?

Our project plan identified both qualitative and quantitative measures with which to gauge success.

Quantitative

The learnability investigation was split across two interfaces with distinct audiences: the querying interface ("public") and the XML document editing interface ("editor").


Issue list for 'Public' interface

  • Issue 1: 'Users do not understand the difference between querying and searching' (partially met)
  • Issue 2: 'There is no method for users to initiate queries using statistical tools' (partially met)
  • Issue 3: 'Users accidentally left a previous search filter active when starting a new search' (partially met)
  • Issue 4: 'Users expected highlighted terms to extend their current search' (partially met)
  • Issue 5: 'Users have no way of creating a citation, nor any help for saving their search' (met)
  • Issue 6: 'There is not enough information about the source being searched in the results pages' (met)
  • Issue 7: 'Variant spellings currently have to be searched on separately' (met)
  • Issue 8: 'There is currently no keyword search function' (partially met)

The fact that the redesigned interface is unrecognisable from the version with which the investigation began indicates that the usability process was able to provide strategic direction as well as feedback on finer details such as nomenclature; this is all the more striking given that this was the first time many users had seen the product.


Issue list for 'Editor' interface

  • Issue 1: 'Users need constant feedback when using something new' (unmet)
  • Issue 2: 'Some parts of articles are repetitively structured and would suit some automation' (met)
  • Issue 3: 'Users do not check work by looking at mark-up' (partially met)
  • Issue 4: 'Users need standardised visual cues to use devices which move or transform' (met)
  • Issue 5: 'Users need to be able to specify whether their comments are intended for publication' (partially met)
  • Issue 6: 'There is a need to provide guidance that is tailored both to the source, and to the role of the current user' (unmet)
  • Issue 7: 'Users like to navigate documents quickly using search' (met)
  • Issue 8: 'Some users will need to be able to track the changes that have been made to a document' (met)

You would be forgiven for thinking that an interface which relies heavily on interactive menus and controls would be difficult to investigate using click testing; however, since all user activity starts with a single click, the interface still needed to signpost the correct entry points for interactivity accurately. Overall, although the exercise produced positive results, the redesign failed in several areas; most frustratingly on issue 6, a simple case of inappropriate wording.

Qualitative

It is much harder to report on qualitative feedback: much of it is confidential and cannot be 'transcribed', and the compressed nature of the project plan means there is little time for changes to take effect and be measured again. As an observation, users are more likely to view a project team as acting deliberately (i.e. with a greater sense of strategy) if changes in system delivery are communicated to them personally. Interviews and focus groups, and to some extent email surveys, can carry this sense of purpose.

Approach

Each primary issue within the public and editor interfaces now has a fixed point within the system documentation from which its development can be tracked. Opening this up to the general public invites further scrutiny of the issues and allows the project to appear more confident in its actions.

By publishing the life cycle of each issue, the project cannot hide any aspect of development, and its decisions are open to the standard academic techniques of review and criticism. It is a deeply honest approach to system design, whose success can be judged by the range of people, including non-specialists, who are empowered to comment on it.
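
As a concrete sketch of what a published, trackable issue might look like, the record below reuses issue 7 from the public interface list above; the field names and the two history entries are invented purely to show the shape such a record could take.

    # Minimal sketch of one published issue record; field names, statuses
    # and history entries are illustrative assumptions only.
    from dataclasses import dataclass, field

    @dataclass
    class IssueEvent:
        date: str   # ISO date of the change, e.g. "2012-03-12"
        note: str   # what was done, and why

    @dataclass
    class TrackedIssue:
        interface: str                          # 'public' or 'editor'
        number: int
        description: str
        status: str                             # 'unmet', 'partially met' or 'met'
        history: list[IssueEvent] = field(default_factory=list)

    issue_7 = TrackedIssue(
        interface="public",
        number=7,
        description="Variant spellings currently have to be searched on separately",
        status="met",
        history=[
            IssueEvent("2012-01-20", "Raised during the first round of user testing"),   # hypothetical
            IssueEvent("2012-02-28", "Redesigned search released; re-tested as met"),    # hypothetical
        ],
    )
    print(f"Issue {issue_7.number} ({issue_7.interface}): {issue_7.status}")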