New vacancy listings are posted weekly on Wednesday at approximately 12 noon Central Time. They appear under New This Week and under the appropriate regional listing. Postings remain on the LITA Job Site for a minimum of four weeks.
New This Week
Visit the LITA Job Site for more available jobs and for information on submitting a job posting.
Following several studies about the use and usability of the DPLA website, we’ve just completed a set of small but significant changes. We believe these changes will create a more pleasant, intuitive experience on our website, connecting people more easily with the cultural heritage materials our partners provide.
During the evaluation phase of the project, we drew insight from multiple sources and benefited greatly from our community network. In consultation with DPLA staff, two volunteers conducted usability studies of our website, interviewing and observing other volunteers as they interacted with the site. Professional UX researcher Tess Rothstein conducted a pro bono study of users’ experiences searching the DPLA, while DPLA Community Rep Angele Mott Nickerson focused her study on our map, timeline, and bookshelf features. Alongside these interviews, we conducted in-depth analysis of our usage statistics, gathered via Google Analytics, and considered informal feedback from our community of users and partners.
Here are a few lessons we learned, and what we’ve done in response:

Highlighting full access
Anyone who has done a usability study is familiar with the shocking moment when your product completely fails to engage a user in its intended way. For us, that moment came when a first-time user of our website did not realize that they could get all of the digital materials on our website right now, for free. They were just one click away from total access — and they didn’t click!
To ensure that future users don’t miss out, we’ve done a few key things to highlight that our contributors provide public access to all the materials users discover through DPLA. For example, a link that used to read “View object” now says something like this:
DPLA is a treasure trove of cultural heritage materials – but sometimes it can be hard to find just the right thing amidst millions and millions of items. Our research gave us a clearer picture of how to help users when their first search attempt returned too much — or too little — of a good thing.
For example, many of our users rely on our “Refine search” filter to narrow their search results and home in on truly relevant materials. In our usability studies, we paid attention to which filters interviewees used, whether or not the filters helped them achieve their goals, and what interviewees told us about their usefulness. We corroborated these observations with analytics data, looking at which filters are used most frequently and, when used, which filters are most likely to be followed by a click on a search result.
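A rough sketch of what such an analysis might look like, assuming a simplified event log (the field names and numbers here are hypothetical, not DPLA’s actual analytics schema):

```python
from collections import Counter

# Hypothetical event log: (filter_name, whether a result click followed).
events = [
    ("subject", True), ("subject", False), ("subject", True),
    ("location", True), ("location", True),
    ("type", False), ("type", True), ("date", False),
]

# How often each filter is used, and how often use leads to a click.
usage = Counter(name for name, _ in events)
clicks = Counter(name for name, clicked in events if clicked)

# Rank filters by usage and report a click-through rate for each.
for name, count in usage.most_common():
    ctr = clicks[name] / count
    print(f"{name}: used {count} times, click-through {ctr:.0%}")
```

Ranking by raw usage and by click-through can disagree, which is exactly the kind of tension the post describes between predicted and observed filter value.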
As is often the case with user-driven decision-making, our findings surprised us. We had predicted that filters with the best-quality metadata would prove most useful, but that was not always the case. Ultimately, we moved the most in-demand filters, like subject and location, to the top of the page, and bumped the lesser-used filters, like type and date, to the bottom.

Making room for new ebooks features
DPLA is actively working toward an innovative future for ebooks. To make space for this work, we decided to retire the “Bookshelf,” our original interface for browsing the ebooks collection. Developed by the Harvard Innovation Lab, the “Bookshelf” provided a unique search experience that will continue to inform our work with online search and ebooks.

No bugs, please!
Staying on top of bugs and layout issues – especially those that our community members take the time to report – is an essential component of usability. We identified and fixed many during the project. Thanks especially to everyone who has chatted with us or contacted us about bugs on the website.

Future work
This round of improvements is but one component of our community’s ongoing efforts to improve usability of and access to digital materials. While small, low-cost improvements like these will make an immediate positive impact, we are also actively engaged in conversations about improved metadata quality, new technologies, and stronger community relationships.
Avoid the heat and stay inside on August 4 for a free webinar on “Rightsstatements.org: Communicating Copyright Status through Metadata.”
Rightsstatements.org is a collaborative project of DPLA and Europeana to create a standard way of communicating the copyright status of works in digital collections. The project is built on the idea that accurate and clear rights statements are needed to help both organizations and users understand how they can use digital collections. This session, led by members of the working group that developed the statements, will address what Rightsstatements.org does, how it was created, and its first stages of adoption by digital libraries both in the US and abroad.
Emily Gore, Director of Content at DPLA and Rights Statements Working Group Co-Chair
David Hansen, Clinical Assistant Professor & Faculty Research Librarian at UNC School of Law and Rights Statements Working Group Member
Day/Time: Thursday, August 4, at 2pm Eastern/11am Pacific for this free, hour-long webinar.
Go to http://ala.adobeconnect.com/copytalk/ and sign in as a guest. You’re in.
This program is brought to you by OITP’s copyright education subcommittee.
This blog post was written by Simon Matet and Antoine Dusséaux. A French version follows the English one.
Open data is often considered first as a way to foster economic growth through innovative services built on public data. However, beyond this economic perspective, important though it may be, access to public sector information should be seen first and foremost as an unprecedented opportunity to bridge the gap between governments and their citizens. By providing better access to fundamental public services and promoting transparency and accountability, open data has the potential to guarantee greater respect for fundamental human rights. In this respect, access to case law (the law developed by judges through court decisions) could become a pioneering application of open data to improve our democratic societies.
According to the European Court of Human Rights (ECHR), publicity of court decisions, “by making the administration of justice transparent”, is a condition for a fair trial, the guarantee of which is one of the fundamental principles of any democratic society. There is no concrete publicity without free access for every citizen to court records. This is why the ECHR considers that the ability of any citizen to obtain copies of judgments, without needing to show a legitimate interest, “protects litigants against the administration of justice in secret” and “is also one of the means whereby confidence in the courts can be maintained”. Furthermore, according to the European Parliament, “certain aspects of (in)accessibility of Court files cause serious legal problems, and may, arguably, even violate internationally recognised fundamental human rights, such as equality of arms.”
For those reasons, dissemination of case law is a public service task all over the world. However, accessing court documents can prove daunting for untrained private citizens, reporters, and NGOs. In some countries, corporations or charities have captured the market for access to judicial precedents as governments proved unable or unwilling to fulfill this key mission. For instance, an important part of English judge-made law is owned by a private charity, the Incorporated Council of Law Reporting. In other countries, decisions are sold by courts to private legal publishers. For example, the Administrative Office of the US Courts collects $145 million in court-record access fees every year. As a result, citizens usually have access to only a small selection of court decisions.
However, modern communication technologies and digitization now make it possible to provide free online access to millions of public court documents.
Open legal data would both guarantee respect for fundamental rights and increase legal certainty. Citizens need to know not only the law as written in codes and statutes, but also its concrete application and interpretation by the courts. Free access to court records can therefore help litigants prepare for trial, for instance when assessing whether to negotiate a settlement. In the 21st century, the Internet must be seen as a valuable opportunity to enhance the transparency of the judiciary and improve legal certainty.
Open case-law data shows that, beyond the economic gains, access to and reuse of public sector information is a fundamental instrument for extending the right to knowledge, a basic principle of democracy and a matter of human rights in the information age. The judiciary should not be left behind in the ongoing digital transformation of public policies. Some countries, such as the Netherlands, have already made great efforts to give citizens free access to a large body of court decisions while respecting litigants’ privacy, but most countries still have a long way to go. Although access to legislation is already included in the Open Data Index by Open Knowledge, it only requires all national laws and statutes to be available online, not judge-made law. Since case law is an important source of law, especially in common law countries, it should be included in the legislation dataset in future versions of the Open Data Index.
Austin, TX  In two weeks the 2016 VIVO Conference, Aug 17-19, is set to kick off in Denver!
Please join us on Friday August 19 at 2:15 PM for a panel discussion on "Persistent Identifiers: Where we are now and the role of Networking Profile Systems in Shaping Their Future" with invited panelists:
DuraSpace News: VIVO Updates for July 17–New Steering and Leadership Group Members, VIVO 1.9 Testing, VIVO and SHARE Synergy
From Mike Conlon, VIVO Project Director
Steering Group Updates

The VIVO Leadership Group has nominated new members for the VIVO Steering Group to serve three-year terms. Mark Fallu of the University of Melbourne, Mark Newton of Columbia University, and Paul Albert of Weill Cornell Medicine have each agreed to serve. Please join me in welcoming them to the Steering Group! Short biographies below:
Austin, TX  The Islandora Foundation held its Annual General Meeting recently and adopted a set of strategic goals to focus development efforts in 2016-2017. The goals include a strong commitment to the CLAW project, the next version of Islandora, which supports Fedora 4 and the Portland Common Data Model (PCDM) and provides interoperability at the Fedora level.
Last updated July 26, 2016. Created by Peter Murray on July 26, 2016.
The Princeton University Library is looking at hosting another Blacklight Summit this fall, with tentative dates of Wednesday, November 2nd through Friday, November 4th. We are thinking the event will be organized similarly to last year’s format, with demonstrations of Blacklight-powered applications, sessions about enhancing Blacklight applications, and ample time for community roadmapping, code exchange, and development.
Library of Congress: The Signal: Recommended Formats Statement: Expanding the Use, Expanding the Scope
This is a guest post by Ted Westervelt, head of acquisitions and cataloging for U.S. Serials – Arts, Humanities & Sciences at the Library of Congress.
As summer has fully arrived now, so too has the revised 2016-2017 version of the Library of Congress’s Recommended Formats Statement.
When the Library of Congress first issued the Recommended Formats Statement, one aim was to provide our staff with guidance on the technical characteristics of formats, which they could consult in the process of recommending and acquiring content. But we were also aware that preservation and long-term access to digital content is an interest shared by a wide variety of stakeholders and not simply a parochial concern of the Library. Nor did we have any mistaken impression that we would get all the right answers on our own or that the characteristics would not change over time. Outreach has therefore been an extremely important aspect of our work with the Recommended Formats, both to share the fruits of our labor with others who might find them useful and to get feedback on ways in which the Recommended Formats could be updated and improved.
We are grateful that the Statement is proving of value to others, as we had hoped. Closest to home, as the Library and the Copyright Office begin work on expanding mandatory deposit of electronic-only works to include eBooks and digital sound recordings, they are using the Recommended Formats as the starting point for the updates to the Best Edition Statement that will result from this. But its value is being recognized outside of our own institution.
The American Library Association’s Association for Library Collections & Technical Services has recommended the Statement as a resource in one of its e-forums. And even farther afield, the UK’s Digital Preservation Coalition included it in its Digital Preservation Handbook this past autumn, bringing the Statement to a wider international audience.
The Statement has even caught the attention of those who fall outside the usual suspects of libraries, creators, publishers and vendors. Earlier this year, we were contacted by a representative from an architectural software firm. He (and others in the architectural field) has been concerned about the potential loss of architectural plans, as architectural files are now primarily created in digital formats with little thought as to their preservation. Though the Library of Congress has a significant Architecture, Design and Engineering collection, this is a community that overlaps little with our own. But he saw the intersection between the Recommended Formats and the needs of his own field and he came to us to see how the Recommended Formats might relate to digital files and data produced within the fields of architecture, design and engineering and how they might help encourage preservation of those creative works as well. This, in turn, led to the addition of Industry Foundation Classes — a data model developed to facilitate interoperability in the building industry — to the Statement. We hope it will lead to future interest, not simply from the architectural community but from any community of creators of digital content who wish their creations to last and to remain useful.
We have committed to an annual review and revision of the Recommended Formats Statement to ensure its usefulness to as wide a spectrum of stakeholders as possible. In doing so, we hope to encourage others to offer their knowledge and to prevent the Statement from falling out of sync with the technical realities of the world of digital creation. As we progress down this path, one of the benefits is that the changes each year to the hierarchies of technical characteristics and metadata become fewer and fewer. More and more stakeholders have provided their input already and, happily, the details of how digital content is created are not so revolutionary as to need to be completely rewritten annually. This allows for a sense of stability in the Statement without a sense of inertia. It also allows us to engage with types of digital creation that we might not previously have addressed as closely or directly. This is proving to be the case with digital architectural plans, and it is proving to be even more the case with the biggest change to the Recommended Formats in this new edition: the inclusion of websites as a category of creative content.
At the time of the launch of the first iteration of the Recommended Formats Statement, websites per se were not included as a category of creative content. This omission was the result of various concerns and perspectives held at the time, but it was an omission nonetheless. Of all the types of digital works, websites are probably the most open to creation and dissemination and probably the most common digital works available to users, but also not something that content creators have tended to preserve.
Unsurprisingly, this also tends to make them the type of digital creation that causes the most concern to those interested in digital preservation. So when the Federal Web Archiving Working Group reached out about how the Recommended Formats Statement might be of use in furthering the preservation of websites, this filled a notable gap in the Statement.
Naturally, the new section of the Statement on websites is not being launched into a vacuum. The prevalence of websites and much of their development is predicated on enhancing the user experience, whether in creating them or in using them, which is not the same as encouraging their preservation. The Statement’s section on websites therefore focuses specifically on the actions and characteristics that encourage a website’s archivability, and thereby its preservation and long-term use.
Nor does the Statement ignore the work that has been done already by other groups and other institutions to inform content creators of best practices for preservation-friendly websites, but instead builds upon them and links to them from the Statement itself. The intention of this section on websites is twofold. One is to provide a clear and simple reminder of the importance of considering the archivability of a website when creating it, not merely the ease of creating it and the ease of using it. The other is to bring together those simple actions along with links to other guidance in order to provide website creators with easy steps that they can take to ensure the works in which they are investing their time and energy can be archived and thereby continue to entertain, educate and inform well into the future.
As always, the completion of the latest version of the Recommended Formats Statement means the beginning of a new cycle, in which we shall work to make it as useful as possible. Having the community of stakeholders involved with digital works share a common commitment to the preservation and long-term access of those works will help ensure we succeed in saving these works for future generations.
So, use and share this version of the Statement and please provide any and all comments and feedback on how the 2016-2017 Recommended Formats Statement might be improved, expanded or used. This is for anyone who can find value in it; and if you think you can, we’d love to help you do so.
FOR IMMEDIATE RELEASE
Duluth, Georgia–July 26, 2016
Equinox is proud to announce that Altoona Area Public Library was added to SPARK, the Pennsylvania Consortium overseen by PaILS. Equinox has been providing full hosting, support, and migration to PaILS since 2013. In that time, SPARK has seen explosive growth. As of this writing, 105 libraries have migrated or plan to migrate within the next year. Over 3,000,000 items have circulated in 2016 to over 550,000 patrons. We are thrilled to be a part of this amazing progress!
Altoona went live on June 16. Equinox performed the migration and also provided training to Altoona staff. Altoona is the first of eight libraries coming together to form the Blair County Library System. This is the first SPARK migration in which libraries within the same county are on separate databases and are merging patrons to share resources within a unified system. Altoona serves 46,321 patrons with 137,392 items.
Mary Jinglewski, Equinox Training Services Librarian, had this to say about the move: “I enjoyed training with Altoona Area Public Library, and I think they will be a great member of the PaILS community moving forward!”
About Equinox Software, Inc.
Equinox was founded by the original developers and designers of the Evergreen ILS. We are wholly devoted to the support and development of open source software in libraries, focusing on Evergreen, Koha, and the FulfILLment ILL system. We wrote over 80% of the Evergreen code base and continue to contribute more new features, bug fixes, and documentation than any other organization. Our team is fanatical about providing exceptional technical support. Over 98% of our support ticket responses are graded as “Excellent” by our customers. At Equinox, we are proud to be librarians. In fact, half of us have our ML(I)S. We understand you because we *are* you. We are Equinox, and we’d like to be awesome for you. For more information on Equinox, please visit http://www.esilibrary.com.
About Pennsylvania Integrated Library System
PaILS is the Pennsylvania Integrated Library System, a non-profit corporation that oversees SPARK, an open source ILS built on Evergreen. PaILS is governed by a 9-member Board of Directors; the SPARK User Group members make recommendations to and inform the Board. A growing number of libraries, large and small, are PaILS members.
For more information about PaILS and SPARK, please visit http://sparkpa.org/.
Evergreen is an award-winning ILS developed with the intent of providing an open source product able to meet the diverse needs of consortia and high-transaction public libraries. However, it has proven equally successful in smaller installations, including special and academic libraries. Today, over 1,400 libraries across the US and Canada are using Evergreen, including NC Cardinal, SC LENDS, and B.C. Sitka.
For more information about Evergreen, including a list of all known Evergreen installations, see http://evergreen-ils.org.
Sarah Houghton (@TheLiB) summarizes what her team has learned about serving older adults with memory issues. We can make accommodations in our design, too. In May, Laurence Ivil and Paul Myles wrote Designing A Dementia-Friendly Website, which makes the point that
An ever-growing number of web users around the world are living with dementia. They have very varied levels of computer literacy and may be experiencing some of the following issues: memory loss, confusion, issues with vision and perception, difficulties sequencing and processing information, reduced problem-solving abilities, or problems with language. Just when we thought we had inclusive design pegged, a completely new dimension emerges.
I think specifically their key lessons about layout and navigation are really good.
What’s more, as patrons these people may be even more vulnerable because, as Sarah says, libraries are trusted entities. So these design decisions demand even greater consideration.
Libraries are uniquely positioned to see changes in our regular users. We have people who come in all the time, and we can see changes in their behavior, mood, and appearance that others who see them less often would never recognize. Likewise, libraries and librarians are trusted entities–you may have people being more open and letting their guard down with you in a way that lets you observe what’s happening to them more directly. Finally, people who work in libraries generally really care a lot about other people–and that in-built sensitivity and care can help when seeing a change in someone’s mental health and abilities.

Sarah Houghton
The post Library Services for People with Memory Loss, Dementia, and Alzheimer’s appeared first on LibUX.
Last week marked the official start of the 2016 Congressional App Challenge, an annual nationwide event to engage student creativity and encourage participation in STEM (science, technology, engineering, and math) and computer science (CS) education. The Challenge allows high school students from across the country to compete against their peers by creating and exhibiting their software application (or app) for mobile, tablet, or computer devices. Winners in each district will be recognized by their Member of Congress. The Challenge is sponsored by the Internet Education Foundation and supported by ALA.
Why coding at the library? Coding could come across as the latest learning fad, but skills developed through coding align closely with core library activities such as critical thinking, problem solving, collaborative learning, and now connected learning and computational thinking. Coding in libraries is a logical progression in services for youth.
The App Challenge can be another means to engage teens at your library. Libraries can encourage students to participate by holding an App Challenge event: host an “App-a-thon,” have a game night for teens to work on their apps, or start an app-building club.
At the launch, over 140 Members of Congress from 38 states signed up to participate in the 2016 Congressional App Challenge. Check to see if your district is participating and if not, you can use a letter template on the Challenge Website to send a request to your Member of Congress.
If you do decide to participate we encourage you to share what you’re doing using the App Challenge hashtag #HouseofCode and ALA’s hashtag #readytocode @youthandtech. The App Challenge runs through November 2. Look for more information throughout the competition.
The post Coding at the library? Join the 2016 Congressional App Challenge appeared first on District Dispatch.
However, the raw citation data used here are not publicly available but remain the property of Thomson Reuters. A logical step to facilitate scrutiny by independent researchers would therefore be for publishers to make the reference lists of their articles publicly available. Most publishers already provide these lists as part of the metadata they submit to the Crossref metadata database and can easily permit Crossref to make them public, though relatively few have opted to do so. If all Publisher and Society members of Crossref (over 5,300 organisations) were to grant this permission, it would enable more open research into citations in particular and into scholarly communication in general. In other words, despite the importance of the citation graph for understanding and measuring the output of science, the data are in private hands, and are analyzed by opaque algorithms to produce a metric (journal impact factor) that is easily gamed and is corrupting the entire research ecosystem.
Publishers already depositing their citations with Crossref can make them public simply by asking Crossref to flip a bit, but only a few have done so.
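Whether a publisher has flipped that bit is visible in the public Crossref REST API: a work’s record (fetched via GET https://api.crossref.org/works/{doi}) includes a "reference" array only when the publisher has made its reference list public. A minimal sketch, using a hard-coded, abridged sample of that response shape (the DOIs here are invented placeholders):

```python
import json

# Abridged, illustrative example of the JSON shape the Crossref works
# endpoint returns; the DOIs are hypothetical.
sample = json.loads("""
{
  "message": {
    "DOI": "10.1234/example",
    "reference-count": 2,
    "reference": [
      {"key": "ref1", "DOI": "10.5678/cited-one"},
      {"key": "ref2", "unstructured": "Smith, J. (2001). A cited work."}
    ]
  }
}
""")

# If the publisher has not opted in, "reference" is simply absent.
refs = sample["message"].get("reference", [])
cited_dois = [r["DOI"] for r in refs if "DOI" in r]
print(f"{len(refs)} references deposited, {len(cited_dois)} with DOIs")
```

A real client would substitute an HTTP request for the hard-coded sample; the point is that open reference lists make the citation graph traversable by anyone, not just licensees of proprietary databases.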
Larivière et al.’s painstaking research shows that journal publishers and others with access to these private databases (Web of Science and Scopus) can use them to graph the distribution of citations to the articles they publish. Doing so reveals that:
the shape of the distribution is highly skewed to the left, being dominated by papers with lower numbers of citations. Typically, 65-75% of the articles have fewer citations than indicated by the JIF. The distributions are also characterized by long rightward tails; for the set of journals analyzed here, only 15-25% of the articles account for 50% of the citations

Thus, as has been shown many times before, the impact factor of a journal conveys no useful information about the quality of a paper it contains. Further, the data on which it is based are themselves suspect:
On a technical point, the many unmatched citations ... that were discovered in the data for eLife, Nature Communications, Proceedings of the Royal Society: Biology Sciences and Scientific Reports raises concerns about the general quality of the data provided by Thomson Reuters. Searches for citations to eLife papers, for example, have revealed that the data in the Web of ScienceTM are incomplete owing to technical problems that Thomson Reuters is currently working to resolve. ...

Because the citation graph data are not public, audits such as Larivière et al.’s are difficult and rare. Were the data public, both publishers and authors would be able, and motivated, to improve them. It is perhaps a straw in the wind that Larivière’s co-authors include senior figures from PLoS, AAAS, eLife, EMBO, Nature and the Royal Society.
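The skew Larivière et al. describe is easy to reproduce with a toy example. The counts below are synthetic, chosen only to mimic a heavy right tail, not taken from the paper:

```python
# Toy citation counts for the articles of one hypothetical journal.
citations = [0, 1, 1, 2, 2, 3, 3, 4, 5, 6, 8, 10, 15, 25, 35]

# The JIF is essentially a mean, so a few highly cited papers drag it
# above what a typical article receives.
jif = sum(citations) / len(citations)
below = sum(1 for c in citations if c < jif)

# How few articles account for half of all citations?
half = sum(citations) / 2
running, n_top = 0, 0
for c in sorted(citations, reverse=True):
    running += c
    n_top += 1
    if running >= half:
        break

print(f"mean (JIF-like figure) = {jif:.1f}")
print(f"{below}/{len(citations)} articles are cited less than the mean")
print(f"the top {n_top} articles account for half of all citations")
```

With these numbers, two-thirds of the articles fall below the mean and two articles out of fifteen collect half the citations, matching the ranges quoted from the paper: the mean says almost nothing about a typical paper in the journal.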
LITA: Stop Helping! How to Resist All of Your Librarian Urges and Strategically Moderate a Pain Point in Computer-Based Usability Testing
Editor’s note: This is a guest post by Jaci Paige Wilkinson.
Librarians are consummate teachers, helpers, and cheerleaders. We might glow at the reference desk when a patron walks away with that perfect article or a new search strategy. Or we fist pump when a student e-mails us at 7pm on a Friday to ask for help identifying the composition date of J.S. Bach’s BWV 433. But when we lead usability testing that urge to be helpful must be resisted for the sake of recording accurate user behavior (Krug, 2000). We won’t be there, after all, to help the user when they’re using our website for their own purposes.
What about when a participant gets something wrong or gets stuck? What about a nudge? What about a hint? No matter how much the participant struggles, it’s crucial for both the testing process and the resulting data that we navigate these “pain points” with care and restraint. This is particularly tricky in non-lab, lightweight testing scenarios. If you have only 10-30 minutes with a participant or you’re in an informal setting, you, as the facilitator, are less likely to have the tools or the time to probe an unusual behavior or a pain point (Travis, 2014). However, pain points, even the non-completion of a task, provide insight. Librarians moderating usability testing must carefully navigate these moments to maximize the useful data they provide.
How should we move the test forward without helping but also without hindering a participant’s natural process? If the test in question is a concurrent think-aloud protocol, you, as the test moderator, are probably used to reminding participants to think out loud while they complete the test. Those reminders sound like “What are you doing now?”, “What was that you just did?”, or “Why did you do that?”. Drawing from moderator cues used in think aloud protocols, this article explains four tips to optimize computer-based usability testing in those moments when a participant’s activity slows, or slams, to a halt.
There are two main ways for the tips described below to come into play: either the participant specifically asks for help, or you intervene because of a lack of progress. The first case is easy, because the participant has self-identified as experiencing a pain point. In the second case, watch for indicators that the participant is not moving forward or is stalling: they stay on one page for a long time, or they keep pressing the back button. One frequently observed behavior that I never interfere with is a participant repeating a step or click-path even though it didn’t work the first time. This is a very important observation for two reasons: first, does the participant realize that they have already done this? Second, if so, why do they think it will work this time? Observe as many useful behaviors as possible before stepping in. When you do step in, use these tips in this order:
ASK a participant to reflect on what they’ve done so far!
Get your participant talking about where they started and how they got here. You can be as blunt as: “OK, tell me what you’re looking at and why you think it is wrong”. This particular tip has the potential to yield valuable insights. What did the participant THINK they were going to see on the page, and what do they think this page is now? When you look at this data later, consider what it says about the architecture and language of the pages this participant used. For instance, why did she think the library hours would be on the “About” page?
Notice that nowhere have I mentioned using the back button or returning to the start page of the task. This is usually the ideal course of action; once users go backwards through their click-path they can make some new decisions. But this idea should come from the user, not from you. Avoid using language that hints at a specific direction, such as “Why don’t you back up a couple of steps?” This sort of comment is more of a prompt for action than reflection.
Read the question or prompt again! Then ask the participant to pick out key words in what you read that might help them think of different ways to conquer the task at hand.
“I see you’re having some trouble thinking of where to go next. Stop for one moment and listen to me read the question again”. An immediate diagnosis of this problem is that there was jargon in the script that misdirected the participant. Could the participant’s confusion about where to find the “religion department library liaison” be partially due to the fact that he had never heard of a “department library liaison” before? Letting the participant hear the prompt for a second or third time might allow him to connect language on the website with language in the prompt. If repetition doesn’t help, you can even ask the participant to name some of the important words in the prompt.
Another way to assist a participant with the prompt is to provide him with his own copy of the script. You can also ask him to read each task or question out loud: in usability testing, it has been observed that this direction “actually encouraged the ‘think aloud’ process” that is frequently used (Battleson et al., 2001). As Ericsson and Simon note of think-aloud protocols, this “additional cognitive activity changes the sequence of mediating thoughts. Instructions to explain and describe the content of thought are reliably associated with changes in ability to solve problems correctly” (Ericsson & Simon, 1993). Reading the prompt on a piece of paper with his own eyes, especially in combination with hearing you speak the prompt out loud, gives the participant multiple ways to process the information.
Choose a Point of No Return and don’t treat it as a failure.
Don’t let an uncompleted or unsuccessful task tank your overall test. Wandering off with the participant will make the pace sluggish and reduce the participant’s morale. Choose a point of no return. Have an encouraging phrase at the ready: “Great! We can stop here, that was really helpful. Now let’s move on to the next question”. There is an honesty to that phrasing: you demonstrate to your participant that what he is doing, even if he doesn’t think it is “right”, is still helpful. It is an unproductive use of your time, and his, to let him continue if you aren’t collecting any more valuable data in the process. The attitude cultivated at an uncompleted task or pain point will definitely impact performance and morale for subsequent tasks.
Include a question at the end that lets the participant share comments or feelings about the test.
This is a tricky and potentially controversial suggestion. In usability testing and user experience, the distinction between studying use and studying opinion is crucial. We seek to observe user behavior, not collect feedback. That’s why we scoff at market research and regard focus groups suspiciously (Nielsen, 1999). However, I still recommend ending a usability test with a question like “Is there anything else you’d like to tell us about your experience today?” or “Do you have any questions or further comments or observations about the tasks you just completed?” I ask it specifically because if there were one or more pain points in the course of a test, the participant will likely remember them. This gives her the space to give you more interesting data and, as with tip number three, this final question cultivates positive morale between you and the participant. She will leave your testing location feeling valued and listened to.
As a librarian, I know you were trained to help, empathize, and cultivate knowledge in library users. But usability testing is not the same as a shift at the research help desk! Steel your heart for the sake of collecting wonderfully useful data that will improve your library’s resources and services. Those pain points and unfinished tasks are solid gold. Remember, too, that you aren’t asking a participant to “go negative” on the interface (Wilson, 2010) or manufacture failure; you are interested in recording the most accurate user experience possible and understanding the behavior behind it. Use these tips, if not word for word, then at least as a prompt to reflect on the environment you curate when conducting usability testing and how to optimize data collection.
Battleson, B., Booth, A., & Weintrop, J. (2001). Usability testing of an academic library web site: A case study. The Journal of Academic Librarianship, 27(3), 188–198.
Ericsson, K. A., & Simon, H. A. (1993). Protocol analysis: Verbal reports as data (Rev. ed.). MIT Press.
Travis, D. (2014, October 12). 5 provocative views on usability testing. UserFocus. <http://www.userfocus.co.uk/articles/5-provocative-views.html>
Nielsen, J. (1999, December 12). Voodoo usability. Nielsen Norman Group. <https://www.nngroup.com/articles/voodoo-usability/>
Wilson, M. (2010, May 25). Encouraging negative feedback during user testing. UX Booth. <http://www.uxbooth.com/articles/encouraging-negative-feedback-during-user-testing/>
Islandora: Dispatches from the User List: new tools for creating and ingesting derivatives outside of a production Islandora
Open Knowledge Foundation: Sinar Project in Malaysia works to open budget data at all levels of government
“Open Spending Data in Constrained Environments” is a project led by Sinar Project in Malaysia, aimed at exploring ways of making critical information public and accessible to Malaysian citizens. The project is supported by the Open Data for Development programme and has been run in collaboration with Open Knowledge International and OpenSpending.
In Malaysia, fiscal information exists at all three levels of government: federal, state, and municipal. Complicated relationships and laws dictate how budgets flow through the different levels of government and, as the information is not published as open data by any level of government, it is incredibly challenging for citizens to understand and track how public funds are being spent. This lack of transparency creates an environment for potential mismanagement of funds and facilitates corruption.
Earlier this year, the prime minister of Malaysia, Dato’ Seri Najib Razak, announced a revised budget for 2016 in response to slow economic growth, a result of declining oil and commodity prices coupled with stagnant demand from China. It was paramount to restructure the 2016 federal budget in order to find savings of US$2.1 billion, making it possible for the government to maintain its 2016 fiscal deficit target of 3.1 percent of the country’s GDP. One of the biggest cuts in the revised 2016 budget was to public scholarships for higher education.
“Higher education institutions had their budget slashed by RM2.4 billion (US$573 million), from RM15.78 billion (US$3.8 billion) in 2015 to RM13.37 billion (US$3.2 billion) for the year 2016.” – Murray Hunter, Asian Correspondent
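As a quick sanity check, the figures in the quote above can be verified with a few lines of Python (the small discrepancies against the quoted RM2.4 billion are rounding):

```python
# Sanity check of the quoted higher-education budget cut (amounts in RM billion).
allocation_2015 = 15.78
allocation_2016 = 13.37

cut = allocation_2015 - allocation_2016   # RM2.41 billion, quoted as "RM2.4 billion"
cut_pct = cut / allocation_2015 * 100     # share of the 2015 allocation that was cut

print(f"Cut: RM{cut:.2f} billion ({cut_pct:.1f}% of the 2015 allocation)")
```

Roughly fifteen percent of the previous year’s allocation disappeared in a single revision, which is exactly the scale of impact the next paragraph argues is hard for people to grasp from raw numbers.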
When numbers get this big, it is often difficult for people to understand the real impact and implications these cuts will have on the services citizens depend on. While it is the role of journalists and civil society to act as infomediaries and relay this information to citizens, without access to comprehensive, reliable budget and spending data it becomes impossible for us to fulfil our civic duty of keeping citizens informed. Open budget and spending data is vital for demonstrating to the public the real-life impact large budget cuts will have. Over the past few months, we have worked on a pilot project to try to make this possible.
While the federal budgets presented to Parliament are accessible on the Ministry of Finance website, we were only able to access state and municipal government budgets by directly contacting state assemblymen and local councillors.
Given this lack of proactive transparency and the limited mechanisms for reactive transparency, it was necessary to employ an alternative mechanism for holding governments accountable. In this case, we decided to conduct a social audit.
Social audits are mechanisms in which community members collect evidence to publicly audit, as a community, the provision of services by government. One essential component of a social audit is working closely with traditionally disenfranchised communities in order to connect with and empower them.
Here in Malaysia, we started our social audit work by conducting several meetings with communities living in public housing in Kota Damansara, a town in the district of Petaling Jaya in Selangor State, in order to gain a better understanding of the challenges they were facing and to map these issues against various socio-economic and global development indicators.
Then, we conducted an urban poverty survey in which we collected essential data on 415 residents from 4 blocks of the Kota Damansara public housing. The survey covered several indicators that told us more about the poverty rate, the unemployment rate, the child mortality rate, and the literacy rate within this community. Preliminary results show that all of the surveyed residents are low-income earners currently living below the poverty line. These findings stand in contrast to an answer given in Parliament last year to a question on the income distribution of the nation’s residents, where it was declared that the share of people in poverty in Malaysia had decreased to about 0.421%. Moreover, in order for citizens to hold the Selangor state government accountable, civil society could use this data as evidence to demand increased budget allocations for financial and welfare support to the disenfranchised communities in the Kota Damansara public housing.
What’s next? In order to measure the impact of open data and social audits, we are planning follow-up urban poverty surveys. Since the upcoming general elections will be held in 2018, follow-up surveys will be conducted every four months after the first survey, to document whether decision makers in the respective constituency make any changes or improvements toward better policies and set budget priorities that match the proposed and approved public policies.
Austin, TX - The Fedora Project is pleased to announce that Fedora Camp in NYC, hosted by Columbia University Libraries, will be offered at Columbia University’s Butler Library in New York City, November 28-30, 2016.
DuraSpace News: CALL for Expressions of Interest in Hosting Annual Open Repositories Conference, 2018 and 2019
From William Nixon and Elin Stangeland for the Open Repositories Steering Committee
Glasgow, Scotland - The Open Repositories Steering Committee seeks Expressions of Interest (EoI) from candidate host organizations for the 2018 and 2019 Open Repositories Annual Conference series. The call is issued for two years this time to enable better planning ahead of the conferences and to secure a good geographical distribution over time. Proposals from all geographic areas will be given consideration.
It’s that time of year again! We’re asking you to nominate yourself, or someone you know, who would be a great addition to the panel of speakers for the 2017 Midwinter Top Tech Trends program in Atlanta, GA.
LITA’s Top Trends Program has traditionally been one of the most popular programs at ALA. Each panelist discusses two trends in technology impacting libraries and engages in a moderated discussion with each other and the audience.
Submit a nomination at: http://bit.ly/lita-toptechtrends-mw2017. Deadline is Sunday, August 28th.
The LITA Top Tech Trends Committee will review each submission and select panelists based on their proposed trends, their experience, and the overall balance of the panel.
For more information about past programs, please visit http://www.ala.org/lita/ttt.