Feed aggregator

HangingTogether: A Research and Learning Agenda for Archives, Special, and Distinctive Collections

planet code4lib - Tue, 2017-05-23 15:25

[This post is contributed by our Practitioner Researcher-in-Residence, Chela Scott Weber]


Some of you may have seen the recent announcement that I’m working with the good folks here at OCLC Research through the end of June, to help shape a research and learning agenda around issues related to archives, special, and distinctive collections in research libraries. In this and a series of upcoming blog posts, I’ll be sharing details about that work. Today I’ll talk a little bit about goals and process, and in later posts I’ll talk more about content.

The goals for this work are twofold. First, to build a guiding agenda for the OCLC Research work in this space over the next several years that is truly aligned with current and emerging needs in the OCLC Research Library Partnership community. Second, to engage in a transparent and iterative approach to building the agenda, with significant input from the RLP. While I’m leading the effort, I’m certainly not doing it alone. Merrilee Proffitt and Jackie Dooley are active collaborators, as is an advisory board whose members have generously offered their time and expertise to meet with and advise me regularly throughout the process. This group is Athena Jackson (Penn State), Erin O’Meara (University of Arizona), Michelle Light (UNLV), Sharon Farb (UCLA), and Tom Hyry (Harvard/Houghton Library).

I’m now more than a month into the project, which I started with a review of the last 8-10 years of work OCLC Research has done in this space: reading papers, watching webinar recordings, and revisiting conference proceedings. I did this to get an overview of the work and to identify trajectories that might not yet be complete. I also wanted to get a sense of the full range of outputs and activities undertaken, to inform what approaches might best suit future research needs.

I am currently having conversations with colleagues throughout the profession in order to identify major areas of challenge and opportunity, and then try to drill down to better define the problem spaces and think about what kinds of activities and outputs might be helpful to address them. I’ve been talking to people in leadership roles at RLP institutions, as well as specialists with expertise in specific areas like audio/visual collections and born-digital records.

My hope has been to get a well-rounded sense of how issues play out at different levels of the enterprise, from the overarching view of an administrator to the on-the-ground perspective of the librarians, archivists, and conservators working closely with collections and researchers.

Over the next few weeks, I’ll be working on shaping what I’ve learned through my reading and conversations into a draft research agenda, and will be sharing that draft for feedback in a number of ways. We will be hosting an invitational working meeting in June at the RBMS Conference in Iowa City, where we’ll be convening a small group of leaders from RLP institutions to react to an early stage draft of the agenda, and inform where further work is needed. We’ll then host another similar event in July at the Society of American Archivists Annual Meeting in Portland, asking invited colleagues to engage with the next iteration of the agenda. I’ll also be sharing drafts with colleagues for written feedback throughout. The finalized agenda will be rolled out at the OCLC Research Library Partnership Meeting in November.

Stay tuned for further updates about work on the agenda, and opportunities to give feedback.

About Merrilee Proffitt


DPLA: Interim Executive Director Announced

planet code4lib - Tue, 2017-05-23 14:50

As part of DPLA’s ongoing transition to a new executive director, the Board of Directors has named Michele Kimpton the interim executive director until a new executive director is hired.

Kimpton is currently the Business Development Director and Senior Strategist at DPLA, and has been working on sustainability models, an ebook pilot, technical services, and strengthening DPLA’s member network. She brings to the interim role considerable experience running similar organizations. Prior to joining DPLA, she worked as Chief Strategist for LYRASIS and CEO of DuraSpace.

“DPLA is fortunate to have a strong senior leadership team in place during this transitionary period, and the board looks forward to working with Michele, Director for Content Emily Gore, Director of Technology Michael Della Bitta, and the rest of the staff to continue to further DPLA’s mission in the months ahead,” said the board president, Amy Ryan.

Kimpton will take on the interim role following the departure of founding executive director Dan Cohen on June 1.

District Dispatch: ALA President responds to the Administration’s 2018 budget proposal

planet code4lib - Tue, 2017-05-23 13:58

This morning, ALA issued a statement about the budget proposal released today. You can read it on ALA.org or below.

WASHINGTON, DC — In response to the Trump Administration’s 2018 budget proposal released today, American Library Association (ALA) President Julie Todaro issued the following statement:

“The Administration’s budget is using the wrong math when it comes to libraries.

“To those who say that the nation cannot afford federal library funding, the American Library Association, American businesses and millions of Americans say emphatically we cannot afford to be without it.

“America’s more than 120,000 public, school, academic and special libraries are visited more than 1.4 billion times a year by hundreds of millions of Americans in every corner of the nation. In 2013, 94 percent of Americans said that having a public library improves the quality of life in a community and the same percentage of parents said that libraries are important for their children.

“Over 80 major companies and trade associations from multiple sectors of the economy called libraries ‘critical national infrastructure’ in a letter to all Senators asking them to support the very agency and programs that the Administration has just proposed to effectively eliminate.

“We and those we serve will collaborate with our stakeholders, business allies and the more than one-third of all Members of Congress who have already pledged their support in writing to preserve critical library funding for FY 2018 through the Institute of Museum and Library Services and to save the agency itself, as well as other vital programs in other agencies that help millions of Americans.”

The post ALA President responds to the Administration’s 2018 budget proposal appeared first on District Dispatch.

Islandora: Islandora CLAW FAQ

planet code4lib - Tue, 2017-05-23 12:34

Last week was Islandoracon, our community's biggest gathering. We had a great week (and there will be more on that in another post), and a chance to unveil an early alpha version of the Islandora CLAW Minimum Viable Product. This first look at the product also kicked off a lot of questions, so we decided to gather them together with some answers:

When will Islandora CLAW be done?

Islandora CLAW won’t be done until it is deprecated in favor of whatever comes after it in the distant future. Islandora is an active community that constantly builds new tools and improves existing ones.

The Islandora CLAW MVP is scheduled for beta release at the end of June, 2017. The timeline for a full release will depend on community engagement and what features we map out together as necessary for the next phase.

The Islandora CLAW MVP does not do [thing that we really really need]. Are we going to be left behind?

The Islandora CLAW Minimum Viable Product is just a jumping-off point. Since we recognize that it can be challenging to review and comment meaningfully on a concept or a technical spec, the MVP version of CLAW is intended to give the Islandora community a tangible product to work with so that you can engage with the project and help to make sure your use cases are a part of the software as development continues.

Completing the MVP is a beginning for more community-driven development, with a very basic start on a product that the community can now test out and respond to.

How do I join in?

A good place to start is the CONTRIBUTING.md file included on all Islandora CLAW modules. It outlines how to submit a use case, feature request, improvement, or bug report. It also has details about our weekly meetings (‘CLAW Calls’), which are open for anyone to join.

While the meetings may seem very technical, we really mean it when we say anyone is welcome to add items to the agenda. If we seem to spend most of our calls discussing very technical issues, that’s because we fall back on tickets and issues when no one has given us something more general to dig into. If you have questions or concerns, putting them on the agenda ensures that there is time and attention reserved for what you need to discuss.

You are also welcome to join the call and not say a thing. We take attendance, but that’s all the participation that’s required. If you would like to just listen to the discussion and get a feel for how things are going, lurking is a popular option, and a way that some very active contributors got their start.

You can also learn more about Islandora CLAW from these introductory pages:

Details of the MVP are here.

What is the MODSPOCALYPSE? Are we losing MODS in CLAW?

The term “MODSPOCALYPSE” is an exaggeration made in jest about the fact that Islandora CLAW will have to deal with legacy MODS XML in a linked data/RDF world. While CLAW handles RDF as its native language (like Fedora 4), MODS is doable if we put in the work. The challenge is in mapping MODS to RDF, and that’s something we need to do as a community. If we can come together and agree on a standard mapping, the technical implementation will be relatively easy.
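As a rough illustration of what the technical side of such a mapping involves, here is a minimal Python sketch. The element-to-predicate table is hypothetical (two simple Dublin Core choices), not the community-agreed mapping the post calls for:

```python
import xml.etree.ElementTree as ET

MODS_NS = {"mods": "http://www.loc.gov/mods/v3"}

# Hypothetical mapping table; agreeing on the real table is the community's hard part.
MODS_TO_RDF = {
    "mods:titleInfo/mods:title": "http://purl.org/dc/terms/title",
    "mods:name/mods:namePart": "http://purl.org/dc/terms/creator",
}

def mods_to_ntriples(mods_xml, subject_uri):
    """Extract mapped MODS fields and emit them as N-Triples lines."""
    root = ET.fromstring(mods_xml)
    triples = []
    for xpath, predicate in MODS_TO_RDF.items():
        for element in root.findall(xpath, MODS_NS):
            if element.text:
                triples.append(f'<{subject_uri}> <{predicate}> "{element.text}" .')
    return triples
```

In a real repository the hand-built strings would be replaced by an RDF library and the subject URI would come from the object in Fedora; the point is only that once the element-to-predicate table exists, the transformation itself is mechanical.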

Because this is not just an issue for Islandora, a lot of work has already been done by the MODS and RDF Descriptive Metadata Subgroup in the Hydra community. To help achieve this vital mapping, please join the Islandora Metadata Interest Group as it takes the lead on community discussions for Islandora.

Instead of a MODSPOCALYPSE, let’s consider this our “RDFnaissance.”

Will we have XML Form Builder in Islandora CLAW?

XML Form Builder is an amazing tool that plays an important role in Islandora 7.x. It is also an extremely complex tool that carries a significant maintenance burden that is challenging to meet even in the 7.x stack. Reproducing it in Islandora CLAW is unlikely to happen unless an institution or group in the community adopts it as a project and donates the work to the Islandora Foundation.

Editable metadata forms are definitely going to continue to be a part of Islandora CLAW. They are being handled in Drupal, which should be a more sustainable and accessible approach for both developers and end-users.

How long will Islandora 7.x be supported?

Islandora 7.x will be supported as long as the Islandora community needs it to be supported. The goal of developing CLAW is not to push adoption, but to prepare for it when the majority of the Islandora community wants to move. As with other major upgrades we’ve been through, we will likely see a few institutions lead the way with early adoption, with a gradual migration of other sites as more tools are built and the path to migrate is mapped out by those trailblazers. The time to officially end support for Islandora 7.x will be when most of the Islandora community is done with it, just as we did with 6.x.

It’s also important to note that “ending support” does not mean it cannot still be used. We will (eventually, well down the road) end active development of new features and improvements, and then bug fixes on a longer timeline, but there are still many Islandora 6.x sites out in the world more than three years after we officially ended its support. Fedora 3 is itself no longer supported by its community, but it remains a stable platform that hasn’t become less stable for no longer being actively improved.

DuraSpace News: Deployment of ORCID Integration Services

planet code4lib - Tue, 2017-05-23 00:00

From Emilio Lorenzo, Arvo Consulting. With the development of tight integration between DSpace and orcid.org, repositories can obtain numerous advantages by improving data consistency in key information systems. These developments help repositories lower the barriers to ORCID integration and take-up.

FOSS4Lib Upcoming Events: DSpace North American User Meeting

planet code4lib - Mon, 2017-05-22 19:26
Date: Tuesday, August 22, 2017 - 08:00 to Wednesday, August 23, 2017 - 17:00
Supports: DSpace

Last updated May 22, 2017. Created by Peter Murray on May 22, 2017.

DSpace North American User Meeting at Georgetown University Library

Raymond Yee: Fine-tuning a Python wrapper for the hypothes.is web API and other #ianno17 followup

planet code4lib - Mon, 2017-05-22 14:23

In anticipation of #ianno17 Hack Day, I wrote about my plans for the event, one of which was to revisit my own Python wrapper for the nascent hypothes.is web API.

Instead of spending much time on my own wrapper, I spent most of the day working with Jon Udell's wrapper for the API. I've been working on my own revisions of the library but haven't yet incorporated Jon's latest changes.

One nice little piece of the puzzle is that I learned how to introduce retries and exponential backoff into the library, thanks to a hint from Nick Stenning and a nice answer on Stack Overflow.
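The pattern is roughly the following (a generic sketch under my own naming, not Jon Udell's actual code):

```python
import random
import time

def retry_with_backoff(call, max_retries=5, base_delay=1.0, max_delay=60.0):
    """Invoke call(), retrying on exceptions with exponential backoff.

    The delay doubles after each failed attempt (1s, 2s, 4s, ...) up to
    max_delay, with a little random jitter so that many clients hitting
    the same API don't all retry in lockstep.
    """
    for attempt in range(max_retries):
        try:
            return call()
        except Exception:
            if attempt == max_retries - 1:
                raise  # out of retries: surface the original error
            delay = min(base_delay * (2 ** attempt), max_delay)
            time.sleep(delay + random.uniform(0, delay))
```

In a web API wrapper, `call` would be a closure over the HTTP request (for example `lambda: requests.get(url)`), and in practice you would catch only transient failures such as timeouts or HTTP 5xx responses rather than every exception.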

Other matters

In addition to the Python wrapper, there are other pieces of follow-up for me. I hope to write more extensively on those matters down the road but simply note those topics for the moment.

Videos from the conference

I might start by watching videos from the #ianno17 conference: I Annotate 2017 – YouTube. Because I didn’t attend the conference per se, watching them might let me glean insight into two particular topics of interest to me (the role of the page owner in annotations and the intermingling of annotations in ebooks).

An extension for embedding selectors in the URL

I will study and try Treora/precise-links: Browser extension to support Web Annotation Selectors in URIs. I've noticed that the same annotation is shown in two related forms:

Does the precise-links extension let me write the selectors into the URL?

Bohyun Kim: How to Price 3D Printing Service Fees

planet code4lib - Mon, 2017-05-22 13:18

*** This post was originally published in ACRL TechConnect on May 22, 2017. ***

Many libraries today provide 3D printing services, but not all of them can afford to do so for free. While free 3D printing may be ideal, it can jeopardize the sustainability of the service over time. Even so, many libraries worry about charging service fees.

In this post, I will outline how I determined the pricing scheme for our library’s new 3D printing service, in the hope that more libraries will consider offering 3D printing if having to charge a fee is the factor stopping them. But let me begin with libraries’ general aversion to fees.

A 3D printer in action at the Health Sciences and Human Services Library (HS/HSL), Univ. of Maryland, Baltimore

Service Fees Are Not Your Enemy

Charging fees for a library’s services is not something librarians should regard as taboo. We live in times in which a library is being asked to create and provide more and more new and innovative services to help users successfully navigate the fast-changing information landscape. A makerspace and 3D printing are certainly among those new and innovative services. But at many libraries, the operating budget is shrinking rather than increasing. So, the most obvious choice in this situation is to aim for cost-recovery.

It is worth remembering that even when a library aims for cost-recovery, it will only be partial cost-recovery, because a lot of staff time and expertise goes into planning and operating such new services. Libraries should not be afraid to introduce new services requiring service fees, because users will often still benefit from those services far more than from a commercial equivalent (if one exists). Think of service fees as your friend. Without them, you won’t be able to introduce and continue to provide a service that your users need. It is a business cost to be expected, and libraries will not make a profit from it (even if they try).

Still bothered? Almost every library charges for regular (paper) printing. Should a library rather not provide printing service because it cannot be offered for free? Library users certainly wouldn’t want that.

Determining Your Service Fees

What do you need in order to create a pricing scheme for your library’s 3D printing service?

(a) First, you need to list all cost-incurring factors. Those include (i) equipment cost and wear and tear, (ii) electricity, (iii) staff time and expertise for support and maintenance, and (iv) any consumables, such as 3D print filament and painter’s tape. Remember that your new 3D printer will not last forever and will need to be replaced in 3-5 years.

Also, some of these cost-incurring factors, such as staff time and expertise for support, are fixed per 3D print job. Others, such as 3D print filament, increase in proportion to the size and density of the model being printed. That is, the larger and denser a 3D print model is, the more filament it uses, incurring more cost.

(b) Second, make sure that your pricing scheme is readily understood by users. Does it quickly give users a rough idea of the cost before their 3D print job begins? An obscure pricing scheme can confuse users and may deter them from trying out a new service. That would be bad user experience.

Also in 3D printing, consider if you will also charge for a failed print. Perhaps you do. Perhaps you don’t. Maybe you want to charge a fee that is lower than a successful print. Whichever one you decide on, have that covered since failed prints will certainly happen.

(c) Lastly, the pricing scheme should be easy for the library staff to handle. The more library staff are involved in the entire process of a patron using the 3D printing service from beginning to end, the more important this becomes. If the pricing scheme is difficult for the staff to work with when they need to charge for and process each 3D print job, the new 3D printing service will increase their workload significantly.

Which staff will be responsible for which step of the new service? What would be the exact tasks that the staff will need to do? For example, it may be that several staff at the circulation desk need to learn and handle new tasks involving the 3D printing service, such as labeling and putting away completed 3D models, processing the payment transaction, delivering the model, and marking the job status for the paid 3D print job as ‘completed’ in the 3D Printing Staff Admin Portal if there is such a system in place. Below is the screenshot of the HS/HSL 3D Printing Staff Admin Portal developed in-house by the library IT team.

The HS/HSL 3D Printing Staff Admin Portal, University of Maryland, Baltimore

Examples – 3D Printing Service Fees

It’s always helpful to see how other libraries are doing things when you need to determine your own pricing scheme. Here are some examples showing how ten libraries’ 3D printing pricing schemes changed over the past three years.

  • UNR DeLaMare Library
    • https://guides.library.unr.edu/3dprinting
    • 2014 – $7.20 per cubic inch of modeling material (raised to $8.45 starting July, 2014).
    • 2017 – uPrint – Model Material: $4.95 per cubic inch (=16.38 gm=0.036 lb)
    • 2017 – uPrint – Support Materials: $7.75 per cubic inch
  • NCSU Hunt Library
    • https://www.lib.ncsu.edu/do/3d-printing
    • 2014-  uPrint 3D Printer: $10 per cubic inch of material (ABS), with a $5 minimum
    • 2014 – MakerBot 3D Printer: $0.35 per gram of material (PLA), with a $5 minimum
    • 2017 – uPrint – $10 per cubic inch of material, $5 minimum
    • 2017 – F306 – $0.35 per gram of material, $5 minimum
  • Southern Illinois University Library
    • http://libguides.siue.edu/3D/request
    • 2014 – Originally $2 per hour of printing time; Reduced to $1 as the demand grew.
    • 2017 – Lulzbot Taz 5, Luzbot mini – $2.00 per hour of printing time.
  • BYU Library
  • University of Michigan Library
    • The Cube 3D printer checkout is no longer offered.
    • 2017 – Cost for professional 3d printing service; Open access 3d printing is free.
  • GVSU Library
  • University of Tennessee, Chattanooga Library
  • Port Washington Public library
  • Miami University
    • 2014 – $0.20 per gram of the finished print; 2017 – ?
  • UCLA Library, Dalhousie University Library (2014)
    • Free
Types of 3D Printing Service Fees

From the examples above, you will notice that many 3D printing service fee schemes are based upon the weight of a 3D-printed model. This is because these libraries are trying to recover the cost of the 3D filament, and the amount of filament used is most accurately reflected in the weight of the resulting 3D-printed model.

However, there are a few problems with the weight-based 3D printing pricing scheme. First, it is not readily calculable by a user before the print job, because to do so, the user will have to weigh a model that s/he won’t have until it is 3D-printed. Also, once 3D-printed, the staff will have to weigh each model and calculate the cost. This is time-consuming and not very efficient.

For this reason, my library considered an alternative pricing scheme based on the size of a 3D model. The idea was that we would have roughly three sizes of an empty box – small, medium, and large – with three different prices assigned. Whichever box a user’s 3D-printed object fit into would determine how much the user paid for the model. This seemed like a great idea because, compared with the weight-based pricing scheme, it makes it easy for both users and the library staff to determine how much a model will cost to 3D-print.

Unfortunately, this size-based pricing scheme has a few significant flaws. First, a smaller model may use more filament than a larger model if it is denser (that is, has a higher infill ratio). Second, depending on the shape of a model, a model that fits in a large box may use much less filament than one that fits in a small box. Think of a large tree model with thin branches, then compare it with a 100%-filled compact baseball model that fits into a smaller box than the tree does. Third, the resolution that determines layer height can change the amount of filament used even when the same model is printed.

Different infill ratios – Image from https://www.packtpub.com/sites/default/files/Article-Images/9888OS_02_22.png

Charging Based upon the 3D Printing Time

So we couldn’t go with the size-based pricing scheme. But we did not like the problems of the weight-based pricing scheme, either. As an alternative, we decided to go with the time-based pricing scheme because printing time is proportionate to how much filament is used, but it does not require that the staff weigh the model each time. A 3D-printing software gives an estimate of the printing time, and most 3D printers also display actual printing time for each model printed.

First, we wanted to confirm the hypothesis that 3D printing time and the weight of the resulting model are proportionate to each other. I tested this by translating the weight-based cost to the time-based cost based upon the estimated printing time and the estimated weight of several cube models. Here is the result I got using the Makerbot Replicator 2X.

  • 9.10 gm/36 min= 0.25 gm per min.
  • 17.48 gm/67 min= 0.26 gm per min.
  • 30.80 gm/117 min= 0.26 gm per min.
  • 50.75 gm/186 min=0.27 gm per min.
  • 87.53 gm/316 min= 0.28 gm per min.
  • 194.18 gm/674 min= 0.29 gm per min.

There is some variance, but the hypothesis holds up. Based upon this, let’s now calculate the 3D printing cost by time.
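Redoing the arithmetic from the list above takes only a few lines of Python:

```python
# (weight in grams, print time in minutes) from the cube test prints above
samples = [
    (9.10, 36),
    (17.48, 67),
    (30.80, 117),
    (50.75, 186),
    (87.53, 316),
    (194.18, 674),
]

# grams of filament consumed per minute of printing, for each sample
rates = [round(grams / minutes, 2) for grams, minutes in samples]
```

The six ratios cluster tightly between 0.25 and 0.29 g/min, which is the proportionality between weight and printing time that a time-based pricing scheme relies on.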

3D plastic filament from Makerbot costs $48 for ABS/PLA and $65 for the dissolvable kind per 0.90 kg (=2.00 lb) spool. That means the filament cost is $0.05 per gram for ABS/PLA and $0.07 per gram for the dissolvable filament. So, 3D filament costs 6 cents per gram on average.

Finalizing the Service Fee for 3D Printing

For an hour of 3D printing time, the amount of filament used would be 15.6 gm (=0.26 x 60 min). This gives us the filament cost of 94 cents per hour of 3D printing (=15.6 gm x 6 cents). So, for the cost-recovery of filament only, I get roughly $1 per hour of 3D printing time.
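The per-hour filament cost follows directly from the two measured figures:

```python
rate_g_per_min = 0.26    # filament use measured from the cube test prints
cost_per_gram = 0.06     # average filament cost across ABS/PLA and dissolvable

grams_per_hour = rate_g_per_min * 60                     # 15.6 g per hour
filament_cost_per_hour = grams_per_hour * cost_per_gram  # about $0.94 per hour
```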

Earlier, I mentioned that filament is only one of the cost-incurring factors for the 3D printing service. It’s time to bring in those other factors, such as hardware wear/tear, staff time, electricity, maintenance, etc., plus “no-charge-for-failed-print-policy,” which was adopted at our library. Those other factors will add an additional amount per 3D print job. And at my library, this came out to be about $2. (I will not go into details about how these have been determined because those will differ at each library.) So, the final service fee for our new 3D printing service was set to be $3 up to 1 hour of 3D printing + $1 per additional hour of 3D printing. The $3 is broken down to $1 per hour of 3D printing that accounts for the filament cost and $2 fixed cost for every 3D print job.
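The resulting fee rule can be sketched as a small function. How partial hours are handled is a local policy choice the post doesn't specify; rounding up to the next started hour is assumed here:

```python
import math

def printing_fee(minutes):
    """$3 covers up to the first hour of printing ($2 fixed cost per job
    plus roughly $1/hour of filament); each additional started hour adds $1."""
    hours = max(1, math.ceil(minutes / 60))
    return 3 + (hours - 1)
```

For example, a 45-minute job comes to $3 under this rule, while a two-and-a-half-hour job comes to $5.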

To help our users to quickly get an idea of how much their 3D print job will cost, we have added a feature to the HS/HSL 3D Print Job Submission Form online. This feature automatically calculates and displays the final cost based upon the printing time estimate that a user enters.

 

The HS/HSL 3D Print Job Submission form, University of Maryland, Baltimore

Don’t Be Afraid of Service Fees

I would like to emphasize that libraries should not be afraid to set service fees for new services. As long as they are easy to understand and the staff can explain the reasons behind those service fees, they should not be a deterrent to a library trying to introduce and provide a new innovative service.

There are clear benefits to running through all cost-incurring factors and communicating how the final pricing scheme was determined (including verifying the hypothesis that 3D printing time and the weight of the resulting model are proportional) to all library staff who will be involved in the new 3D printing service. If any library user inquires about or challenges the service fee, the staff will be able to provide a reasonable explanation on the spot.

I have implemented this pricing scheme at the same time as the launch of my library’s makerspace (the HS/HSL Innovation Space at the University of Maryland, Baltimore – http://www.hshsl.umaryland.edu/services/ispace/) back in April 2015. We have been providing 3D printing service and charging for it for more than two years. I am happy to report that during that entire duration, we have not received any complaint about the service fee. No library user expected our new 3D printing service to be free, and all comments that we received regarding the service fee were positive. Many expressed surprise at how cheap our 3D printing service is and thanked us for it.

To summarize, libraries should be willing to explore and offer new innovative services even when they require charging service fees. And if you do so, make sure that the resulting pricing scheme for the new service is (a) sustainable and accountable, (b) readily graspable by users, and (c) easily handled by the library staff who will handle the payment transaction. Good luck and happy 3D printing at your library!

An example model with the 3D printing cost and the filament info displayed at the HS/HSL, University of Maryland, Baltimore

ACRL TechConnect: How to Price 3D Printing Service Fees

planet code4lib - Mon, 2017-05-22 13:12

Many libraries today provide 3D printing service. But not all of them can afford to do so for free. While free 3D printing may be ideal, it can jeopardize the sustainability of the service over time. Nevertheless, many libraries tend to worry about charging service fees.

In this post, I will outline how I determined the pricing schema for our library’s new 3D Printing service in the hope that more libraries will consider offering 3D printing service if having to charge the fee is a factor stopping them. But let me begin with libraries’ general aversion to fees.

A 3D printer in action at the Health Sciences and Human Services Library (HS/HSL), Univ. of Maryland, Baltimore

Service Fees Are Not Your Enemy

Charging fees for the library’s service is not something librarians should regard as a taboo. We live in the times in which a library is being asked to create and provide more and more new and innovative services to help users successfully navigate the fast-changing information landscape. A makerspace and 3D printing are certainly one of those new and innovative services. But at many libraries, the operating budget is shrinking rather than increasing. So, the most obvious choice in this situation is to aim for cost-recovery.

It is to be remembered that even when a library aims for cost-recovery, it will be only partial cost-recovery because there is a lot of staff time and expertise that is spent on planning and operating such new services. Libraries should not be afraid to introduce new services requiring service fees because users will still benefit from those services often much more greatly than a commercial equivalent (if any). Think of service fees as your friend. Without them, you won’t be able to introduce and continue to provide a service that your users need. It is a business cost to be expected, and libraries will not make profit out of it (even if they try).

Still bothered? Almost every library charges for regular (paper) printing. Should a library rather not provide printing service because it cannot be offered for free? Library users certainly wouldn’t want that.

Determining Your Service Fees

What do you need in order to create a pricing scheme for your library’s 3D printing service?

(a) First, you need to list all cost-incurring factors. Those include (i) the equipment cost and wear and tear, (ii) electricity, (iii) staff time & expertise for support and maintenance, and (iv) any consumables such as 3d print filament, painter’s tape. Remember that your new 3D printer will not last forever and will need to be replaced by a new one in 3-5 years.

Also, some of these cost-incurring factors such as staff time and expertise for support is fixed per 3D print job. On the other hand, another cost-incurring factor, 3D print filament, for example, is a cost factor that increases in proportion to the size/density of a 3d model that is printed. That is, the larger and denser a 3d print model is, the more filament will be used incurring more cost.

(b) Second, make sure that your pricing scheme is readily understood by users. Does it quickly give users a rough idea of the cost before their 3D print job begins? An obscure pricing scheme can confuse users and may deter them from trying out a new service. That would be bad user experience.

Also in 3D printing, consider if you will also charge for a failed print. Perhaps you do. Perhaps you don’t. Maybe you want to charge a fee that is lower than a successful print. Whichever one you decide on, have that covered since failed prints will certainly happen.

(c) Lastly, the pricing scheme should be easily handled by the library staff. The more library staff will be involved in the entire process of a library patron using the 3D printing service from the beginning to the end, the more important this becomes. If the pricing scheme is difficult for the staff to work with when they need charge for and process each 3D print job, the new 3D printing service will increase their workload significantly.

Which staff will be responsible for which step of the new service? What are the exact tasks the staff will need to do? For example, several staff members at the circulation desk may need to learn and handle new tasks related to the 3D printing service, such as labeling and putting away completed 3D models, processing the payment transaction, delivering the model, and marking the status of a paid 3D print job as ‘completed’ in the 3D Printing Staff Admin Portal, if such a system is in place. Below is a screenshot of the HS/HSL 3D Printing Staff Admin Portal developed in-house by the library IT team.

The HS/HSL 3D Printing Staff Admin Portal, University of Maryland, Baltimore

Examples – 3D Printing Service Fees

It’s always helpful to see what other libraries are doing when you need to determine your own pricing scheme. Here are some examples showing how ten libraries’ 3D printing pricing schemes changed over the past three years.

  • UNR DeLaMare Library
    • https://guides.library.unr.edu/3dprinting
    • 2014 – $7.20 per cubic inch of modeling material (raised to $8.45 starting July, 2014).
    • 2017 – uPrint – Model Material: $4.95 per cubic inch (=16.38 gm=0.036 lb)
    • 2017 – uPrint – Support Materials: $7.75 per cubic inch
  • NCSU Hunt Library
    • https://www.lib.ncsu.edu/do/3d-printing
    • 2014- uPrint 3D Printer: $10 per cubic inch of material (ABS), with a $5 minimum
    • 2014 – MakerBot 3D Printer: $0.35 per gram of material (PLA), with a $5 minimum
    • 2017 – uPrint – $10 per cubic inch of material, $5 minimum
    • 2017 – F306 – $0.35 per gram of material, $5 minimum
  • Southern Illinois University Library
    • http://libguides.siue.edu/3D/request
    • 2014 – Originally $2 per hour of printing time; Reduced to $1 as the demand grew.
    • 2017 – Lulzbot Taz 5, Luzbot mini – $2.00 per hour of printing time.
  • BYU Library
  • University of Michigan Library
    • The Cube 3D printer checkout is no longer offered.
    • 2017 – Cost for professional 3d printing service; Open access 3d printing is free.
  • GVSU Library
  • University of Tennessee, Chattanooga Library
  • Port Washington Public library
  • Miami University
    • 2014 – $0.20 per gram of the finished print; 2017 – ?
  • UCLA Library, Dalhousie University Library (2014)
    • Free
Types of 3D Printing Service Fees

From the examples above, you will notice that many 3D printing service fee schemes are based upon the weight of a 3D-printed model. This is because these libraries are trying to recover the cost of the 3D filament, and the amount of filament used is most accurately reflected in the weight of the resulting 3D-printed model.

However, there are a few problems with a weight-based 3D printing pricing scheme. First, users cannot readily calculate the cost before the print job, because they would have to weigh a model that does not exist until it is 3D-printed. Second, once a model is printed, staff have to weigh it and calculate the cost. This is time-consuming and not very efficient.

For this reason, my library considered an alternative pricing scheme based on the size of a 3D model. The idea was that we would have roughly three sizes of empty box – small, medium, and large – with three different prices assigned. Whichever box a user’s 3D-printed object fit into would determine how much the user paid for it. This seemed like a great idea because, compared to the weight-based scheme, it makes it easy for both users and library staff to determine how much a model will cost to print.

Unfortunately, this size-based pricing scheme has a few significant flaws. First, a smaller model may use more filament than a larger model if it is denser (i.e., has a higher infill ratio). Second, depending on the shape of a model, a model that fits in a large box may use much less filament than one that fits in a small box. Think of a large tree model with thin branches, compared with a 100%-filled compact baseball model that fits into a smaller box than the tree does. Third, the print resolution, which determines the layer height, can change the amount of filament used even when the same model is printed.

Different infill ratios – Image from https://www.packtpub.com/sites/default/files/Article-Images/9888OS_02_22.png

Charging Based upon the 3D Printing Time

So we couldn’t go with the size-based pricing scheme, but we did not like the problems of the weight-based pricing scheme, either. As an alternative, we decided on a time-based pricing scheme: printing time is proportional to the amount of filament used, but it does not require staff to weigh each model. 3D-printing software gives an estimate of the printing time, and most 3D printers also display the actual printing time for each model printed.

First, we wanted to confirm the hypothesis that 3D printing time and the weight of the resulting model are proportional to each other. I tested this by translating the weight-based cost into a time-based cost, using the estimated printing time and estimated weight of several cube models. Here are the results I got using the MakerBot Replicator 2X.

  • 9.10 gm/36 min= 0.25 gm per min.
  • 17.48 gm/67 min= 0.26 gm per min.
  • 30.80 gm/117 min= 0.26 gm per min.
  • 50.75 gm/186 min=0.27 gm per min.
  • 87.53 gm/316 min= 0.28 gm per min.
  • 194.18 gm/674 min= 0.29 gm per min.
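As a quick sanity check, the ratios above can be recomputed directly from the measured (weight, time) pairs – a minimal sketch:

```python
# Measured (weight in gm, print time in minutes) pairs from the cube tests above.
measurements = [
    (9.10, 36),
    (17.48, 67),
    (30.80, 117),
    (50.75, 186),
    (87.53, 316),
    (194.18, 674),
]

# Each ratio is grams of filament per minute of printing.
ratios = [round(weight / minutes, 2) for weight, minutes in measurements]
print(ratios)  # [0.25, 0.26, 0.26, 0.27, 0.28, 0.29]
```

All six ratios cluster around 0.26 gm per minute, which is what makes a time-based fee a workable stand-in for a weight-based one.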

There is some variance, but the hypothesis holds up. Based upon this, let’s now calculate the 3D printing cost by time.

3D plastic filament from MakerBot costs $48 for ABS/PLA and $65 for the dissolvable type, per 0.90 kg (=2.00 lb) spool. That works out to roughly $0.05 per gram for ABS/PLA and $0.07 per gram for the dissolvable, so 3D filament costs about 6 cents per gram on average.

Finalizing the Service Fee for 3D Printing

For an hour of 3D printing time, the amount of filament used would be about 15.6 gm (=0.26 gm/min x 60 min). That gives a filament cost of about 94 cents per hour of 3D printing (=15.6 gm x 6 cents). So, for cost recovery of filament only, I get roughly $1 per hour of 3D printing time.
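The arithmetic in the last two paragraphs can be retraced in a few lines. The spool prices and the 0.26 gm/min rate are taken from the text above:

```python
# Spool prices from MakerBot ($ per 0.90 kg spool), as quoted in the article.
spool_grams = 900
cost_abs_pla = 48 / spool_grams      # ~ $0.053 per gram
cost_dissolvable = 65 / spool_grams  # ~ $0.072 per gram
avg_cost_per_gram = 0.06             # rounded average used in the article

# Filament consumed in one hour at the measured rate of 0.26 gm/min.
grams_per_hour = 0.26 * 60           # 15.6 gm
filament_cost_per_hour = grams_per_hour * avg_cost_per_gram
print(round(filament_cost_per_hour, 2))  # 0.94 -> roughly $1 per hour
```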

Earlier, I mentioned that filament is only one of the cost-incurring factors for the 3D printing service. It’s time to bring in the other factors, such as hardware wear and tear, staff time, electricity, maintenance, etc., plus the “no charge for failed prints” policy adopted at our library. These other factors add a fixed amount per 3D print job, and at my library this came out to about $2. (I will not go into details about how these were determined because they will differ at each library.) So, the final service fee for our new 3D printing service was set at $3 for up to 1 hour of 3D printing, plus $1 per additional hour. The $3 breaks down into $1 per hour of 3D printing to cover the filament cost and a $2 fixed cost for every 3D print job.

To help our users quickly get an idea of how much their 3D print job will cost, we added a feature to the HS/HSL 3D Print Job Submission Form online. This feature automatically calculates and displays the final cost based upon the printing-time estimate that a user enters.

 

The HS/HSL 3D Print Job Submission form, University of Maryland, Baltimore
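A fee calculator along the lines of the one built into the submission form might look like the sketch below. How partial additional hours are handled is my assumption (rounded up here); the article does not specify:

```python
import math

def print_job_fee(minutes):
    """Estimate the fee for a print job: $3 covers the first hour
    (that is, $1/hr filament cost plus a $2 fixed cost per job),
    and each additional started hour adds $1.
    Rounding partial additional hours up is an assumption."""
    if minutes <= 0:
        raise ValueError("printing time must be positive")
    extra_hours = max(0, math.ceil((minutes - 60) / 60))
    return 3 + 1 * extra_hours

print(print_job_fee(45))   # 3
print(print_job_fee(60))   # 3
print(print_job_fee(90))   # 4
print(print_job_fee(200))  # 6
```

A function like this is simple enough to embed in a web form, which matches the goal of letting users see the cost before submitting a job.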

Don’t Be Afraid of Service Fees

I would like to emphasize that libraries should not be afraid to set service fees for new services. As long as the fees are easy to understand and the staff can explain the reasons behind them, they should not deter a library from introducing a new, innovative service.

There are clear benefits to running through all cost-incurring factors and communicating how the final pricing scheme was determined (including the verification of the hypothesis that 3D printing time and the weight of the resulting model are proportional) to all library staff who will be involved in the new 3D printing service. If any library user inquires about or challenges the service fee, the staff will be able to provide a reasonable explanation on the spot.

I implemented this pricing scheme at the launch of my library’s makerspace (the HS/HSL Innovation Space at the University of Maryland, Baltimore – http://www.hshsl.umaryland.edu/services/ispace/) back in April 2015. We have been providing and charging for the 3D printing service for more than two years. I am happy to report that during that entire time, we have not received a single complaint about the service fee. No library user expected our new 3D printing service to be free, and all comments we received regarding the fee were positive. Many expressed surprise at how cheap our 3D printing service is and thanked us for it.

To summarize, libraries should be willing to explore and offer new, innovative services even when they require charging service fees. If you do so, make sure that the resulting pricing scheme for the new service is (a) sustainable and accountable, (b) readily graspable by users, and (c) easily handled by the library staff who will process the payment transactions. Good luck and happy 3D printing at your library!

An example model with the 3D printing cost and the filament info displayed at the HS/HSL, University of Maryland, Baltimore

Open Knowledge Foundation: OKI Agile: Picking/Designing a Methodology

planet code4lib - Mon, 2017-05-22 08:13

This is the second in a series of blogs on how we are using the Agile methodology at Open Knowledge International. Originating from software development, the Agile manifesto describes a set of principles that prioritise agility in work processes: for example through continuous development, self-organised teams with frequent interactions and quick responses to change (http://agilemanifesto.org). In this blogging series we go into the different ways Agile can be used to work better in teams and to create more efficiency in how to deliver projects. The first post dealt with user stories: this time we go into methodologies.

More efficiency in project delivery is the name of the game. Working together in teams or with other people requires us to put in place methods and methodologies to accomplish that: a shared set of methods that allows team members to walk into a project and start delivering as soon as possible.

Glossary
  • Method – A systematic procedure.
  • Methodology – A series of related methods or techniques.
  • Methodology size – Number of methods/techniques used in the methodology.
  • Methodology density – The amount of precision/checkpoints needed. Higher density means a higher-ceremony (more formal) methodology.
  • Criticality – The nature of the damage caused by undetected defects (the impact if we forget something). Higher criticality means worse impact.
What is a methodology

A methodology consists of 10 elements, of which one, the team values, permeates all of them. Let’s first put them in a pretty picture and then list them out with descriptions:

  • Team values – What the team strives for, how they’d like to communicate and work together. The values affect each element of the methodology so different values create different methodologies.
  • Roles – It’s best to think of these as the job descriptions you’d put into ads when you need to hire more staff.
  • Skills – The skills needed for the roles we need.
  • Team – This is the group of people that will tackle a project and what roles they have.
  • Tools – The tools people use either within a technique/method or to produce a deliverable according to the standard.
  • Techniques – The methods used to get work done (generate work product); these range from work-structuring techniques like breaking work into sprints (time blocks), to games played to achieve a specific output like planning poker (for estimates), to simple descriptions of what is done to make things happen, like “write a blog post”.
  • Activities – The meetings, reviews, milestones and other things people do or attend. The most obvious activity is “create deliverable” or something like that, but the more interesting activities are the events that take place.
  • Standards – Description of what is permitted and not permitted in the work product. These can be standards such as what programming language or security measures to use, how management is handled or how decisions get made (e.g. RASCI) and other project conventions.
  • Work Products – Not only the final product, this is also the internal products, i.e. what each person or team hands over to another person or team, something like user stories or mockups.
  • Quality – Often not considered explicitly but these are the rules and concerns that need to be tracked for each deliverable (work product). This could be a part of activities but it’s so important that it’s better to split it out.
Principles for picking/designing methodologies

There is no one size fits all methodology. There are methods one reuses between them (the shared set of methods/techniques people know about) and probably a default set one uses in absence of something more fitting. However, in general one needs to think about and pick/design methodologies based on two things:

  1. The project itself
  2. The number of people on the project team

The nature of the project calls for different methodologies, some projects can get away with a very lightweight methodology, where it won’t be the end of the world if something is forgotten, while others call for more publicly visible correctness where bad things happen if things are forgotten. For example, paying salaries requires a methodology with more visible checkpoints and correctness than responding to Tweets. People might lose their houses if they don’t get paid, but nobody will be out on the street for missing a possibility to retweet.

A lot changes based on the number of people involved. Communication becomes harder and the effectiveness of individuals decreases; as a result, the methodology must get bigger to tackle all of these issues:

Picking/designing a methodology has to be based on these four principles (they have to be kept in mind even though they aren’t always all achievable, notably number 4 in our case):

  1. Bigger teams call for a bigger methodology
  2. More critical projects call for more methodology density (publicly visible correctness)
  3. Cost comes with weight (a small increase in methodology adds a large amount of cost to the project)
  4. The most effective communication is face to face and interactive (everyone participates instead of just doing a broadcast).
Agile development

We have agreed on adopting the agile values (slightly adapted from the agile software values) as our team values where we can. That means we value:

  • Individuals and interactions over processes and tools
  • Working stuff over comprehensive documentation
  • Customer collaboration over contract negotiation
  • Responding to change over following a plan

We value everything that’s mentioned above, we just value the things on the left (the bold) more than the things on the right. There are also 12 principles we try to follow as much as we can:

  1. Our highest priority is to satisfy the customer through early and continuous delivery of valuable things.
  2. Welcome changing requirements, even late in development. Agile processes harness change for the customer’s competitive advantage.
  3. Deliver working stuff frequently, from a couple of weeks to a couple of months, with a preference to the shorter timescale.
  4. Business people and implementers must work together daily throughout the project.
  5. Build projects around motivated individuals. Give them the environment and support they need, and trust them to get the job done.
  6. The most efficient and effective method of conveying information to and within a team is face-to-face conversation.
  7. Working stuff is the primary measure of progress.
  8. Agile processes promote sustainable development. The sponsors, developers, and users should be able to maintain a constant pace indefinitely.
  9. Continuous attention to excellence and good design enhances agility.
  10. Simplicity–the art of maximizing the amount of work not done–is essential.
  11. The best architectures, requirements, and designs emerge from self-organizing teams.
  12. At regular intervals, the team reflects on how to become more effective, then tunes and adjusts its behavior accordingly.

As an encouragement and because you were just bombarded with a long list of things and a discussion about planning how we plan work, here’s something from XKCD to keep in mind as we dive into methodologies:

Jakob Voss: Wikidata documentation on the 2017 Hackathon in Vienna

planet code4lib - Sun, 2017-05-21 13:47

At Wikimedia Hackathon 2017, a couple of volunteers sat together to work on the help pages of Wikidata. As part of that Wikidata documentation sprint, Ziko and I took a look at the Wikidata glossary. We identified several shortcomings and made a list of rules for how the glossary should look. The result is the glossary guidelines. Where the old glossary partly replicated Wikidata:Introduction, the new version aims to allow quick lookup of concepts. We have already rewritten some entries of the glossary according to these guidelines, but several entries are outdated and still need to be improved. We changed the structure of the glossary into a sortable table so it can be displayed as an alphabetical list in all languages. The entries can still be translated with the translation system (it took some time to get familiar with this feature).

We also created some missing help pages such as Help:Wikimedia and Help:Wikibase to explain general concepts with regard to Wikidata. Some of these concepts are already explained elsewhere but Wikidata needs at least short introductions especially written for Wikidata users.

Image taken by Andrew Lih (CC-BY-SA)

Hugh Rundle: Small-Batch Memories of The Unready

planet code4lib - Sun, 2017-05-21 02:28

In 1016, Æthelred The Unready died, to be replaced as King of England by the invading Cnut the Great. In 1017, Cnut married Æthelred’s widow, Emma of Normandy, and divided the kingdom into the four Earldoms of Wessex, Mercia, East Anglia and Northumbria. A thousand years later, names like Æthelred and Cnut sound distinctly un-English, and the world is utterly changed. The English may still fear foreigners arriving from across the Channel, but the Vikings are long gone.

3017

Will anyone remember our stories in the year 3017, and will they too feel that we are like aliens, far removed from the reality in which they now live? Like Edward Shaddow, my mind takes a turn towards the post-apocalyptic when thinking about what life might be like a thousand years hence. I’m not optimistic about the future of humans, given our long history of dealing poorly with environmental degradation, and our utter failure to make any meaningful progress towards stopping or even slowing catastrophic climate change. But yesterday’s newCardigan Cardi Party with Cory Doctorow encouraged me to think that, perhaps, there might be some hope after all.

Cory talked about John Maynard Keynes’ prediction of a fifteen-hour working week by (* checks watch *) about now. His point was that Keynes was actually right - he simply didn’t predict that our desires would increase such that we are no longer satisfied with the lifestyle of a well-to-do 1930s European or American. So whilst I tend to imagine a kind of Tank Girl future of repression and chronic water wars, perhaps this is wide of the mark. Maybe the legacy of our times will be a warning about where hubris and individualism can take humanity, and our much wiser descendants will live full and satisfying lives, living mostly in harmony. Just without jetpacks or flying cars. The recent news that the Svalbard seed bank has already flooded as the warming climate causes permafrost to melt is yet another warning that for all our technological skills, there’s no guarantee that anything from our cultural and scientific storehouses will survive in 3017. And yet, cultural and scientific knowledge has passed down generations over long stretches of human time. Whether it’s Australian Aboriginal Star Maps, Japanese Tsunami Stones or simply family heirlooms, doing GLAM in a small and decentralised manner can be surprisingly effective.

Ubiquity and Abundance

The projects I find most exciting and intriguing - LOCKSS, IPFS, DIY squat archives, The Enspiral Network, the Internet itself - are all based on principles of decentralised networks, autonomy with connection, and local control. If I was going to wake up in a post-apocalyptic future, I’d want to be in the Enspiral kibbutz. In yesterday’s interview, Cory and Tom riffed on the idea of abundance, and the future Cory imagined in Walkaway where machines are always the best they can be, as opposed to today’s experience where “everyone has the eleventh-best drill in the world”. I like this idea, but it still feels a little bit too much like what we have today - not so much abundance, but rather ubiquity. A few years ago, every technology, business and economics writer was talking about the rise of personalisation. The idea was that with more advanced manufacturing techniques, the near future would be one where we would all order our own customised products and the mass-production system would somehow manage to personalise each item to an individual consumer’s tastes. To the small extent that has come to pass, it’s a hollow sort of personalisation. What has been much more evident, at least in my First World hipster bubble, has been the rise of a sort of anti-mass production movement. From coffee to gin, vegetables to soap, “small batch” and “hand crafted” are the thing. More interestingly, whilst personalised mass production assumed that people were focussed on themselves, increasingly people are moving in the opposite direction - wanting to know who grew their coffee beans and sewed their shirt, or what kind of life the cow had before it turned into steak. It’s a sort of “shopping literacy” I suppose.

When I think of abundance, I go back to what Keynes wrote about - what he imagined as a three-hour workday but I prefer to imagine as a two-day working week. A world full of tinkerers, artists, storytellers, obsessives and bullshit artists. Sure, we’ll still need surgeons, electricians, and science laboratories, but most people could live as dilettantes - spending the majority of their time working out how to grow the most exquisite orchid, build a faster bicycle, or paint the perfect sunset. Or perhaps they will create the most thorough index, the most detailed catalogue, or simply the greatest gin ever distilled.

Small batches and long notes

Is the future of GLAM one of small-batch culture and long notes about the creator of each artefact? Will there be songlines to guide travellers between the archives? The question of what galleries, libraries, archives, and museums will look like a thousand years from now perhaps shouldn’t make us think of crystal storage that requires supercomputers to actually read, or 3D-printed Roman ruins. The important thing isn’t really the technology used or even the physical artefacts that survive - it’s the stories and lessons that are passed on. Apart from anything else, most of our current institutions are likely to be under the warm, acidic sea in a thousand years. All that will be left is stories of the people of 2017. We probably should get cracking on the world being imagined into being by groups like Enspiral, Unmonastery and Open Source Ecology because if we don't, I have a feeling I know what our descendants might call us.

The Unready.

Jakob Voss: Introduction to Phabricator at Wikimedia Hackathon

planet code4lib - Sat, 2017-05-20 07:47

This weekend I am participating in the Wikimedia Hackathon in Vienna. I am mostly contributing to Wikidata-related events and practicing the phrase "long time no see", but I am also looking into some introductory talks.

In the late afternoon of day one I attended an introduction to the Phabricator project management tool given by André Klapper. Phabricator was introduced at the Wikimedia Foundation about three years ago to replace and unify Bugzilla and several other management tools.

Phabricator is much more than an issue tracker for software projects (although it is mainly used for this purpose by Wikimedia developers). In summary, there are tasks, projects, and teams. Tasks can be tagged, assigned, followed, discussed, and organized with milestones and workboards. The latter are Kanban boards like those I know from Trello, waffle, and GitHub project boards.

Phabricator is Open Source, so you can self-host it and add your own user management without having to pay for each new user and feature (I am looking at you, JIRA). Internally I would like to use Phabricator, but for fully open projects I don’t see enough benefit compared to using GitHub.

P.S.: Wikimedia Hackathon is also organized with Phabricator. There is also a task for blogging about the event.

Evergreen ILS: Evergreen 3.0 development update #6: feedback fest results

planet code4lib - Fri, 2017-05-19 23:39

Image from page 668 of “The American farmer. A complete agricultural library, with useful facts for the household, devoted to farming in all its departments and details” (1882). Image digitized by NCSU Libraries.

Since the previous update, another 30 patches have been committed to the master branch.

This was the week of the first feedback fest in the 3.0 release cycle. A total of 57 bugs were identified last week as having an active pull request but no signoff; an additional bug was added to the wiki page today. Of those 58 bugs, 43 received substantive feedback, and 17 of them had their patches merged.

I would like to acknowledge the following people who left feedback for fest bugs:

  • Galen Charlton
  • Jeff Davis
  • Bill Erickson
  • Jason Etheridge
  • Rogan Hamby
  • Kathy Lussier
  • Mike Rylander
  • Ben Shum
  • Jason Stephenson
  • Dan Wells

A special shout-out also goes to Andrea Neiman, who helped keep the fest’s wiki page up to date this week.

There was also a fair amount of activity outside of the feedback fest, including a number of bug reports filed by folks testing the web staff client.

Duck trivia

Ducks occasionally need to be rescued from Library of Congress buildings. To date, no duck has been known to request a book whose call number starts with QL696.A52.

Submissions

Updates on the progress to Evergreen 3.0 will be published every Friday until general release of 3.0.0. If you have material to contribute to the updates, please get it to Galen Charlton by Thursday morning.

Harvard Library Innovation Lab: LIL Talks: Seltzer!

planet code4lib - Fri, 2017-05-19 18:21

In this week’s LIL talk, Matt Phillips gave us an effervescent presentation on Seltzer, followed by a tasting.

We tasted

  • Perrier – minerally, slightly salty, big bubbles with medium intensity
  • Saratoga – varied bubble size, clean… Paul says that this reminds him of typical German seltzers
  • Poland Springs – soft, smooth, sweet and clean
  • Gerolsteiner – Minerally with low carbonation
  • Borjomi – Graphite, very minerally, small bubbles, funk

Of course, throughout the conversation, we discussed the potential for the bottles affecting our opinions. We agreed that for a truly objective comparison, we’d transfer the samples to generic containers.

Though our tech and law talks are always educational and fun, our carbonated water talk was a refreshing change.

LITA: Evaluating Databases: Prioritizing on a Shoestring

planet code4lib - Fri, 2017-05-19 15:00

Libraries have limited resources, and the portion afforded to electronic resources requires some savvy prioritization to meet patrons’ needs while sticking to budgets. Allocation of spending is a key issue for many libraries, and database subscriptions can cost thousands, even tens of thousands, of dollars. For smaller libraries, it’s possible to spend the electronic resources budget on a single amazing all-purpose database or piece together a collection from low-cost alternatives. What’s a librarian to do?

It’s important to note that there’s no right/wrong dichotomy in deciding which electronic resources are “best”; it’s always a matter of “best for the community”, i.e., the librarian’s rule of thumb: know thy service population. Does your library serve a population with a high unemployment rate? You may need to prioritize electronic resources focused on job training, skill-building, and resume writing. Are you situated in an area where students hang out after the school day? Consider electronic resources like educational games, homework helpers, and web-based tutoring. Are you nestled in the heart of an emerging tech boomtown? You might include resources on programming languages (reference sources, learning programs, etc).

Over the years, I’ve explored various sources – from my MLIS textbooks to library websites to blog posts – and here’s a list of preliminaries that I consider when I’m tasked with evaluating electronic resources for selection to serve my library’s community.

Content

In the same way I’d evaluate a print source, I consider the content of an electronic resource. Is it relevant to my community? What about scope – is the information comprehensive, or, if not, is it a good fit to fill gaps in depth on a topic of special interest to patrons in the community? Is it updated often with timely and reliable information? Does a database include full text content, abstracts, citations? Is there a print resource that’s more useful?

Functionality

The how is as important as the content of a resource (the what). I ask myself: how simple is it for a layperson to use? Is the interface user-friendly? Is the indexing accurate and thorough? What about search – how does the database handle truncation, search types, filters, alternate spellings, and search history? Is there a FAQ or tutorial to help users solve issues if they get stuck? Can they export and download materials? I’ve learned that these questions can be important in how valuable patrons find the resource to be. A database may contain the deepest, most broad content possible, but if users can’t find it, it’s not much use to them. Like the question of a tree making sound when it falls in an empty forest, we can’t answer the question of whether the content is useful if no one is there to witness it.

Technical Bits

Before digging deeper into authentication and content format, I have a list of technical odds and ends that I consider in the preliminary evaluation. Does the vendor provide IT support for when technical issues inevitably arise? What about staff training or tutorials so librarians can learn how best to assist patrons in using the resource, or teach classes on the database’s functionality? How do patrons access the database – some vendors may allow in-library access only, some may provide limited content in their licensed versions, and some may not be optimized for mobile; in my evaluation, the resource will need to be stellar in other ways if these limitations exist. There’s also the biggie: cost. I weigh the expected value against the cost of the resource in electronic versus print format, e.g., is the electronic version more timely, cheaper per use, or vastly easier to use?

Once an electronic resource is in use, I add a parameter or two in the annual evaluation process – such as whether a database generates enough use to warrant the expense; any patron feedback staff has received; how much librarian-patron interaction is required for users to engage with the resource effectively; and how often the resource crashes as well as how the vendor’s IT staff assists in resolving those inevitable issues that crop up. In the preliminary stages of electronic resource selection, I use content, function, and basic technical elements as the litmus. If a resource passes all of these tests, then a library can dig a level deeper to finalize its decision. I’ll discuss this next month in a follow-up post.

Do you have any pro-tips? What has been your experience in implementing databases at your library?

FOSS4Lib Upcoming Events: Fedora and Hydra Camp at Oxford

planet code4lib - Fri, 2017-05-19 13:51
Date: Monday, September 4, 2017 - 08:00 to Friday, September 8, 2017 - 17:00
Supports: Hydra, Fedora Repository

Last updated May 19, 2017. Created by Peter Murray on May 19, 2017.

Fedora and Hydra Camp at Oxford University

OCLC Dev Network: Implementing Continuous Integration

planet code4lib - Fri, 2017-05-19 13:00

Learn about implementing continuous integration as a development practice.

Cynthia Ng: Notes for BC SirsiDynix Users Group Meeting 2017

planet code4lib - Thu, 2017-05-18 22:54
We got a bunch of presentations from SD and a couple from libraries. COSUGI Updates (Rick Branham, VP, Pre-Sales Solutions; Steve Donoghue, Senior Library Relations Manager; Tom Walker, Executive Accounts Manager): striving to listen to what customers want, with branding/marketing focused on customers and libraries. Popular "add-on" products: Enterprise, MobileCirc, Visibility, eResource …

Harvard Library Innovation Lab: LIL Talks: A Small Study of Epic Proportions

planet code4lib - Thu, 2017-05-18 20:26

(This is a guest post by John Bowers, a student at Harvard College who is collaborating with us on the Entropy Project. John will be a Berktern here this summer.)

In last week’s LIL talk, team member and graduating senior Yunhan Xu shared some key findings from her prize-winning thesis “A Small Study of Epic Proportions: Toward a Statistical Reading of the Aeneid.” As an impressive entry into the evolving “digital humanities” literature, Yunhan’s thesis blended the empirical rigor of statistical analysis with storytelling and interpretive methods drawn from the study of classics.

The presentation dealt with four analytical methodologies applied in the thesis. For each, Yunhan offered a detailed overview of tools and key findings.

  1. Syntactic Analysis. Yunhan analyzed the relative frequencies with which different verb tenses and parts of speech occur across the Aeneid’s 12 books. Her results lent insight into the “shape” of the epic’s narrative, as well as its stylistic character in relation to other works.
  2. Sentiment Analysis. Yunhan used sentiment analysis tools to examine the Aeneid’s emotional arc, analyze the normative descriptive treatment of its heroes and villains, and differentiate—following more conventional classics scholarship—the tonality of its books.
  3. Topic Modeling. Here, Yunhan subjected existing bipartite and tripartite “partitionings” of the Aeneid to statistical inquiry. By applying sophisticated topic modeling techniques including Latent Dirichlet Allocation and Non-Negative Matrix Factorization, she made a compelling case for the tripartite interpretation. In doing so, she added a novel voice to a noteworthy debate in the classics community.
  4. Network Analysis. By leveraging statistical tools to analyze the coincidence of and interactions between the Aeneid’s many characters, Yunhan generated a number of compelling visualizations mapping narrative progression between books in terms of relationships.
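Yunhan's actual pipeline isn't reproduced here, but the two topic modeling techniques named above (LDA and NMF) can be sketched in a few lines with scikit-learn. The "books" below are placeholder English snippets standing in for the Latin text, and the component count is arbitrary:

```python
# Sketch only: placeholder texts stand in for the Aeneid's books.
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer
from sklearn.decomposition import LatentDirichletAllocation, NMF

books = [
    "arms and the man I sing who first from the shores of Troy",
    "all fell silent and kept their gaze fixed upon him intent",
    "after the gods saw fit to overthrow the power of Asia",
]

# LDA is typically fit on raw term counts...
counts = CountVectorizer(stop_words="english").fit_transform(books)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)

# ...while NMF is usually run on tf-idf weights.
tfidf = TfidfVectorizer(stop_words="english").fit_transform(books)
nmf = NMF(n_components=2, random_state=0).fit(tfidf)

# Each document gets a distribution (LDA) or loading (NMF) over topics;
# comparing how those loadings cluster across the 12 books is what lets
# one test bipartite vs. tripartite partitionings statistically.
doc_topics = lda.transform(counts)
print(doc_topics.shape)
```

The interesting analytical step, of course, is not fitting the models but interpreting whether the per-book topic loadings group into two or three coherent sections.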


In the closing minutes of her presentation, Yunhan reflected on the broader implications of the digital humanities for the study of classics. While some scholars remain skeptical of the digital humanities, Yunhan sees enormous potential for collaboration and coevolution between the new way and the old.
