Feed aggregator

William Denton: CC-BY

planet code4lib - 8 hours 39 min ago

I’ve changed the license on my content to CC-BY: Creative Commons Attribution 4.0.

District Dispatch: Last week in appropriations

planet code4lib - Tue, 2016-05-24 19:41

The Appropriations process in Congress is a year-long cycle with fits and starts, and includes plenty of lobbying, grassroots appeals, lobby days, speeches, hearings and markups, and even creative promotions designed to draw attention to the importance of one program or another. ALA members and the Office of Government Relations continue to play a significant role in this process. Recently, for example, we’ve worked to support funding for major library programs like LSTA and IAL, as well as to address policy issues that arise in Congressional deliberations. Your grassroots voice helps amplify my message in meetings with Congressional staff.

The House and Senate Appropriations Committees have begun to move their FY2017 funding bills through the subcommittee and full committee process that carries the various spending measures to the Floor and then to the President’s desk. Last week was a big week for appropriations on Capitol Hill and I was back-and-forth to various Congressional hearings, meetings, and events. Here are a few of last week’s highlights:

Source: csp_iqoncept

Tuesday – There’s another word for that    

The full House Appropriations Committee convened (in a type of meeting called a “markup”) to discuss, amend and vote on two spending bills: those for the Department of Defense and the Legislative Branch. A recent proposed change to Library of Congress (LC) cataloging terminology, having nothing to do with funding at all, was the focus of action on the Legislative Branch bill. Earlier in April, Subcommittee Chair Tom Graves (R-GA14) successfully included instructions to the Library in a report accompanying the bill that would prohibit the LC from modernizing the outdated, and derogatory, terms “illegal aliens” and “aliens.”

An amendment was offered during Tuesday’s full Committee meeting by Congresswoman Debbie Wasserman Schultz (D-FL23) that would have removed this language from the report (a position strongly and actively supported by ALA and highlighted during National Library Legislative Day). The amendment generated extensive discussion, including vague references by one Republican to “outside groups” (presumably ALA) that were attempting to influence the process (influence the process? in Washington? shocking!).

The final roll call vote turned out to be a nail biter as ultimately four Committee Republicans broke with the Subcommittee chairman to support the amendment. Many in the room, myself included, thought the amendment might have passed and an audible gasp from the audience was heard upon announcement that it had failed by just one vote (24 – 25). Unfortunately, two Committee Democrats whose votes could have carried the amendment were not able to attend. The Legislative Branch spending bill now heads to the Floor and another possible attempt to pass the Wasserman Schultz amendment …. or potentially to keep the bill from coming up at all.

Wednesday – Can you hear me now? Good.

In Congress, sometimes the action occurs outside the Committee rooms. It’s not uncommon, therefore, for advocates and their congressional supporters to mount a public event to ratchet up the pressure on the House and Senate. ALA has been an active partner in a coalition seeking full funding for Title IV, Part A of the Every Student Succeeds Act. On Wednesday, I participated in one such creative endeavor: a rally on the lawn of the US Capitol complete with high school choir, comments from supportive Members of Congress, and “testimonials” from individuals benefited by Title IV funding.

This program gives school districts the flexibility to invest in student health and safety, academic enrichment, and education technology programs. With intimate knowledge of the entire school campus, libraries are uniquely positioned to assist in determining local needs for block grants, and for identifying needs within departments, grade levels, and divisions within a school or district. Congress authorized Title IV in the ESSA at $1.65 billion for FY17; however, the President’s budget requests only about one third of that necessary level.

The cloudy weather threatened — but happily did not deliver — rain and the event came off successfully. Did Congress hear us? Well, our permit allowed the use of amplified speakers, so I’d say definitely yes!

Thursday – A quick vote before lunch

On Thursday, just two days after House Appropriators’ nail biter of a vote over Legislative Branch Appropriations, the full Senate Appropriations Committee took up their version of that spending bill in addition to Agriculture Appropriations. For a Washington wonk, a Senate Appropriations Committee hearing is a relatively epic thing to behold. Each Senator enters the room trailed by two to four staffers carrying reams of paper. Throughout the hearing, staffers busily whisper among themselves, and into the ears of their Senators (late-breaking news that will net an extra $10 million for some pet project, perhaps?).

While a repeat of Tuesday’s House fracas wasn’t at all anticipated (ALA had worked ahead of time to blunt any effort to adopt the House’s controversial Library of Congress provision in the Senate), I did wonder whether there had been a last minute script change when the Chairman took up the Agriculture bill first and out of order based on the printed agenda for the meeting. After listening to numerous amendments addressing such important issues as Alaska salmon, horse slaughter for human consumption (yuck?), and medicine measurement, I was definitely ready for the Legislative Branch Appropriations bill to make its appearance. As I intently scanned the room for any telltale signs of soon-to-be-volcanic controversy, the Committee Chairman brought up the bill, quickly determined that no Senator had any amendment to offer, said a few congratulatory words, successfully called for a voice vote and gaveled the bill closed.

Elapsed time, about 3 minutes! I was unexpectedly free for lunch…and, for some reason, craving Alaska salmon.

Epilogue – The train keeps a rollin’

This week’s activity by the Appropriations Committees of both chambers demonstrates that the leaders of Congress’ Republican majority are deliberately moving the Appropriations process forward. Indeed, in the House and Senate they have promised to bring all twelve funding bills to the floor of both chambers on time…something not done since 1994. Sadly, however, staffers on both sides of the aisle tell me that they expect the process to stall at some point. If that happens, once again Congress will need to pass one or more “Continuing Resolutions” (or CRs) after October 1 to keep the government operating. One thing is certain; there is lots of work to be done this summer to defend library funding and policies.

The post Last week in appropriations appeared first on District Dispatch.

District Dispatch: Judiciary Committee Senators face historic “E-Privacy” protection vote

planet code4lib - Tue, 2016-05-24 17:55

More good news could be in the offing for reform of ECPA, the Electronic Communications Privacy Act. Senate Judiciary Committee Chairman Charles Grassley (R-IA) recently (and pleasantly) surprised reform proponents by calendaring a Committee vote on the issue now likely to take place this coming Thursday morning, May 26th.  The Committee, it is hoped, will take up and pass H.R. 699, the Email Privacy Act, which was unanimously approved by the House of Representatives, as reported in District Dispatch, barely three weeks ago.  (A similar but not identical Senate bill co-authored by Judiciary Committee Ranking Member Patrick Leahy [D-VT], S. 356, also could be called up and acted upon.)

Source: www.searchquarry.com

Either bill finally would update ECPA in the way most glaringly needed: to virtually always require the government to get a standard, judicially-approved search warrant based upon probable cause to acquire the full content of an individual’s emails, texts, tweets, cloud-based files or other electronic communications. No matter which is considered, however, there remains a significant risk that, on Thursday, the bill’s opponents will try to dramatically weaken that core reform by exempting certain agencies (like the IRS and SEC) from the new warrant requirement, and/or by providing dangerous exceptions to law enforcement and security agencies acting in overbroadly defined “emergency” circumstances.

Earlier today, ALA joined a new joint letter signed by nearly 65 of its public and private sector coalition partners calling on Senators Grassley and Leahy to take up and pass H.R. 699 as approved by the House: in other words, “without any [such] amendments that would weaken the protections afforded by the bill” ultimately approved by 419 of the 435 House Members.

Now is the time to tell the Members of the Senate Judiciary Committee that almost 30 years has been much too long to wait for real ECPA reform. Please go to ALA’s Legislative Action Center to email your Senate Judiciary Committee Senators now!

The post Judiciary Committee Senators face historic “E-Privacy” protection vote appeared first on District Dispatch.

SearchHub: Welcome Jeff Depa!

planet code4lib - Tue, 2016-05-24 17:30

We’re happy to announce another new addition to the Lucidworks team! Please welcome Jeff Depa, our new Senior Vice President of Worldwide Field Operations, who joined in May 2015 (full press release: Lucidworks Appoints Search Veterans to Senior Team).

Jeff will lead the company’s day-to-day field operations, including its rapidly growing sales, alliances and channels, systems engineering and professional services business. Prior to Lucidworks, Jeff spent over 17 years in leadership positions across sales, consulting, and systems engineering with companies such as Oracle, Sun, and most recently DataStax.

Jeff earned a B.S. in Biomedical Engineering from Case Western Reserve University and also holds a Master’s in Management. Aside from a passion for enabling clients to unleash the power of their data, Jeff is an avid pilot and enjoys spending time with his family in Austin, TX.

We sat down with Jeff to learn more about his passion for search:

What attracted you to Lucidworks?

Lucidworks is at the forefront of unleashing the value hidden in the massive amount of data companies have collected across disparate systems. They have done a phenomenal job in driving the adoption of Apache Solr, but more importantly, building a platform in Fusion that allows enterprises from high-volume ecommerce shops to healthcare to easily adopt and deploy a search solution that goes beyond the industry standard, and really focuses on providing the right information at the right time with unique relevancy and machine learning technologies.

What will you be working on at Lucidworks?

I’ll be focused on building on top of a solid foundation as we continue to drive the adoption of Fusion in the market and expand our team to capture the market opportunity with our customers and partners. I’m excited to be part of this journey.

Where do you think the greatest opportunities lie for companies like Lucidworks?

In today’s economy, value is driven from creating a unique, personalized and real-time experience for customers and employees. Lucidworks sits squarely in the middle of an enterprise’s disparate and rapidly evolving data sources and enables the transformation of data to information that can be used to improve the user experience. The ability to tie that information to a high-impact customer result is a huge opportunity for Lucidworks.

Welcome to the team Jeff!

The post Welcome Jeff Depa! appeared first on Lucidworks.com.

LITA: Mindful Tech, a 2 part webinar series with David Levy

planet code4lib - Tue, 2016-05-24 15:09

Mindful Tech: Establishing a Healthier and More Effective Relationship with Our Digital Devices and Apps
Tuesdays, June 7 and 14, 2016, 1:00 – 2:30 pm Central Time
David Levy, Information School, University of Washington

Register Now for this 2 part webinar

“There is a long history of people worrying and complaining about new technologies and also putting them up on a pedestal as the answer. When the telegraph and telephone came along you had people arguing both sides—that’s not new. And you had people worrying about the explosion of books after the rise of the printing press.

What is different is for the last 100-plus years the industrialization of Western society has been devoted to a more, faster, better philosophy that has accelerated our entire economic system and squeezed out anything that is not essential.

As a society, I think we’re beginning to recognize this imbalance, and we’re in a position to ask questions like “How do we live a more balanced life in the fast world? How do we achieve adequate forms of slow practice?”

David Levy – See more at: http://tricycle.org/trikedaily/mindful-tech/

Don’t miss the opportunity to participate in this well-known program by David Levy, based on his recent, widely reviewed and well-regarded book “Mindful Tech”. The popular interactive program, with its exercises and participation, has been re-packaged into a 2 part webinar format. Both parts will be recorded so that participants can return to them or work around varying schedules.

Register Now for the 2 part Mindful Tech webinar series

This two-part webinar series (90 minutes each) will introduce participants to some of the central insights of the work Levy has been doing over the past decade and more. By learning to pay attention to their immediate experience (what’s going on in their minds and bodies) while they’re online, people are able to see more clearly what’s working well for them and what isn’t, and, based on these observations, to develop personal guidelines that allow them to operate more effectively and healthfully. Levy will demonstrate this work by giving participants exercises they can do, both during the online program and between the sessions.

Presenter

David Levy

David M. Levy is a professor at the Information School of the University of Washington. For more than a decade, he has been exploring, via research and teaching, how we can establish a more balanced relationship with our digital devices and apps. He has given many lectures and workshops on this topic, and in January 2016 published a book on the subject, “Mindful Tech: How to Bring Balance to Our Digital Lives” (Yale). Levy is also the author of “Scrolling Forward: Making Sense of Documents in the Digital Age” (rev. ed. 2016).

Additional information is available on his website at: http://dmlevy.ischool.uw.edu/

Then register for the webinar and get Full details

Can’t make the dates but still want to join in? Registered participants will have access to both parts of the recorded webinars.

Cost:

  • LITA Member: $68
  • Non-Member: $155
  • Group: $300

Registration Information

Register Online page arranged by session date (login required)
OR
Mail or fax form to ALA Registration
OR
Call 1-800-545-2433 and press 5
OR
email registration@ala.org

Questions or Comments?

For all other questions or comments related to the webinar, contact LITA at (312) 280-4269 or Mark Beatty, mbeatty@ala.org.

Islandora: iCampBC - Instructors Announced!

planet code4lib - Tue, 2016-05-24 13:59

Islandora Camp is going back to Vancouver from July 18 - 20, courtesy of our wonderful hosts at the British Columbia Electronic Library Network. Camp will (as usual) consist of three days: One day of sessions taking a big-picture view of the project and where it's headed, one day of hands-on workshops for developers and front-end administrators, and one day of community presentations and deeper dives into Islandora tools and sites. The instructors for that second day have been selected and we are pleased to introduce them:

Developers

Mark Jordan has taught at two other Islandora Camps and at the Islandora Conference. He is the developer of Islandora Context, Islandora Themekey, Islandora Datastream CRUD, and the XML Solution Pack, and is one of the co-developers of the Move to Islandora Kit. He is also an Islandora committer and is currently serving as Chair of the Islandora Foundation Board. His day job is as Head of Library Systems at Simon Fraser University.

Rosie Le Faive started with Islandora in 2012 while creating a trilingual digital library for the Commission for Environmental Cooperation. With experience and - dare she say - wisdom gained from creating highly customized sites, she's now interested in improving the core Islandora code so that everyone can use it. Her interests are in mapping relationships between objects, and intuitive UI design. She is the Digital Infrastructure and Discovery Librarian at UPEI, and develops for Agile Humanities.

Admins

Melissa Anez has been working with Islandora since 2012 and has been the Community and Project Manager of the Islandora Foundation since it was founded in 2013. She has been a frequent instructor in the Admin Track and developed much of the curriculum, refining it with each new Camp.

Janice Banser is the Systems Librarian at Simon Fraser University.  She has been working with Islandora, specifically the admin interface, for over a year now. She is a member of the Islandora Documentation Interest Group and has contributed to the last two Islandora releases. She has been working with Drupal for about 6 years and has been a librarian since 2005.

Patrick Hochstenbach: Crosshatching with my fountain pen

planet code4lib - Tue, 2016-05-24 04:34
Filed under: portaits Tagged: crosshatch, fountain pen, ink, paper, portrait, sktchy, twsbi

Terry Reese: MarcEdit Update

planet code4lib - Tue, 2016-05-24 01:39

Yesterday, I posted a significant update to the Windows/Linux builds and a maintenance update to the Mac build that includes a lot of prep work to get it ready to roll in a number of changes that I’ll hopefully complete this week.  Unfortunately, I’ve been doing a lot of travelling, which means that my access to my mac setup has been pretty limited and I didn’t want to take another week getting everything synched together. 

So what are the specific changes:

ILS Integrations
I’ve been spending a lot of time over the past three weeks, head down, working on ILS integrations.  Right now, I’m managing two ILS integration scenarios – one is with Alma and their API.  I’m probably 80% finished with that work.  Right now, all the code is written; I’m just not getting back the expected responses from their bibliographic update API.  Once I sort out that issue, I’ll integrate this change into MarcEdit and provide a YouTube video demonstrating the functionality. 

The other ILS integration that I’ve been accommodating is working with MarcEdit’s MARC SQL Explorer and the internal database structure.  This work builds on some work being done with the Validate Headings tool to close the authority control loop.  I’ll likely be posting more about that later this week, as I currently have a couple of libraries testing this functionality to make sure I’ve not missed anything.  Once they give me the thumbs up, this will make its way into the MarcEditor as well. 

But as part of this work, I needed to create a way for users to edit and search the local database structure in a more friendly way.  So, leveraging the ILS platform, I’ve included the ability for users to work with the local database format directly within the MarcEditor.  You can see how this works here (https://www.youtube.com/watch?v=dMJ_pUxyoFc&feature=youtu.be): Integrating the MarcEditor with a local SQL store.  I’m not sure what the ideal use case is for this functionality – but over the past couple of weeks, it had been requested by a couple of power users currently using the MARC SQL Explorer for some data edits, but hoping for an easier-to-use interface.  This work will be integrated into the Mac MarcEdit version at the end of this week.  All the prep work (window/control development) has been completed.  At this point, it’s just migrating the code so that it works within the Mac’s Objective-C codebase.

Edit Shortcuts
I created two new edit shortcuts in the MarcEditor.  The first, Find Records With Duplicate Tags, was created to help users look for records that may have multiple instances of a tag or a tag/subfield combination within a set of records.  This is work that can be done in the Extract Selected Records tool, but it requires a bit of trickery and knowledge of how MarcEdit formats data. 

How does this work – say you wanted to know which records had multiple call numbers (050) fields in a record.  You would select this option, enter 050 in the prompt, and then the tool would create for you a jump list showing all the records that met your criteria. 

Convert To Decimal Degrees
The second Edit ShortCut function is the first math function (I’ll be adding two more, specifically around finding records with dates greater than or less than a specific value), targeting the conversion of Degrees/Minutes/Seconds to decimal degrees.  The process has been created to be MARC agnostic, so users can specify the field and subfields to process.  To run this function, select it from the Edit Shortcuts as demonstrated in the screenshot below:

When selected, you will get the following prompt:

This documents the format for defining the field/subfields to be processed.  Please note, it is important to define all four potential values for conversion – even if they are not used within the record set. 

Using this function, you can now convert a value like:
=034  1\$aa$b1450000$dW1250000$eW1163500$fN0461500$gN0420000
To:
=034  1\$aa$b1450000$d+125.0000$e+116.5800$f+046.2500$g+042.0000

This function should allow users to transition their cartographic data to a format that is much more friendly to geographic interpretation if desired.
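The underlying arithmetic is straightforward. Here is a minimal Python sketch of the DMS-to-decimal conversion for hemisphere-prefixed values like those above (the function name is my own, and the sign convention — negative for W and S — is the common geographic one; note that MarcEdit’s sample output above formats its own signs, so this is an illustration of the math, not of MarcEdit’s code):

```python
def dms_to_decimal(coord):
    """Convert a hemisphere-prefixed DMS string such as 'W1250000'
    (hemisphere letter + DDDMMSS) to signed decimal degrees."""
    hemi, digits = coord[0], coord[1:]
    degrees = int(digits[0:3])
    minutes = int(digits[3:5])
    seconds = int(digits[5:7])
    value = degrees + minutes / 60 + seconds / 3600
    # Common convention: west longitudes and south latitudes are negative.
    if hemi in ("W", "S"):
        value = -value
    return round(value, 4)
```

For example, `dms_to_decimal("N0461500")` (46°15′00″ N) yields 46.25.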

Bug Fixes:
This update also addressed a bug in the Build New Field parser.  If you have multiple arguments side-by-side within the same field grouping (i.e., {100$a}{100$b}{100$c}), the parser can become confused.  This has been corrected.
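As a hypothetical illustration (this is not MarcEdit’s actual parser), adjacent {tag$subfield} argument groups of this kind can be tokenized unambiguously by matching the braces explicitly, so side-by-side groups never run together:

```python
import re

# Each argument group is a three-digit tag plus one subfield code;
# anchoring on the braces keeps adjacent groups separate.
ARG_GROUP = re.compile(r"\{(\d{3})\$([a-z0-9])\}")

def parse_arg_groups(template):
    """Return (tag, subfield) pairs for every {tag$subfield} group."""
    return ARG_GROUP.findall(template)
```

Running it on the problem case, `parse_arg_groups("{100$a}{100$b}{100$c}")` returns three distinct pairs.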

Updates:
Included an update to the linked data rules file, updating the 7xx fields to include the $t in processing.  Also updated the UNIMARC translation to include a 1:1 translation for 9xx data.

Over the next week, I hope to complete the Alma integration, but will focus the development work in my free time on getting the Mac version synched with these changes.

–tr

DuraSpace News: Sandy Payette to Speak at 2016 VIVO Conference

planet code4lib - Tue, 2016-05-24 00:00

From the VIVO 2016 Planning Committee

Register today to attend the 2016 VIVO conference and hear from leading experts within our community.

DuraSpace News: Find out What’s Inside Hydra-in-a-Box at Open Repositories 2016: PCDM, Design, Emerging Architecture, Repository Tooling

planet code4lib - Tue, 2016-05-24 00:00

Austin, TX – It’s only three weeks away! If you will attend the 11th Annual International Conference on Open Repositories (#OR2016), here are the sessions that will be of interest if you want to learn more about the Hydra-in-a-Box project:

Workshop: Modeling your Repository Objects with the Portland Common Data Model (PCDM)

Monday, June 13, 1:30-3:30 PM; 4:00-6:00 PM

Eric Hellman: 97% of Research Library Searches Leak Privacy... and Other Disappointing Statistics.

planet code4lib - Mon, 2016-05-23 20:18

...But first, some good news. Among the 123 members of the Association of Research Libraries, there are four libraries with almost secure search services that don't send clickstream data to Amazon, Google, or any advertising network. Let's now sing the praises of libraries at Southern Illinois University, University of Louisville, University of Maryland, and University of New Mexico for their commendable attention to the privacy of their users. And it's no fault of their own that they're not fully secure. SIU fails to earn a green lock badge because of mixed content issues in the CARLI service; while Louisville, Maryland and New Mexico miss out on green locks because of the weak cipher suite used by OCLC on their Worldcat Local installations. These are relatively minor issues that are likely to get addressed without much drama.

Over the weekend, I decided to try to quantify the extent of privacy leakage in public-facing library services by studying the search services of the 123 ARL libraries. These are the best funded and most prestigious libraries in North America, and we should expect them to positively represent libraries. I went to each library's on-line search facility and did a search for a book whose title might suggest to an advertiser that I might be pregnant. (I'm not!) I checked to see whether the default search linked to by the library's home page (as listed on the ARL website) was delivered over a secure connection (HTTPS). I checked for privacy leakage of referer headers from cover images by using Chrome developer tools (the sources tab). I used Ghostery to see if the library's online search used Google Analytics or not. I also noted whether advertising network "web beacons" were placed by the search session.
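The first of those checks is easy to reproduce. Here is a crude sketch of the HTTPS test (the helper name is my own, not the author’s actual tooling; as the results below show, mixed content and weak cipher suites require a deeper audit than a scheme check):

```python
from urllib.parse import urlparse

def default_search_is_https(search_url):
    """First-pass check: is the library's default search page even
    served over HTTPS? (Mixed content and weak cipher suites need a
    deeper scan, e.g. a TLS analyzer.)"""
    return urlparse(search_url).scheme == "https"
```

A catalog at `http://catalog.example.edu/search` fails this check outright; one at `https://catalog.example.edu/search` passes it but may still leak via mixed content.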

72% of the ARL libraries let Google look over the shoulder of every click by every user, by virtue of the pervasive use of Google Analytics. Given the commitment to reader privacy embodied by the American Library Association's code of ethics, I'm surprised this is not more controversial. ALA even sponsors workshops on "Getting Started with Google Analytics". To paraphrase privacy advocate and educator Dorothea Salo, the code of ethics does not say:
We protect each library user's right to privacy and confidentiality with respect to information sought or received and resources consulted, borrowed, acquired or transmitted, except for Google Analytics.

While it's true that Google has a huge stake in maintaining the trust of users in their handling of personal information, and people seem to trust Google with their most intimate secrets, it's also true that Google's privacy policy puts almost no constraints on what Google (itself) can do with the information they collect. They offer strong commitments not to share personally identifiable information with other entities, but they are free to keep and use personally identifiable information. Google can associate Analytics-tracked library searches with personally identifiable information for any user that has a Google account; libraries cannot be under the illusion that they are uninvolved with this data collection if they benefit from Google Analytics. (Full disclosure: many of the web sites I administer also use Google Analytics.)

80% of the ARL libraries provide their default discovery tools to users without the benefit of a secure connection. This means that any network provider in the path between the library and the user can read and alter the query, and the results returned to the user. It also means that when a user accesses the library over public wifi, such as in a coffee shop, the user's clicks are available for everyone else in the coffee shop to look at, and potentially to tamper with. (The Digital Library Privacy Pledge is not having the effect we had hoped for, at least not yet.)

28% of ARL libraries enrich their catalog displays with cover images sourced from Amazon.com. Because of privacy leakage in referer headers, this means that a user's searches for library books are available for use by Amazon when Amazon wants to sell that user something. It's not clear that libraries realize this is happening, or whether they just don't realize that their catalog enrichment service uses cover images sourced by Amazon.
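This kind of leakage can be audited mechanically: any third-party <img> source on a catalog results page will, absent a referrer policy, send the page URL (including the user’s search terms) in the Referer header. A minimal sketch using Python’s standard-library HTML parser (class name and hostnames are illustrative assumptions, not the survey’s actual tooling):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class ThirdPartyImageAudit(HTMLParser):
    """Collect <img> sources served from hosts other than the page's
    own host; each such request can leak the page URL via Referer."""
    def __init__(self, page_host):
        super().__init__()
        self.page_host = page_host
        self.leaks = []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        src = dict(attrs).get("src", "")
        host = urlparse(src).netloc
        if host and host != self.page_host:
            self.leaks.append(src)

# Illustrative catalog-page fragment with a remotely hosted cover image.
audit = ThirdPartyImageAudit("catalog.example.edu")
audit.feed('<img src="https://covers.example-cdn.com/cover.jpg">')
```

After `feed()`, `audit.leaks` lists every cross-host image URL, each of which represents a request that carries the search page’s URL as its referrer.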

13% of ARL libraries help advertisers (other than Google) target their ads by allowing web beacons to be placed on their catalog web pages. Whether the beacons are from Facebook, DoubleClick, AddThis or Sharethis, advertisers track individual users, often in a personally identifiable way. Searches on these library catalogs are available to the ad networks to maximize the value of advertising placed throughout their networks.

Much of the privacy leakage I found in my survey occurs beyond the control of librarians. There are IT departments, vendor-provided services, and incumbent bureaucracies involved. Important library services appear to be unavailable in secure versions. But specific, serious privacy leakage problems that I've discussed with product managers and CTOs of library automation vendors have gone unfixed for more than a year. I'm getting tired of it.

The results of my quick survey for each of the 123 ARL libraries are available as a Google Sheet. There are bound to be a few errors, and I'd love to be able to make changes as privacy leaks get plugged and websites become secure, so feel free to leave a comment.

LITA: LITA announces the Top Tech Trends panel at ALA Annual 2016

planet code4lib - Mon, 2016-05-23 19:34

Kicking off LITA’s celebration of its 50th year, the Top Technology Trends Committee announces the panel for the highly popular session at 2016 ALA Annual in Orlando, FL.

Top Tech Trends
starts Sunday June 26, 2016, 1:00 pm – 2:30 pm, in the
Orange County Convention Center, Room W109B
and kicks off Sunday Afternoon with LITA.

This program features the ongoing roundtable discussion about trends and advances in library technology by a panel of LITA technology experts. The panelists will describe changes and advances in technology that they see having an impact on the library world, and suggest what libraries might do to take advantage of these trends. This year’s panelist lineup is:

  • Maurice Coleman, Session Moderator, Technical Trainer, Harford County Public Library, @baldgeekinmd
  • Blake Carver, Systems Administrator, LYRASIS, @blakesterz
  • Lauren Comito, Job and Business Academy Manager, Queens Library, @librariancraftr
  • Laura Costello, Head of Research & Emerging Technologies, Stony Brook University, @lacreads
  • Carolyn Coulter, Director, PrairieCat Library Consortium, Reaching Across Illinois Library System (RAILS), @ccoulter
  • Nick Grove, Digital Services Librarian, Meridian Library District – unBound, @nickgrove15

Check out the Top Tech Trends web site for more information and panelist biographies.

Safiya Noble

Followed by the LITA Awards Presentation & LITA President’s Program with Dr. Safiya Noble
presenting: Toward an Ethic of Social Justice in Information
at 3:00 pm – 4:00 pm, in the same location

Dr. Noble is an Assistant Professor in the Department of Information Studies in the Graduate School of Education and Information Studies at UCLA. She conducts research in socio-cultural informatics; including feminist, historical and political-economic perspectives on computing platforms and software in the public interest. Her research is at the intersection of culture and technology in the design and use of applications on the Internet.

Concluding with the LITA Happy Hour
from 5:30 pm – 8:00 pm
that location to be determined

This year marks a special LITA Happy Hour as we kick off the celebration of LITA’s 50th anniversary. Make sure you join the LITA Membership Development Committee and LITA members from around the country for networking, good cheer, and great fun! Expect lively conversation and excellent drinks; cash bar. Help us cheer for 50 years of library technology.

 

Open Knowledge Foundation: Open Knowledge International – our new name!

planet code4lib - Mon, 2016-05-23 15:55

Notice something a little different? We have had a change of name!

As of today, we officially move from being called “Open Knowledge” to “Open Knowledge International (OKI)”.

“Open Knowledge International” is the name by which the community groups have referred to us for a couple of years, conveying our role in supporting the groups around the world, as well as our role within the broader open knowledge movement globally. We are excited to announce our new name that reflects this.

Open Knowledge International is registered in the UK, and this has sometimes led to assumptions that we operate in and for the benefit of this region. However, the UK is no more of a priority to Open Knowledge International than other areas of the world; in fact, we want to look more closely at ways we can be engaged at a global level, where efforts to push open knowledge are already happening and where we can make a difference by joining alongside the people making it happen. This is evident in our efforts to support the associated Open Knowledge Network, with a presence in more than 40 countries and cross-border Working Groups, as well as our support of international projects, such as the Global Open Data Index, that both blend open knowledge expertise and draw upon the global open data community. Finally, we are an international team, with staff based in nearly every region, collaborating virtually to promote openness online and on the ground.

By formalising Open Knowledge International as our name beyond the community groups associated with us and to the broader open knowledge movement, we are reflecting the direction we are striving to take, now and increasingly so in the future. We are grateful to have such a strong community behind us as we undertake a name change that better reflects our priorities and as we continue to seek new opportunities on a global scale.

We are also planning to transition from the domain name okfn.org for brand consistency and will begin that transition in the coming months. If you would like to discuss this change of name, and what it means, please join in on our forum – discuss.okfn.org.

For revised logos please see okfn.org/press/logos and please contact press@okfn.org if you have any questions about the use of this brand.

David Rosenthal: Improving e-Journal Ingest (among other things)

planet code4lib - Mon, 2016-05-23 15:00
Herbert Van de Sompel, Michael Nelson and I have a new paper entitled Web Infrastructure to Support e-Journal Preservation (and More) that:
  • describes the ways archives ingest e-journal articles,
  • shows the areas in which these processes use heuristics, which makes them fallible and expensive to maintain,
  • and shows how the use of DOIs, ResourceSync, and Herbert and Michael's "Signposting" proposal could greatly improve these and other processes that need to access e-journal content.
It concludes with a set of recommendations for CrossRef and the e-journal publishers that would be easy to adopt and would not merely improve these processes but also help remedy the deficiencies in the way DOIs are used in practice that were identified in Martin Klein et al.'s paper in PLoS One entitled Scholarly Context Not Found: One in Five Articles Suffers from Reference Rot, and in Persistent URIs Must Be Used To Be Persistent, presented by Herbert and co-authors at the 25th International World Wide Web Conference.
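The "Signposting" pattern the paper builds on conveys machine-readable relations, such as cite-as and item, through HTTP Link headers. As a rough illustration only (not code from the paper), a tiny Python parser could pull those relations out of a header string; the URLs and the exact header below are made up, and the comma-splitting is a simplification that would break on URLs containing commas:

```python
import re

def parse_link_header(header):
    """Parse an HTTP Link header into a list of (target, rel) pairs.
    Simplified: splits on commas, so it assumes targets contain none."""
    links = []
    for part in header.split(","):
        m = re.search(r'<([^>]+)>\s*;\s*rel="([^"]+)"', part)
        if m:
            links.append((m.group(1), m.group(2)))
    return links

# A hypothetical Signposting-style header a landing page might return
header = ('<https://doi.org/10.5555/12345678>; rel="cite-as", '
          '<https://example.com/article.pdf>; rel="item"')
for target, rel in parse_link_header(header):
    print(rel, "->", target)
```

An archive's ingest crawler could use relations like these to locate an article's persistent identifier and content files without per-publisher heuristics, which is the gain the paper argues for.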


LITA: Getting your color on: maybe there’s some truth to the trend

planet code4lib - Mon, 2016-05-23 14:00

Coloring was never my thing; even as a young child, the number of decisions required in coloring was actually stressful to me. Hence my skepticism of this zen adult coloring trend: how could something so stressful for me be considered a thing of "zen"? I purchased a book and selected coloring tools about a year ago, coloring bits and pieces here and there but not really getting it. Until now.

While reading an article about the psychology behind adult coloring, I found this quote to be exceptionally interesting:

The action involves both logic, by which we color forms, and creativity, when mixing and matching colors. This incorporates the areas of the cerebral cortex involved in vision and fine motor skills [coordination necessary to make small, precise movements]. The relaxation that it provides lowers the activity of the amygdala, a basic part of our brain involved in controlling emotion that is affected by stress. -Gloria Martinez Ayala [quoted in Coloring Isn’t Just For Kids. It Can Actually Help Adults Combat Stress]

A page, colored by Whitni Watkins, from Color Me Stress Free by Lacy Mucklow and Angela Porter

As I was coloring this particular piece [pictured to the left] I started seeing the connection the micro process of coloring has to the macro process of managing a library and/or team building. Each coloring piece has individual parts that contribute to forming the outline of the full work of art. But it goes deeper than that.

For example, how you color and organize the individual parts can determine how beautiful or harmonious the picture will be. You have many different color options to choose from and incorporate into your picture; some will work better than others. Did you know that in color theory, orange and blue are a perfect color combination? According to color theory, "harmonious color combinations use any two colors opposite each other on the color wheel." [7] But the combination of orange, blue and yellow is not very harmonious.
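That "opposite each other on the color wheel" rule can actually be computed: a color's complement is the color whose hue sits 180 degrees away. A small sketch using Python's standard colorsys module (the specific RGB values here are just illustrative examples):

```python
import colorsys

def complementary(rgb):
    """Return the complement of an (r, g, b) color in 0-255 terms,
    i.e. the color whose hue is rotated 180 degrees on the wheel."""
    r, g, b = (c / 255.0 for c in rgb)
    h, l, s = colorsys.rgb_to_hls(r, g, b)
    h = (h + 0.5) % 1.0  # half a turn around the hue circle
    r2, g2, b2 = colorsys.hls_to_rgb(h, l, s)
    return tuple(round(c * 255) for c in (r2, g2, b2))

# An orange (hue around 30 degrees) maps to a blue (around 210 degrees)
print(complementary((255, 128, 0)))
```

This is exactly why orange pairs with blue: one is the computed complement of the other.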

Our lack of knowledge is a significant hindrance to creating greatness; knowing your options while coloring is incredibly important. Your color selection will determine what experience one has when viewing the picture: bland, chaotic or pleasing, each part working together, contributing to the bigger picture. "Observing the effects colors have on each other is the starting point for understanding the relativity of color. The relationship of values, saturations and the warmth or coolness of respective hues can cause noticeable differences in our perception of color." [6] Color combinations that may seem unfitting to you may actually complement each other.

Note that some colors will be used more frequently and have a greater presence in the final product due to the qualities that color holds, but remember that even the parts that have only a small presence are crucial to bringing the picture together in the end.

“Be sure to include those who are usually left out of such acknowledgments, such as the receptionist who handled the flood of calls after a successful public relations effort or the information-technology people who installed the complex software you used.”[2]

There may be other times when you don't use a certain color as much as it should and could have been used. The picture ends up fully colored and completed, but not nearly as beautiful (harmonious) as it could have been. During the coloring process, by often asking yourself "What else do we need to consider here?" you "allow perspectives not yet considered to be put on the table and evaluated." [2] Constant evaluation of your process will lead to a better final piece.

While coloring I also noticed that I color individual portions in a similar manner. I color triangles and squares by outlining and shading inwards. I color circular shapes in a circular motion, shading outwards. While coloring, we each find the way that is most efficient yet contained (within the lines) while simultaneously coordinating well with the other parts. It is important to note that the way you found to be efficient in one area may not work in another; you need to adapt, be flexible, and be willing to try other ways. Imagine coloring a circle the way you color a square or a triangle. You can take as many shortcuts as you want to get the job done faster, but you may regret them in the end. Cut carefully.

Remember while coloring: Be flexible. Be adaptable. Be imperturbable.

You can color however you see fit. You can choose whichever colors you want; the project will get done. You can be sure there will be moments of chaos, and there will be moments that lack innovation. Experiment, try new things, and the more you color the better you'll get. However, coloring isn't for everyone, and that's okay.

Now, go back and read again, this time substituting the word manage for color.

Maybe there is something to be said about this trend of the adult coloring book. 

References:
1. Coloring Isn’t Just For Kids. It Can Actually Help Adults Combat Stress http://www.huffingtonpost.com/2014/10/13/coloring-for-stress_n_5975832.html
2. Twelve Ways to Build an Effective Team http://people.rice.edu/uploadedFiles/People/TEAMS/Twelve%20Ways%20to%20Build%20an%20Effective%20Team.pdf
3. COLOURlovers: History Of The Color Wheel http://www.colourlovers.com/blog/2008/05/08/history-of-the-color-wheel
4. Smashing Magazine: Color Theory for Designers, Part 1: The Meaning of Color: https://www.smashingmagazine.com/2010/01/color-theory-for-designers-part-1-the-meaning-of-color/
5. Some Color History http://hyperphysics.phy-astr.gsu.edu/hbase/vision/colhist.html
6. Color Matters: Basic Color Theory http://www.colormatters.com/color-and-design/basic-color-theory
7. lifehacker: Learn the Basics of Color Theory to Know What Looks Good http://lifehacker.com/learn-the-basics-of-color-theory-to-know-what-looks-goo-1608972072
8. lifehacker: Color Psychology Chart http://lifehacker.com/5991303/pick-the-right-color-for-design-or-decorating-with-this-color-psychology-chart
9. Why Flexible and Adaptive Leadership is Essential http://challenge2050.ifas.ufl.edu/wp-content/uploads/2013/10/YuklMashud.2010.AdaptiveLeadership.pdf

DuraSpace News: VIVO Updates for May 22–VIVO Needs Your Financial Support

planet code4lib - Mon, 2016-05-23 00:00

From Mike Conlon, VIVO project director

DuraSpace News: VIVO Updates for May 15–Conference Posters due May 23, Remember to Register!

planet code4lib - Mon, 2016-05-23 00:00

From Mike Conlon, VIVO project director

Poster deadline extended. There's still time for you to submit a poster to VIVO 2016! This is a great opportunity for you to share your work with the VIVO community. The deadline for poster submissions has been extended to May 23. See http://vivoconference.org

Patrick Hochstenbach: Sktchy portrait

planet code4lib - Sat, 2016-05-21 07:58
Filed under: portaits, Sketchbook Tagged: fountainpen, illustration, ink, Photoshop, portrait, sktchy

District Dispatch: ALA briefs congress on critical impact of rural broadband access

planet code4lib - Fri, 2016-05-20 19:41

Launched in February of this year, the bipartisan Congressional Rural Broadband Caucus was founded “to facilitate discussion, educate Members of Congress and develop policy solutions to close the digital divide in rural America.” At its most recent meeting, Marijke Visser of the ALA’s Office for Information Technology Policy (OITP) and co-panelists from the public and private sectors briefed the Caucus, congressional staff and a general audience at a public session entitled “Strengthening Rural Economics through Broadband Deployment.”

Her presentation highlighted that libraries currently play a pivotal role in providing broadband access in rural communities, addressing the “E’s of Libraries®” across the country: employment and entrepreneurship, education, individual empowerment, and civic engagement. Noting broadly that “Libraries strengthen local economies through supporting small business development and entrepreneurship,” Marijke went on to provide specific examples of how libraries have helped small businesses develop business plans, conduct market research, foster employee certification, use 3D printers, and even use library software programs to design and print creative menus for a restaurant.

Source: Consumer Affairs

She also spotlighted the growing importance of video conferencing availability to rural residents and communities, telling of how a new mother in Alaska received needed healthcare training via video conference at her local public library, thus avoiding a lengthy trip to Seattle, and how a business in Texas was able to secure a contract by using video conferencing through a local public library to expedite OSHA certification of 40 workers.

Marijke also emphasized that today’s libraries clearly are much more than book-lending facilities and places for children’s story time; they are one-stop community hubs, replete with maker spaces, digital production studios, video-conferencing capacity and more. In response to questions from Congressional staff, Marijke also highlighted various services libraries provide to veterans, including resume building, job application assistance, benefit application filing, and financial literacy training.

Marijke and her fellow panelists were welcomed to the Caucus’ meeting by Caucus Co-Chairs Reps. Kevin Cramer (R-ND) and Mark Pocan (D-WI2), and Rep. Dave Loebsack (D-IA2). Membership in the Caucus currently stands at 34 Representatives. Its mission, as explained upon its launch by Rep. Bob Latta (R-OH5), is to “bring greater attention to the need for high-speed broadband in rural America, and help encourage and spur innovative solutions to address this growing consumer demand.”

ALA thanks the Caucus for the opportunity to participate in its event, and both the Office of Government Relations and OITP look forward to continuing to work with its members to boost broadband capacity in libraries and homes across rural America.

The post ALA briefs congress on critical impact of rural broadband access appeared first on District Dispatch.

Library of Congress: The Signal: UNESCO PERSIST: A Global Exchange on Digital Preservation

planet code4lib - Fri, 2016-05-20 15:57

This is a guest post by Robert R. Buckley, Technical Adviser at the National Archives of the UAE in Abu Dhabi and the Coordinator for the PERSIST Policy Working Group.

UNESCO PERSIST meeting in Abu Dhabi. Photo courtesy of National Archives of the UAE.

Readers of this blog would have first seen mention of the UNESCO PERSIST project in The Signal last January. It occurred in a guest post on intellectual property rights related to software emulation. Dealing with IP rights is one of the known challenges of digital preservation. Dealing with the volume of digital content being generated is another, requiring decisions on what content to select and preserve for the benefit of society. These and other digital preservation activities typically depend on policies that influence decision-making and planning processes with a view to enabling sustainability. All these issues fall within the scope of the PERSIST project and were addressed at its recent meeting held March 14-16 in Abu Dhabi.

The meeting was hosted by Dr. Abdulla El Reyes, Director General of the National Archives of the UAE and Chair of the Memory of the World Program. PERSIST is part of the Memory of the World Program and a partnership between UNESCO, the International Council of Archives and the International Federation of Library Associations and Institutions. (If it were an acronym, PERSIST would stand for Platform to Enhance and Reinforce the Sustainability of the Information Society Trans-globally.) It is a response to the UNESCO/UBC Vancouver Declaration, adopted at the 2012 Memory of the World in the Digital Age: Digitization and Preservation conference in Vancouver, where participants agreed on the pressing need to establish a road map proposing solutions, agreements and policies for implementation by all stakeholders, in particular governments and industry.

The focus of the PERSIST project is on providing these stakeholders, as well as heritage institutions, with resources to address the challenges of long-term digital preservation and the risks of losing access to part of our digital heritage through technology obsolescence. Fostering a high-level dialogue and joint action on digital preservation issues among all relevant stakeholders is a core objective of PERSIST. For example, during the UNESCO General Conference last November in Paris, PERSIST hosted an event that included Microsoft, Google and the ACM. This is the kind of thing UNESCO is well positioned to do and where it can add value on a global scale in the very active and fertile field of digital preservation.

The Abu Dhabi meeting was attended by over 30 experts, representing heritage institutions, universities and governmental, non-governmental and commercial organizations from a dozen countries spread across five continents. The meeting had an ambitious agenda that included formulating an operating plan for 2016-2017. The major outcomes of the meeting were organized around the work of the three task forces into which PERSIST was divided: Content, Technology and Policy.

Official launch of the UNESCO/PERSIST Selection Guidelines. Photo courtesy of National Archives of the UAE

First was the launch of the UNESCO/PERSIST Guidelines for the selection of digital heritage for long-term preservation, drafted by the Content Task Force. The selection process, in the form of a decision tree, takes a risk-assessment approach to evaluating significance, assessing sustainability and considering availability in dealing with the overwhelming volume of digital information now being created and shared. Written by a team of seven experts from the library, archives, and museum community, the Guidelines aim to provide an overarching starting point for heritage institutions when drafting their own policies on the selection of digital heritage for long-term sustainable digital preservation.
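As a loose illustration of how a selection decision tree of this kind might look in practice (the criteria names, ordering and outcomes below are invented for the sketch, not taken from the Guidelines themselves):

```python
def select_for_preservation(item):
    """Toy decision tree echoing the Guidelines' three broad tests:
    significance, sustainability (risk of loss), and availability.
    The fields and verdicts are hypothetical."""
    if not item.get("significant"):
        return "do not select"
    if item.get("at_risk"):  # sustainability: content in danger of loss
        return "select: preserve now"
    if item.get("available_elsewhere"):
        return "defer: monitor availability"
    return "select"

# A significant item already at risk of loss gets priority treatment
print(select_for_preservation({"significant": True, "at_risk": True}))
```

The point of encoding the tree is that each item receives a consistent, explainable verdict, which is what institutions drafting their own selection policies are after.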

Second was the progress by the Technology Task Force on defining the PERSIST technology strategy and finding an organizational home that would maintain, manage and make available the legacy software platform for future access to digital heritage at risk due to software obsolescence. (PERSIST is in contact with the Software Preservation Network and will be presenting at the SPN Forum in August.)

The diagram illustrates the role of the UNESCO PERSIST project in the digital preservation ecosystem, including access to legacy software licenses. The organizational home, which we have been calling the UNESCO PERSIST Organization or UPO, would complement the work of the UNESCO PERSIST project. It would be a non-profit that would be able to enter into legal agreements with software vendors—a significant capability. Conversations are underway with a candidate organization about hosting the UPO.

Role of UNESCO PERSIST in the Digital Preservation Ecosystem. Diagram by Natasa Milic-Frayling. CLICK TO ENLARGE

Third was the formal creation of the Policy Task Force. In one way or another its initial outputs are all related to the Recommendation concerning the preservation of, and access to, documentary heritage including in digital form, which was approved at the UNESCO General Conference in November 2015 and which requires action by UNESCO Member States. Besides contributing directly to the guidelines for implementing the digital part of the Recommendation, the task force also plans to take a community-based approach to developing supporting tools such as a Model National Digital Preservation Strategy and a Starter’s Guide for policymakers. The Selection Guidelines already provide a tool for the identification of documentary heritage called for by the Recommendation. The Policy team will also work with the other task forces on strategic policy questions.

From here, there is still much to be done in disseminating the selection guidelines that would make the challenges of digital preservation more manageable, in developing and putting on a firm foundation the software technology platform that would enable access to legacy documents, and in establishing policy guidelines that would provide institutional and national frameworks where they are most needed for the preservation of digital documentary heritage.

You can hear more about PERSIST at the IFLA WLIC 2016 and the SPN Forum in August, the ICA Congress in September and iPRES 2016 in October. You can also read about PERSIST online, watch an introductory video and follow it on Twitter at #unescopersist.
