
Feed aggregator

LibUX: Public library users are mobile

planet code4lib - Wed, 2016-09-28 02:39

Half (49%) of those who have visited a public library website in the past year used handheld mobile devices (such as smartphones or tablets). — Pew Research Center

LibUX: Usage Trends in Libraries 2016 – Pew Research Center

planet code4lib - Wed, 2016-09-28 02:31

Surprise, surprise. Searching the catalog remains the bread and butter of library web design.

LibUX: 2016 Statewide Keynote: Harper Reed

planet code4lib - Wed, 2016-09-28 01:41

So Courtney McDonald tweeted: “dude, schedule an hour to watch the #statewide16 keynote by Harper Reed.” I totally agree.

DuraSpace News: Third Annual Meeting of the German DSpace User Group

planet code4lib - Wed, 2016-09-28 00:00

The German DSpace User Group Meeting 2016 took place at ZBW – Leibniz Information Centre for Economics in Hamburg on September 27th, 2016. This was the third annual meeting for the group. With 45 participants, it was the largest meeting so far. High attendance was due in part to the momentum that DSpace has in Germany right now.  A few new institutions participated that were already working on their migration or are considering migrating to DSpace.

DuraSpace News: 4Science, A "Young" Registered Service Provider with a lot of Experience!

planet code4lib - Wed, 2016-09-28 00:00

By Susanna Mornati, 4Science. At 4Science, we are all very proud to appear for the first time in the DuraSpace Digest.

LibUX: Library Usage and Trends

planet code4lib - Tue, 2016-09-27 19:41

In this episode I’m joined by not one but two (!) guests. Carli Spina and Emily King come on the show to talk about the 2016 Libraries Usage and Trends Report published earlier this month by the Pew Research Center.

Stream and subscribe

If you like, you can download the MP3 or subscribe to LibUX on Stitcher, iTunes, YouTube, Soundcloud, Google Play Music, or just plug our feed straight into your podcatcher of choice.

LITA: LITA Forum early bird rates extended

planet code4lib - Tue, 2016-09-27 18:22
We’ve extended the LITA member early bird registration another two weeks, so there’s still time to register for the 2016 LITA Forum at the early bird rate and save $50.

Fort Worth, TX
November 17-20, 2016

LITA Forum early bird rates now will end October 14, 2016
Register Now!

Join us in Fort Worth, Texas, at the Omni Fort Worth Hotel located in Downtown Fort Worth, for the 2016 LITA Forum, a three-day education and networking event featuring 2 preconferences, 3 keynote sessions, more than 55 concurrent sessions and 25 poster presentations. It’s the 19th annual gathering of the highly regarded LITA Forum for technology-minded information professionals. Meet with your colleagues involved in new and leading edge technologies in the library and information technology field. Registration is limited in order to preserve the important networking advantages of a smaller conference. Attendees take advantage of the informal Friday evening reception, networking dinners and other social opportunities to get to know colleagues and speakers.

Why attend the LITA Forum

Tune in to #LITAchat Friday, September 30, 2016, at noon Central time to learn about the 2016 LITA Forum with guest tweeters from the Forum Planning Committee. From #litaforum, they will discuss the upcoming LITA Forum, November 17-20, in Fort Worth, Texas: why you should attend, what to expect, how to get the most out of the experience, and much more! To participate, launch your favorite Twitter client and check out the #LITAchat hashtag. On the web client, just search for #LITAchat and then click “LIVE” to follow along. Ask questions using the hashtag #LITAchat, add your own comments, and even answer questions posed by other participants.

Register now to receive the LITA members early bird discount:

  • LITA member early bird rate: $340
  • LITA member regular rate: $390

Keynote Speakers:

  • Cecily Walker, Vancouver Public Library
  • Waldo Jaquith, U.S. Open Data
  • Tara Robertson, @tararobertson

The Preconference Workshops:

  • Librarians can code! A “hands-on” computer programming workshop just for librarians
  • Letting the Collections Tell Their Story: Using Tableau for Collection Evaluation

Comments from past attendees:

“Best conference I’ve been to in terms of practical, usable ideas that I can implement at my library.”
“I get so inspired by the presentations and conversations with colleagues who are dealing with the same sorts of issues that I am.”
“After LITA I return to my institution excited to implement solutions I find here.”
“This is always the most informative conference! It inspires me to develop new programs and plan initiatives.”

Forum Sponsors:

OCLC, Yewno

Get all the details, register and book a hotel room at the 2016 Forum website.

See you in Fort Worth.

Terry Reese: Note on Automatic Updates

planet code4lib - Tue, 2016-09-27 17:45

Please note, MarcEdit’s automated update tool will notify you of the update, but you will need to manually download the file from:  My web host, Bluehost, has made a change to their server configuration that makes no sense but ultimately dumps traffic sent from non-web browsers (connections without a user-agent). Right now, users will get this message when they attempt to download using the automatic update:

I can accommodate the requirements that they have set up now, but it will mean that users will need to do manual downloads for the current update posted 9/27/2016 and the subsequent update — which I’ll try to get out tonight or tomorrow.

I apologize for the inconvenience, but after spending 8 hours yesterday and today wrangling with them and trying to explain what this breaks (because I have some personal tools that this change affects), I’m just not getting anywhere.  Maybe something will magically change, maybe not — but for now I’ll be rewriting the update process to try and protect from these kinds of unannounced changes in the future.

So again, you’ll want to download MarcEdit from since the automatic update download connection is currently being dumped by my web host.

LITA: Volunteers needed to help with privacy initiative

planet code4lib - Tue, 2016-09-27 17:14
Are you interested in improving privacy in libraries all across the country? If so, we need your help! The recently-released ALA Library Privacy Guidelines are a great collection of the standards and practices that libraries should be putting into place to protect users’ digital information. A small group of us is now working on creating checklists and resource guides for each set of guidelines in order to help real live library staff implement these guidelines with ease. And we’re looking for volunteers to help! We need folks to help out with developing checklists for the following. We’re particularly hoping to find people with experience in school libraries and networked services, but we’ll take all willing volunteers!
  1. Library Privacy Guidelines for Public Access Computers and Networks
  2. Library Privacy Guidelines for Library Websites (social media), OPACs, and Discovery Services
  3. Library Privacy Guidelines for Library Management Systems
  4. Library Privacy Guidelines for Data Exchange Between Networked Devices and Services
  5.  Library Privacy Guidelines for E-book Lending and Digital Content Vendors
  6. Library Privacy Guidelines for Students in K-12 Schools

If you’re able and interested in putting in a few hours to help us out with this project, pop me an email at with what you can help out with. And thank you!

Terry Reese: MarcEdit Update (Windows/Linux)

planet code4lib - Tue, 2016-09-27 04:10

I’ve posted a new set of updates.  The initial set is for Windows and Linux.  I’ll be posting Mac updates later this week.  Here’s the list of changes:

  • Behavior Change — Windows/Linux: Intellisense turned off by default (this is the box that shows up when you start to type a diacritic) for new installs. As more folks use UTF8, this option makes less sense. Will likely make plans to remove it within the next year.
  • Enhancement: Select Extracted Records: UI Updates to the import process.
  • Enhancement: Select Extracted Records: Updates to the batch file query.
  • Behavior Change: Z39.50 Client: Added an override enabling the client to bypass search limits. Beware: overriding this option is potentially problematic.
  • Update: Linked Data Rules File: Rules file updated to add databases for the Japanese Diet library, 880 field processing, and the German National Library.
  • Enhancement: Task Manager: Added a new macro/delimiter. {current_file} will print the current filename if set.
  • Bug Fix: RDA Helper – Abbreviation expansion is failing to process specific fields when config file is changed.
  • Bug Fix: MSXML Engine – In an effort to allow the xsl:strip-whitespace element, I broke this process. The workaround has been to use the engine. However, I’ll correct this. Information on how to emulate the xsl:strip-whitespace element will be here:
  • Bug Fix: Task Manager Editing – when adding the RDA Helper to a new task, it asks for file paths. This was due to some enhanced validation around files. This didn’t impact any existing tasks.
  • Bug Fix: UI changes – I’m setting default sizes for a number of forms for usability
  • Bug Fix/Enhancement: Open Refine Import – OpenRefine’s release candidate changes the tab delimited output slightly. I’ve added some code to accommodate the changes.
  • Enhancement: MarcEdit Linked Data Platform – adding enhancements to make it easier to add collections and update the rules file
  • Enhancement: MarcEdit Linked Data Platform – updating the rules file to include a number of new endpoints
  • Enhancement: MarcEdit Linked Data Platform – adding new functionality to the rules file to support the recoding of the rules file for UNIMARC.
  • Enhancement: Edit Shortcut – Adding a new edit short cut to find fields missing words
  • Enhancement: XML Platform – making it clearer that you can use either XQuery or XSLT for transformations into MARCXML
  • Enhancement: OAI Harvester – code underneath to update user agent and accommodate content-type requirements on some servers.
  • Enhancement: OCLC API Integration – added code to integrate with the validation. Not sure this makes its way into the interface yet, but code will be there.
  • Enhancement: Saxon.NET version bump
  • Enhancement: SPARQL Explorer – Updating the sparql engine to give me more access to low level data manipulation
  • Enhancement: Autosave option when working in the MarcEditor. Saves every 5 minutes. Will protect against crashes.

Downloads are available from the downloads page (


DuraSpace News: Welcome Heather Greer Klein: Hosted Services Customer Specialist

planet code4lib - Tue, 2016-09-27 00:00

Austin, TX  DuraSpace is pleased to announce that Heather Greer Klein accepted the position as hosted services customer specialist effective October 26, 2016. In this role, Heather will work closely with the DuraSpace hosted services team to manage the lead to sale process of DuraSpace hosted services (DuraCloud, DSpaceDirect, ArchivesDirect) including customer service, product pricing, new account set up, onboarding, and training.

SearchHub: Solr Distributed Indexing at WalmartLabs

planet code4lib - Mon, 2016-09-26 21:57

As we countdown to the annual Lucene/Solr Revolution conference in Boston this October, we’re highlighting talks and sessions from past conferences. Today, we’re highlighting Shenghua Wan’s talk, “Solr Distributed Indexing at WalmartLabs”.

As a retail giant, Walmart provides information on millions of items via its e-commerce websites, and the number grows quickly. This calls for big data technologies to index the documents. The Map-Reduce framework is a scalable, highly available base on top of which distributed indexing can be built. While Solr ships with a map-reduce index tool, some barriers make it unable to handle Walmart’s use case easily and efficiently. In this case study, Shenghua demonstrates a way to build your own distributed indexing tool and optimize performance by making the indexing stage a map-only job whose partial indexes are then merged.
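The approach is easy to picture in miniature: each map task builds a partial inverted index over its own document shard, and the only remaining work is a merge, with no reduce phase. The following toy sketch (plain Python standing in for Lucene segment building; all names are illustrative, not WalmartLabs code) shows the shape of it:

```python
from collections import defaultdict

def map_index(partition):
    """Map task: build a partial inverted index over one document partition."""
    index = defaultdict(set)
    for doc_id, text in partition:
        for term in text.lower().split():
            index[term].add(doc_id)
    return index

def merge_indexes(partials):
    """Merge step: combine the per-partition indexes into one."""
    merged = defaultdict(set)
    for partial in partials:
        for term, doc_ids in partial.items():
            merged[term] |= doc_ids
    return merged

# Each "mapper" indexes its own shard independently; no reduce phase needed.
shards = [
    [(1, "red shoes"), (2, "blue shoes")],
    [(3, "red hat")],
]
partials = [map_index(shard) for shard in shards]
catalog = merge_indexes(partials)
print(sorted(catalog["red"]))  # prints: [1, 3]
```

Because the mappers never communicate, the indexing stage parallelizes trivially; the merge is the only serial step, which is the optimization the talk describes.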

Shenghua Wan is a Senior Software Engineer on the Polaris Search Team at WalmartLabs. His focus is applying big data technologies to deal with large-scale product information to be searched online.

Solr Distributed Indexing in WalmartLabs: Presented by Shenghua Wan, WalmartLabs from Lucidworks

Join us at Lucene/Solr Revolution 2016, the biggest open source conference dedicated to Apache Lucene/Solr on October 11-14, 2016 in Boston, Massachusetts. Come meet and network with the thought leaders building and deploying Lucene/Solr open source search technology. Full details and registration…

The post Solr Distributed Indexing at WalmartLabs appeared first on

Library of Congress: The Signal: Collections as Data Tomorrow

planet code4lib - Mon, 2016-09-26 18:10

Tomorrow, September 27, 2016, NDI is hosting our Collections as Data symposium, which will be free and open to the public. We’re really excited about the speakers we have lined up for the day, and hope you can join us in person or through the live-streamed video.

In preparation for the event, our colleagues in Library Services put together a wonderful collection of resources to help orient researchers using Library of Congress collections as data:

Using Collections as Data?

Research Guidance Is Available!

Digitized Collections and Items from Library of Congress

Torch of Learning. Photo by Carol Highsmith, 2007.

LITA: Transmission #10 – Season 1 Finale

planet code4lib - Mon, 2016-09-26 17:51

In the final episode of our first season, I’m telling you about the intention and future of the program, and a little more about myself. I’m also putting out a call for bloggers, contributors, innovators and visionaries! Collaborate with me- send an email to lindsay dot cronk at gmail dot com!

Stay tuned for changes coming in two weeks on October 10th!

Karen Coyle: 2 Mysteries Solved!

planet code4lib - Mon, 2016-09-26 16:57
One of the disadvantages of a long tradition is that the reasons behind certain practices can be lost over time. This is definitely the case with many practices in libraries, and in particular in practices affecting the library catalog. In U.S. libraries we tend to date our cataloging practices back to Panizzi, in the 1830's, but I suspect that he was already building on practices that preceded him.

A particular problem with this loss of history is that without the information about why a certain practice was chosen it becomes difficult to know if or when you can change the practice. This is compounded in libraries by the existence of entries in our catalogs that were created long before us and by colleagues whom we can no longer consult.

I was recently reading through volume one of the American Library Journal from the year 1876-1877. The American Library Association had been founded in 1876 and had its first meeting in Philadelphia in September, 1876. U.S. librarianship finally had a focal point for professional development. From the initial conference there were a number of ALA committees working on problems of interest to the library community. A Committee on Cooperative Cataloguing, led by Melvil Dewey (who had not yet been able to remove the "u" from "cataloguing"), was proposing that cataloging of books be done once, centrally, and shared, at a modest cost, with other libraries that purchased the same book. This was realized in 1902 when the Library of Congress began selling printed card sets. We still have cooperative cataloging, 140 years later, and it has had a profound effect on the ability of American libraries to reduce the cost of catalog creation.

Other practices were set in motion in 1876-1877, and two of these can be found in that inaugural volume. They are also practices whose rationales have not been obvious to me, so I was very glad to solve these mysteries.
Title case

Some time ago I asked on Autocat, out of curiosity, why libraries use sentence case for titles. No one who replied had more than a speculative answer. In 1877, however, Charles Ammi Cutter reports on The Use of Capitals in library cataloging and defines a set of rules that can be followed. His main impetus is "readability;" that "a profusion of capitals confuses rather than assists the eye...." (He also mentions that this is not a problem with the Bodleian library catalog, as that is written in Latin.)

Cutter would have preferred that capitals be confined to proper names, eschewing their use for titles of honor (Rev., Mrs., Earl) and initialisms (A.D). However, he said that these uses were so common that he didn't expect to see them changed, and so he conceded them.

All in all, I think you will find his rules quite compelling. I haven't looked at how they compare to any such rules in RDA. So much still to do!
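As a rough illustration of Cutter's principle (capitals confined to the first word and to proper names), a naive sentence-casing routine might look like the sketch below. The proper-noun list is the judgment call that catalogers supply; the one here is purely illustrative:

```python
def sentence_case(title, proper_nouns=()):
    """Lowercase every word in a title except the first word
    and any word on the supplied proper-noun list."""
    keep = {p.lower() for p in proper_nouns}
    out = []
    for i, word in enumerate(title.split()):
        if i == 0 or word.lower() in keep:
            out.append(word.capitalize())
        else:
            out.append(word.lower())
    return " ".join(out)

print(sentence_case("The Use Of Capitals"))
# prints: The use of capitals
print(sentence_case("A Catalogue Of The Bodleian Library",
                    proper_nouns=["Bodleian"]))
# prints: A catalogue of the Bodleian library
```

Real cataloging rules are subtler than this (initialisms, titles of honor, and so on, exactly the cases Cutter conceded), but the sketch captures the "readability" idea.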
Centimeters

I have often pointed out, although it would be obvious to anyone who has the time to question the practice, that books are measured in centimeters in Anglo-American catalogs, although there are few cultures as insistent on measuring in inches and feet as those. It is particularly unhelpful that books in libraries are cataloged with a height measurement in centimeters while the shelves that they are destined for are measured in inches. It is true that the measurement forms part of the description of the book, but at least one use of that is to determine on which shelves those books can be placed. (Note that in some storage facilities, book shelves are more variable in height than in general library collections and the size determination allows for more compact storage.) If I were to shout out to you "37 centimeters" you would probably be hard-pressed to reply quickly with the same measurement in inches. So why do we use centimeters?

The newly-formed American Library Association had a Committee on Sizes. This committee had been given the task of developing a set of standard size designations for books. The "size question" had to do with the then current practice to list sizes as folio, quarto, etc. Apparently the rise of modern paper making and printing meant that those were no longer the actual sizes of books. In the article by Charles Evans (pp. 56-61) he argued that actual measurements of the books, in inches, should replace the previous list of standard sizes. However, later, the use of inches was questioned. At the ALA meeting, W.F. Poole (of Poole's indexes) made the following statement (p. 109):
"The expression of measure in inches, and vulgar fractions of an inch, has many disadvantages, while the metric decimal system is simple, and doubtless will soon come into general use."

The committee agreed with this approach, and concluded:
"The committee have also reconsidered the expediency of adopting the centimeter as a unit, in accordance with the vote at Philadelphia, querying whether it were really best to substitute this for the familiar inch. They find on investigation that even the opponents of the metric system acknowledge that it is soon to come into general use in this country; that it is already adopted by nearly every other country of importance except England; that it is in itself a unit better adapted to our wants than the inch, which is too large for the measurement of books." (p. 180)
The members of the committee were James L. Whitney, Charles A. Cutter, and Melvil Dewey, the latter having formed the American Metric Bureau in July of 1876, both a kind of lobbying organization and a sales point for metric measures. My guess is that the "investigation" was a chat amongst themselves, and that Dewey was unmovable when it came to using metric measures, although he appears not to have been alone in that. I do love the fact that the inch is "too large," and that its fractions (1/16, etc.) are "vulgar."

Dewey and cohort obviously weren't around when compact discs came on the scene, because those are measured in inches ("1 sound disc : digital ; 4 3/4 in"). However, maps get the metric treatment: "1 map : col. ; 67 x 53 cm folded to 23 x 10 cm". Somewhere there is a record of these decisions, and I hope to come across them.
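For anyone hard-pressed by the "37 centimeters" challenge above, the conversion is a single factor of 2.54; here is a throwaway sketch using the figures from the text (not cataloging software, just the arithmetic):

```python
CM_PER_INCH = 2.54  # exact, by the international definition of the inch

def cm_to_inches(cm):
    return cm / CM_PER_INCH

def inches_to_cm(inches):
    return inches * CM_PER_INCH

# The 37 cm book shouted out in the paragraph above:
print(round(cm_to_inches(37), 1))   # prints: 14.6
# The "4 3/4 in" compact disc, in Dewey's preferred units:
print(round(inches_to_cm(4.75), 2))
```

The disc, it turns out, is almost exactly 12 cm, which is in fact how the rest of the world describes it.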

It would have been ideal if the U.S. had gone metric when Dewey encouraged that move. I suspect that our residual umbilical cord linking us to England is what scuppered that. Yet it is a wonder that we still use those too large, vulgar measurements. Dewey would be very disappointed to learn this.

So there it is, two of the great mysteries solved in the record of the very first year of the American library profession. Here are the readings; I created separate PDFs for the two most relevant sections:

American Library Journal, volume 1, 1876-1877 (from the Internet Archive)
Cutter, Charles A. The use of capitals. American Library Journal, v.1, n. 4-5, 1877. pp. 162-166
The Committee on Sizes of Books, American Library Journal, v.1, n. 4-5, 1877, pages 178-181

Also note that beginning on page 92 there is a near verbatim account of every meeting at the first American Library Association conference in Philadelphia, September, 1876. So verbatim that it includes the mention of who went out for a smoke and missed a key vote. And the advertisements! Give it a look.

Library of Congress: The Signal: 2016-2017 Class of National Digital Stewardship Residents Selected

planet code4lib - Mon, 2016-09-26 15:15

Five new National Digital Stewardship Residents will be joining the Library in late September 2016. Selected from a competitive pool and representing five different library schools, the residents bring a range of skills and experience in working with digital and archival collections. The NDSR program offers recent graduates an opportunity to gain professional experience under the guidance of a mentor. They will acquire hands-on knowledge and skills in the collection, selection, management, long-term preservation and accessibility of digital assets.

Throughout the year, residents and their mentors will attend digital stewardship workshops at the Library of Congress and at one of their five host institutions in the greater Washington, D.C. region.

  • Meredith Broadway of Dallas, Texas, has a Master of Science in Data Curation and Certificate in Special Collections from the University of Illinois at Urbana-Champaign, and a bachelor’s degree from Rhodes College. Meredith will be a resident at the World Bank Group focusing on an assessment framework and appraisal guidelines for identification of data for permanent preservation; a set of analytic process document guidelines to enable documentation of processes used in the collection and analysis of data; and guidelines for linking datasets to related documents and analytical reports.
  • Joseph Carrano of Middlebury, Connecticut, has dual Master’s degrees from the University of Maryland in History and Library Science, and a bachelor’s degree from the University of Connecticut. Joe will be part of a team at the Georgetown University Library developing open-source project guidelines, documentation and workflows for different preservation platforms. He will be involved in all stages of the process of inventory, selection, curation, preparation and ingest of files of all formats.
  • Elizabeth England of Washington, DC, has a master’s degree in Library and Information Science from the University of Pittsburgh, and a bachelor’s degree from Drew University. Elizabeth will be a resident in the University Archives at the Johns Hopkins University Sheridan Libraries, applying core archival functions such as appraisal, accessioning, processing, preservation, description, and provision of access to a 50 terabyte collection of born-digital photographs, using scripting languages and tools that are vital to manipulating large data sets.
  • Amy Gay of Binghamton, New York, has a master’s degree in Library and Information Science from Syracuse University, and a bachelor’s degree from the State University of New York, Oneonta. Amy will be a resident at the Food & Drug Administration, Office of Science & Engineering Laboratories, Center for Devices & Radiological Health, working on the “CDRH Science Data Catalogue Pilot”; a joint project to develop a searchable digital catalog for data sets, software code, computational models, images and more as part of federally mandated public access efforts. She will lead catalog content acquisition and curation, as well as refining the metadata schema and taxonomy.
  • Megan Potterbusch of Nashville, Tennessee, has a master’s degree from the School of Library and Information Science at Simmons College, and a bachelor’s degree from Earlham College. Megan will serve as a resident at the Association of Research Libraries working in partnership with the George Washington University Libraries and the Center for Open Science to prototype the process of linking the output from a university research unit to a library digital repository through the Open Science Framework — an open source tool that integrates and supports research workflow.

David Rosenthal: The Things Are Winning

planet code4lib - Mon, 2016-09-26 15:00
More than three years ago my friend Jim Gettys, who worked on One Laptop Per Child, and on the OpenWrt router software, started warning that the Internet of Things was a looming security disaster. Bruce Schneier's January 2014 article The Internet of Things Is Wildly Insecure — And Often Unpatchable and Dan Geer's April 2014 Heartbleed as Metaphor were inspired by Jim's warnings. That June Jim gave a talk at Harvard's Berkman Center entitled (In)Security in Home Embedded Devices. That September Vint Cerf published Bufferbloat and Other Internet Challenges, and Jim blogged about it. That Christmas a botnet running on home routers took down the gaming networks of Microsoft's Xbox and Sony's Playstation. That wasn't enough to motivate action to fix the problem.

As I write this on 9/24/16 the preceding link doesn't work, although the Wayback Machine has copies. To find out why the link isn't working and what it has to do with the IoT, follow me below the fold.

The insecurity of the IoT has been a theme of many of my blog posts since 2014, pointing out that it was handing the bad guys, even relatively unskilled bad guys, a weapon that could render the Internet unusable. But nothing has been done to fix the problems and defuse the weapon. Dan Goodin's Why the silencing of KrebsOnSecurity opens a troubling chapter for the ‘Net tells us that we are running out of time:
KrebsOnSecurity, arguably the world's most intrepid source of security news, has been silenced, presumably by a handful of individuals who didn't like a recent series of exposés reporter Brian Krebs wrote. ... On Thursday morning, ... he reported that a sustained attack was bombarding his site with as much as 620 gigabits per second of junk data. ... At 4 pm, Akamai gave Krebs two hours' notice that it would no longer assume the considerable cost of defending KrebsOnSecurity. Krebs opted to shut down the site to prevent collateral damage hitting his service provider and its customers. ... In 2013, attacks against anti-spam organization Spamhaus generated headlines because the 300Gb torrents were coming uncomfortably close to Internet-threatening size. The assault against KrebsOnSecurity represents a much greater threat for at least two reasons. First, it's twice the size. Second and more significant, ... the attacks against KrebsOnSecurity harness so-called Internet-of-things devices—think home routers, webcams, digital video recorders, and other everyday appliances that have Internet capabilities built into them.

Go read the whole article.

This is asymmetric warfare. It doesn't take much skill or many resources to build a DDOS weapon of this kind. But defending against it is beyond the reach of most websites:
Krebs said he has explored the possibility of retaining a DDoS mitigation service, but he found that the cost—somewhere between $100,000 and $200,000 per year for the type of always-on protection he needs against high-bandwidth attacks—is more than he can afford.

So, unless you're seriously wealthy, any time you publish something on the net the bad guys don't like, they can blow your web presence away. Krebs' conclusion is sad:
"Free speech in the age of the Internet is not really free," he said. "We're long overdue to treat this threat with a lot more urgency. Unfortunately, I just don't see that happening right now."

And don't think that knocking out important individual Web sites like KrebsOnSecurity is the limit of the bad guys' capabilities. Everyone seems to believe that the current probing of the root servers' defenses is the work of China but, as the Moon Worm showed, careful preparation isn't necessarily a sign of a state actor. There are many bad guys out there who could take the Internet down; the only reason they don't is not to kill the goose that lays the golden eggs.

Pastor Martin Niemöller had it right:
First they came for the security gurus, and I did not speak out -
Because I was not a security guru.

This is probably yet another reason why we need to evolve to a Decentralized Internet (not just a Decentralized Web), perhaps Named Data Networking (NDN). Although, as I wrote, I'm not aware of a major "black hat" analysis of these decentralized proposals, the argument is very plausible.

Why can a large number of small, compromised devices with limited bandwidth upstream bring down a large, powerful Web site, even one defended by an expensive DDOS mitigation service? Two reasons:
  • In today's centralized Internet, the target Web site will be at one, or a small number of IP addresses. The network focuses the traffic from all the compromised devices on to those addresses, consuming massive resources at the target.
  • In today's centralized Web, the target Web site will be one tenant sharing the resources of a data center, so the focused traffic inflicts collateral damage on the other tenants. It was the cost in resources and the risk to other customers that caused Akamai to kick out KrebsOnSecurity.
In NDN, a request for a resource only travels as far as one of the nearest copies. And in the process it creates additional copies along the path, so that a subsequent request will travel less far. Thus, instead of focusing traffic, large numbers of requests defocus the traffic. They spread the responsibility for satisfying the request out across the infrastructure instead of concentrating it. By moving the load caused by bad behavior closer to the bad actors, it creates incentives for the local infrastructure to detect and prevent the bad behavior.
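A toy simulation can make the defocusing concrete. This is not the actual NDN protocol, just the caching-along-the-path idea: a request walks router caches toward the origin, stops at the nearest copy, and leaves copies behind so the next request stops sooner:

```python
def fetch(route, name, caches):
    """Request travels hop by hop toward the origin until it finds a
    cached copy (or reaches the last hop); the returning data then
    populates every cache it passed through."""
    hops = 0
    for cache in route:
        hops += 1
        if name in caches.setdefault(cache, set()):
            break
    # Data flows back, leaving a copy at each traversed cache.
    for cache in route[:hops]:
        caches[cache].add(name)
    return hops

caches = {}
route = ["edge-router", "regional-isp", "backbone"]
print(fetch(route, "post.html", caches))  # prints: 3 (full trip)
print(fetch(route, "post.html", caches))  # prints: 1 (edge cache hit)
```

Repeated requests, the very thing a DDoS depends on, get cheaper for the network rather than more expensive for the target.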

Denial-of-service attacks are possible in NDN. They take the form of flooding requests for resources that are known not to exist; flooding requests for resources that do exist, such as posts that you don't like, won't work. But both local and cooperative detection and mitigation techniques seem likely to be effective, see for example:
The fundamental problems, as in so many areas, are that the thinking is short-term and the incentives are misaligned. Iain Thomson at The Register reports on a parallel example:
A study by the RAND Corporation, published in the Journal of Cybersecurity, looked at the frequency and cost of IT security failures in US businesses and found that the cost of a break-in is much lower than thought – typically around $200,000 per case. With top-shelf security systems costing a lot more than that, not beefing up security looks in some ways like a smart business decision.

Romanosky analyzed 12,000 incident reports and found that typically they only account for 0.4 per cent of a company's annual revenues. That compares to billing fraud, which averages at 5 per cent, or retail shrinkage (ie, shoplifting and insider theft), which accounts for 1.3 per cent of revenues.Of course, if because of the insecurity of IoT devices the Internet becomes unusable, or even merely uninteresting once the bad guys have driven anything interesting away, everyone, from the ISPs to the DDOS mitigation services to the IoT device vendors will be out of business. But right now the money is rolling in and it doesn't cost anything to just kick off the targets of the bad guys wrath. Actually fixing things is someone else's problem.

Update 9/25/16: Cory Doctorow writes:
Meanwhile, Krebs was eventually bailed out by Google's Project Shield, one of Jigsaw's anti-"surveillance, extremist indoctrination, and censorship" tools. That right there is another sign of the times: the attacks launched by state-level actors and those who can muster comparable firepower are no match for Google -- so far.

He quotes a post by Krebs called The Democratization of Censorship:
But what we’re allowing by our inaction is for individual actors to build the instrumentality of tyranny. And to be clear, these weapons can be wielded by anyone — with any motivation — who’s willing to expend a modicum of time and effort to learn the most basic principles of its operation.

Krebs' post is long but important - go read it now, before it goes away again. If it does, the Wayback Machine has it.

District Dispatch: ALA asks presidential candidates about broadband plans

planet code4lib - Mon, 2016-09-26 13:35

Tonight, candidates for president Hillary Clinton and Donald Trump will face one another in the first presidential debate of this election in Hempstead, New York, at Hofstra University, moderated by NBC’s Nightly News anchor Lester Holt. The theme of tonight’s discussion will be “the direction of America, achieving prosperity, and securing America.”

ALA and other groups have addressed an open letter to debate moderators, calling on them to ask candidates about broadband access in their infrastructure plans.

Both candidates have expressed that updating our country’s infrastructure is critical to economic development and America’s global competitiveness. We believe our digital infrastructure—broadband to homes, schools, libraries, and other community anchor institutions and businesses—should be part of that conversation. That’s why today we have joined a number of groups on an open letter to the 2016 presidential debate moderators, calling on them to ask candidates about how they’ll address broadband in their infrastructure plans. The letter outlines our shared position that many Americans lack access to digital infrastructure and calls on the debate moderators to ask the following question of candidates:

“Home broadband internet access has become an essential tool for education, employment, civic engagement, and even healthcare. Yet 34 million people still lack access to affordable high-speed internet. What will you do as president to help expand access to affordable high-speed internet for everyone in America?”

The debate will run from 9:00 to 10:30 p.m. (Eastern time).

The post ALA asks presidential candidates about broadband plans appeared first on District Dispatch.

FOSS4Lib Recent Releases: VuFind - 3.1

planet code4lib - Mon, 2016-09-26 12:56

Last updated September 26, 2016. Created by Demian Katz on September 26, 2016.

Package: VuFind
Release Date: Monday, September 26, 2016

