Yesterday, S. 2893, legislation introduced by Senator Schumer, passed! It authorizes the National Library Service for the Blind and Physically Handicapped (NLS) to extend its service by providing refreshable Braille display devices to NLS users. Previously, NLS could only supply embossed Braille books, which are expensive to produce and costly to ship. The NLS did have the capability of sending Braille files to users, but many could not afford the refreshable Braille display devices. Braille readers, popular with many people with print disabilities, allow users to read Braille from a device connected to a computer keyboard. With so much content now displayed on a computer screen, Braille readers are indispensable. Isn’t technology cool?
Kudos to Senator Schumer for acting on a recommendation from the Government Accountability Office (GAO). In its recent report, “Library Services For Those With Disabilities: Additional Steps Needed to Ease Services and Modernize Technology,” the GAO advised that, to “give NLS the opportunity to provide braille in a modernized format and potentially achieve cost savings, Congress should consider amending the law to allow the agency to use federal funds to provide its users playback equipment for electronic braille files (i.e., refreshable braille devices).”
The VIAF API is undergoing enhancements in an upcoming install scheduled for July 19, 2016.
“Yeas 74, Nays 18”: with those few magic words yesterday, Dr. Carla Hayden was confirmed overwhelmingly by the United States Senate to serve as the nation’s 14th Librarian of Congress. ALA strongly endorsed Dr. Hayden’s nomination, worked hard for her confirmation as an organization, and is proud to have enabled tens of thousands of Americans (librarians and many others alike) to communicate their pride in and support of Dr. Hayden to their Senators.
Today’s magic words are the ones that our parents first acquainted us with – “thank you.” Too often, in the heat of legislative debate and public advocacy, they’re forgotten, but not by librarians and the people who support what (and who) we stand for. Today, keep calling, emailing, and Tweeting the Senators who voted “Yea” to confirm Dr. Hayden (complete list by state below), and no matter where you live, also thank:
- Senate Majority Leader Mitch McConnell for initiating and enabling yesterday’s historic vote;
- Senate Majority Whip John Cornyn for influentially supporting Dr. Hayden with his vote;
- The Rules Committee’s indefatigable staff and leadership, Chairman Roy Blunt and Ranking Member Chuck Schumer; and, by no means least,
- Dr. Hayden’s biggest boosters in the Senate, her home state of Maryland’s Senators Barbara Mikulski and Ben Cardin.
Dr. Hayden’s nomination, Rules Committee vetting, hearing, and ultimate consideration on the floor of the Senate were, appropriately, not partisan. They were done right, done fairly, and done well, and the nation will benefit for a decade from that model process.
Saying “thank you” is appropriate, easy, and it’s the right thing to do. Please, pass it on proudly and loudly – #HaydenISLoC
The post Thank your Senators for the new Librarian of Congress appeared first on District Dispatch.
Great to hear Koha’s Nicole Engard and Brendan Gallagher interviewed on FLOSS Weekly episode 236 talking about the integrated library system. Six (!) years ago Evergreen was on FLOSS Weekly episode 132, with Mike Rylander and the rich radio-friendly baritone voice of Ontario’s own Dan Scott explaining the other free and open ILS written in Perl.
Austin, TX: The peak of summer is also the mid-point in the annual DuraSpace Membership Campaign. Many thanks to those in our community who have become 2016 DuraSpace Members. We are pleased to report that we are within reach of our Membership Campaign goal of $1,250,000. Financial contributions come from our members, registered service providers, and our corporate sponsors.
This afternoon, the Senate voted to confirm Dr. Carla Hayden as the 14th Librarian of Congress! Dr. Hayden will be the first professional librarian to hold the position in over 40 years, as well as the first woman and first African American Librarian of Congress.
You can join our celebration on social media (#HaydenISLoC) and by taking a moment to thank the 74 Senators who voted to confirm Dr. Hayden!
The post Hayden confirmed as the 14th Librarian of Congress appeared first on District Dispatch.
CRRA Update Spring 2016
(December, January, February)
Please see the PDF for the more visually rich version.
Open Knowledge Foundation: Why Open Source Software Matters for Government and Civic Tech – and How to Support It
Today we’re publishing a new research paper looking at whether free/open source software matters for government and civic tech. Matters in the sense that it should have a deep and strategic role in government IT and policy rather than just being a “nice to have” or something “we use when we can”.
As the paper shows, the answer is a strong yes: open source software does matter for government and civic tech — and, conversely, government matters for open source. The paper covers:
- Why open software is especially important for government and civic tech
- Why open software needs special support and treatment by government (and funders)
- What specific actions can be taken to provide this support for open software by government (and funders)
We also discuss how software is different from other things that government traditionally buy or fund. This difference is why government cannot buy software like it buys office furniture or procures the building of bridges — and why buying open matters so much.
The paper is authored by our President and Founder Dr Rufus Pollock.
Read the Full Version of the Paper Online »
Download PDF Version of the paper »
Discussion and Comments »
Why Open Software
We begin with four facts about software and government which form a basis for the conclusions and recommendations that follow.
1. The economics of software: software has high fixed costs and low (zero) marginal costs, and it is also incremental in that new code builds on old. This cost structure creates a fundamental dilemma between finding ways to fund the fixed cost, e.g. by making software proprietary and raising prices, and promoting optimal access by setting the price at the marginal-cost level of zero. In resolving this dilemma, proprietary models favour the funding of fixed costs but at the price of inefficiently high prices and hampered future development, whilst open source models favour efficient pricing and access but face the challenge of funding the fixed costs needed to create high-quality software in the first place. The incremental nature of software sharpens this dilemma and contributes to technological and vendor lock-in.
2. Switching costs are significant: it is (increasingly) costly to switch off a given piece of software once you start using it. This is because you make “asset (software) specific investments”: in learning how to use the software, integrating the software with your systems, extending and customizing the software, etc. These all mean there are often substantial costs associated with switching to an alternative later.
3. The future matters and is difficult to know: software is used for a long time, whether in its original or upgraded form. Knowing the future is therefore especially important in purchasing software. Predictions about the future are especially hard for software because of its complex nature and adaptability, and behavioural biases mean the level of uncertainty and likely future change are underestimated. Together these mean lock-in is under-estimated.
4. Governments are bad at negotiating, especially in this environment, and hence the lock-in problem is especially acute for government. Governments are generally poor decision-makers and bargainers due to the incentives faced by government as a whole and by individuals within government. They are especially weak when having to make trade-offs between the near term and the more distant future. They are even weaker when the future is complex, uncertain, and hard to specify contractually up front. Software procurement has all of these characteristics, making it particularly prone to error compared to other government procurement areas.
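The interaction of facts (1) and (2) can be sketched with a toy switching-cost calculation. This is a minimal illustration; all figures are hypothetical and the function is invented here, not taken from the paper:

```python
# Illustrative toy model: asset-specific investments (fact 2) can make
# switching uneconomic even when a much cheaper alternative exists,
# which is the mechanism behind vendor lock-in.

def should_switch(current_annual_license, alt_annual_license,
                  switching_cost, years_remaining):
    """Switch only if license savings over the remaining planning
    horizon exceed the one-off switching cost (retraining, migration,
    re-integration, re-customization)."""
    savings = (current_annual_license - alt_annual_license) * years_remaining
    return savings > switching_cost

# A buyer saving $60k/year over 5 years still should not switch when
# sunk, asset-specific reinvestment would cost $500k:
print(should_switch(current_annual_license=100_000,
                    alt_annual_license=40_000,
                    switching_cost=500_000,
                    years_remaining=5))  # False: $300k savings < $500k cost
```

The same arithmetic shows why underestimating future change (fact 3) matters: a longer true horizon, or higher future prices from a locked-in vendor, flips the decision after the contract is already signed.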
Note: numbers in brackets e.g. (1) refer to one of the four observations of the previous section.
A. Lock-in to Proprietary Software is a Problem
- Incremental Nature of Software (1) + Switching Costs (2) ⇒ lock-in happens for a software technology and, if it is proprietary, to a vendor.
- Zero Marginal Cost of Software (1) + Uncertainty about the Future in user needs and technologies (3) + Governments are Poor Bargainers (4) ⇒ lock-in to proprietary software is a problem: lock-in has high costs and is under-estimated, especially so for government.
B. Open Source is a Solution
- Lock-in is a problem ⇒ strategies that reduce lock-in are valuable.
- Economics of Software (1) ⇒ open source is a strategy for government (and others) to reduce future lock-in. Why? Because it requires the software provider to make an up-front commitment to making the essential technology available both to users and to other technologists at zero cost, both now and in the future.
- Together, these two points ⇒ open source is a solution, and a specific commitment to open source in government / civic tech is important and valuable.
C. Open Source Needs Support
And Government / Civic Tech is an area where it can be provided effectively
- Software has high fixed costs, and a challenge for open source is to secure sufficient investment to cover these fixed costs (1 – Economics).
- Governments are large spenders on IT and are bureaucratic: they can make rules to pre-commit up front (e.g. in procurement) and can feasibly coordinate, whether at local, national, or even international levels, on buying and investment decisions related to software. ⇒ Government is especially well situated to support open source, and government has the tools to provide systematic support.
- Therefore, government should provide systematic support.
We have established in the previous section that there is a strong basis for promoting open software. This section provides specific strategic and tactical suggestions for how to do that. There are five proposals, summarized here; each is covered in more detail in the main section below. We especially emphasize the potential of the last three options, as they do not require up-front participation by government and can be boot-strapped with philanthropic funding.
1. Recognize and reward open source in IT procurement.
Give open source explicit recognition and beneficial treatment in procurement. Specifically, introduce into government tenders: EITHER an explicit requirement for an open source solution OR a significant points value for open source in the scoring for solutions (more than 30% of the points on offer).
2. Make government IT procurement more agile and lightweight.
Current methodologies follow a “spec and deliver” model in which government attempts to define a full spec up front and then seeks solutions that deliver against it. The spec-and-deliver model greatly diminishes the value of open source – which allows for rapid iteration in the open, and more rapid switching of provider – and implicitly builds in lock-in to the selected provider, whose solution is a black box to the buyer. In addition, whilst theoretically shifting risk to the supplier of the software, given the difficulty of specifying software up front, it really just inflates upfront costs (since the supplier has to price in risk) and sets the scene for complex and cumbersome later negotiations about under-specified elements.
3. Develop a marketing and business development support organization for open source in key markets (e.g. US and Europe).
The organization would be small, at least initially, and focused on three closely related activity areas (in rough order of importance):
- General marketing of open source to government at both local and national level: getting in front of CIOs, explaining open source, demystifying and derisking it, making the case, etc. This is not tied to any specific product or solution.
- Supporting open source businesses, especially those at an early stage, in initial business development activities, including connecting startups to potential customers (“opening the rolodex”) and guiding them through the bureaucracy of government procurement, including discovering and responding to RFPs.
- Promoting commercialization of open source by providing advice, training, and support for open source startups and developers in commercializing and marketing their technology. Open source developers and startups are often strong on technology and weak on marketing and selling their solutions, and this support would help address those deficiencies.
4. Open Offsets: establish target levels of open source financing combined with an “offsets”-style scheme to discharge these obligations.
An “Open Offsets” program would combine three components:
- Establish target commitments for funding open source for participants in the program, who could include government, philanthropists, and the private sector. Targets would be a specific, measurable figure, like 20% of all IT spending or $5m.
- Participants discharge their funding commitment either through direct spending, such as procurement or sponsorship, or via purchase of open source “offsets”. “Offsets” enable organizations to discharge their open source funding obligation in an analogous manner to the way carbon offsets allow groups to deliver on their climate change commitments.
- Administrators of the open offsets fund distribute the funds to relevant open source projects and communities in a transparent manner, likely using some combination of expert advice, community voting, and value generated (the latter based on an estimate of the usage and value created by given pieces of open software).
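The Open Offsets accounting above can be sketched in a few lines. The 20% target figure comes from the text; the function name and the example budget figures are invented for illustration:

```python
# Sketch of the "Open Offsets" obligation: a participant commits to a
# target share of IT spending on open source, discharges it first via
# direct spending (procurement, sponsorship), and must buy offsets for
# any remainder.

def offsets_due(it_spend, target_share, direct_open_spend):
    """Return the offsets a participant still owes after direct
    open source spending is counted against the target."""
    obligation = it_spend * target_share
    return max(0.0, obligation - direct_open_spend)

# A participant with a $25m IT budget and a 20% commitment who spends
# $3m directly on open source owes $2m in offsets:
print(offsets_due(25_000_000, 0.20, 3_000_000))  # 2000000.0
```

A participant whose direct open source spending already meets or exceeds the target would owe nothing, mirroring how carbon offsets supplement, rather than replace, direct emissions reductions.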
5. “Choose Open”: a grass-roots oriented campaign to promote open software in government and government run activities such as education.
“Choose Open” would be modelled on recent initiatives in online political organizing such as “Move On” in the 2004 US Presidential election as well as online initiatives like Avaaz. It would combine central provision of message, materials, and policy with localized community participation to drive change.
Read the Full Version of the Paper Online »
Download PDF Version of the paper »
Discussion and Comments »
In an earlier post I speculated about the plateau in ebook adoption. According to recent statistics from publishers we are now actually seeing a decline in ebook sales after a period of growth (and then the leveling off that I discussed before). Here’s my guess about what’s going on—an educated guess, supported by what I’m hearing from my sources and network.
First, re-read my original post. I believe it captured a significant part of the story. A reminder: when we hear about ebook sales, we hear (mostly) about sales from large publishers, and I have no doubt that ebooks are a troubled part of their sales portfolios. But there are many more ebooks than those reported by the publishers that release their stats, and many ways to acquire them, and thus there’s a good chance that considerable “dark reading” (as I called it) accounts for the disconnect between the surveys that say e-reading is growing and the sales figures (again, from the publishers that reveal these stats) that are declining.
The big story I now perceive is a bifurcation of the market between what used to be called high and low culture. For genre fiction (think sexy vampires) and other genres where there is a lot of self-publishing, readers seem to be moving to cheap (often 99 cent) ebooks from Amazon’s large and growing self-publishing program. Amazon doesn’t release its ebook sales stats, but we know that they already have 65% of the ebook market and through their self-publishing program may reach a disturbing 90% in a few years. Meanwhile, middle- and high-brow books for the most part remain at traditional publishers, where advances still grease the wheels of commerce (and writing).
Other changes I didn’t discuss in my last post are also happening that impact ebook adoption. Audiobook sales rose by an astonishing 40% over the last year, a notable story that likely impacts ebook growth—for the vast majority of those with smartphones, they are substitutes (see also the growth in podcasts). In addition, ebooks have gotten more expensive in the past few years, while print (especially paperback) prices have become more competitive; for many consumers, a simple Econ 101 assessment of pricing accounts for the ebook stall.
I also failed to account in my earlier post for the growing buy-local movement that has impacted many areas of consumption—see vinyl LPs and farm-to-table restaurants—and is, in part, responsible for the turnaround in bookstores—once dying, now revived—an encouraging trend pointed out to me by Oren Teicher, the head of the American Booksellers Association. These bookstores were clobbered by Amazon and large chains late last decade but have recovered as the buy-local movement has strengthened and (more behind the scenes, but just as important) they adopted technology and especially rapid shipping mechanisms that have made them more competitive.
Personally, I continue to read in both print and digitally, from my great local public library and from bookstores, and so I’ll end with an anecdotal observation: there’s still a lot of friction in getting an ebook versus a print book, even though one would think it would be the other way around. Libraries still have poor licensing terms from publishers that treat digital books like physical books that can only be loaned to one person at a time despite the affordances of ebooks; ebooks are often not that much cheaper, if at all, than physical books; and device-dependency and software hassles cause other headaches. And as I noted in my earlier post, there’s still not a killer e-reading device. The Kindle remains (to me and I suspect many others) a clunky device with a poor screen, fonts, etc. In my earlier analysis, I probably also underestimated the inertial positive feeling of physical books for most readers—which I myself feel as a form of consumption that reinforces the benefits of the physical over the digital.
It seems like all of these factors—pricing, friction, audiobooks, localism, and traditional physical advantages—are combining to restrict the ebook market for “respectable” ebooks and to shift them to Amazon for “less respectable” genres. It remains to be seen if this will hold, and I continue to believe that it would be healthy for us to prepare for, and create, a better future with ebooks.
Austin, TX: DuraSpace is pleased to announce the launch of a new DuraCloud web site: http://duracloud.org. The site makes it easy to request a customized DuraCloud quote or to create a free trial account. Simple navigation points users to more information about the service and its four different subscription plans. Please let us know what you think!
We are excited to announce that the second face-to-face Mashcat event in North America will be held on January 24th, 2017, in downtown Atlanta, Georgia, USA. We invite you to save the date. We will be sending out a call for session proposals and opening up registration in the late summer and early fall.
Not sure what Mashcat is? “Mashcat” was originally an event in the UK in 2012 aimed at bringing together people working on the IT systems side of libraries with those working in cataloguing and metadata. Four years later, Mashcat is a loose group of metadata specialists, cataloguers, developers, and anyone else with an interest in how metadata in and around libraries can be created, manipulated, used, and re-used by computers and software. The aim is to work together and bridge the communications gap that has sometimes gotten in the way of building the best tools we possibly can to manage library data. Among our accomplishments in 2016 were holding the first North American face-to-face event in Boston in January and running webinars. If you’re unable to attend a face-to-face meeting, we will be holding at least one more webinar in 2016.
Thanks for considering, and we hope to see you in January.
Register now for the 2016 LITA Forum
Fort Worth, TX
November 17-20, 2016
Join us in Fort Worth, Texas, at the Omni Fort Worth Hotel located in Downtown Fort Worth, for the 2016 LITA Forum, a three-day education and networking event featuring 2 preconferences, 3 keynote sessions, more than 55 concurrent sessions and 25 poster presentations. It’s the 19th annual gathering of the highly regarded LITA Forum for technology-minded information professionals. Meet with your colleagues involved in new and leading edge technologies in the library and information technology field. Registration is limited in order to preserve the important networking advantages of a smaller conference. Attendees take advantage of the informal Friday evening reception, networking dinners and other social opportunities to get to know colleagues and speakers.
Keynote speakers:
- Cecily Walker, Vancouver Public Library
- Waldo Jaquith, U.S. Open Data
- Tara Robertson, @tararobertson
Preconference workshops:
- Librarians can code! A “hands-on” computer programming workshop just for librarians
- Letting the Collections Tell Their Story: Using Tableau for Collection Evaluation
Comments from past attendees:
“Best conference I’ve been to in terms of practical, usable ideas that I can implement at my library.”
“I get so inspired by the presentations and conversations with colleagues who are dealing with the same sorts of issues that I am.”
“After LITA I return to my institution excited to implement solutions I find here.”
“This is always the most informative conference! It inspires me to develop new programs and plan initiatives.”
See you in Fort Worth.
There are probably a hundred reasons why the Senate should immediately vote – and unanimously at that — to confirm Dr. Carla Hayden to serve as the next Librarian of Congress. With the clock ticking down to zero this week on its pre-recess calendar, here are our top ten reasons for the Senate to award her the job now:
- She brought Baltimore’s large historic library system into the 21st Century and she’ll do the same for the Library of Congress.
- The nation’s Library has been led by a library professional three times before in its history; its technology and organizational needs demand a fourth now.
- The Senate Rules Committee approved her without dissent by voice vote.
- Every state and major national library association in America strongly backs her confirmation.
- University of Chicago PhDs don’t come in cereal boxes.
- Breathing new life into the Library of Congress demands Dr. Hayden’s deep understanding of technology, opportunity and community.
- The world’s greatest library deserves to be led by one of Fortune Magazine’s 50 “World’s Greatest Leaders” for 2016.
- Congress and the public it serves need the best possible librarian as the Librarian.
- It’s hard to find anything or anyone else that the Copyright Alliance and Internet Association agree on.
- “Vacancy” is the sign you want to see on a motel marquee at the end of a long drive, not on the Librarian of Congress’ chair at the beginning of a new Congress.
Ask your Senators to confirm Dr. Carla Hayden today – visit the Action Center for additional talking points and pre-written tweets.
The post Ten reasons to confirm Dr. Hayden for Librarian of Congress appeared first on District Dispatch.
UI/UX Assets, which creates design assets and resources for user interface and user experience designers, makes available these really useful flowchart cards designed by Johan Netzler. They capture common design patterns you can use to think through the design and flow of a site. Super handy.
I love this kind of stuff. Here, I pieced together an idea for the homepage of a public library.
128 UX flowchart cards. Perfect tool for creating user journeys and UX flows using Sketch. Not only does it come with hundreds of elements, it is as always extremely well organized. Each card follows a flexible grid and a strict layer structure, creating consistency across all cards. This is a perfect instrument to make your ideas minimal, readable and easy to follow.
UX Flowchart Cards on UI/UX Assets
DPLA: DPLA Welcomes Denise Stephens and Mary Minow to Board, Honors Departing Paul Courant and Laura DeBonis
On July 1, 2016, the Digital Public Library of America had several transitions on its Board of Directors. Two of our original board members rotated off the board at the end of their second terms, and two new board members joined in their stead. We wish to salute the critical roles that Paul Courant and Laura DeBonis played in our young organization, and give a warm welcome to Denise Stephens and Mary Minow as we continue to mature.
Paul Courant
Paul Courant was at the first meeting that conceptualized DPLA in the fall of 2010 at the Radcliffe Institute, and he has been instrumental in DPLA’s inception and growth ever since. Paul led the creation of one of our founding hubs, HathiTrust, and, with his wide-ranging administrative experience as a provost and university librarian at the University of Michigan and his deep economic knowledge, he has been a tremendous resource to DPLA. With HathiTrust, Paul crystallized the importance of nonprofit institutions holding, preserving, and making accessible digital copies of books (and later, other documents). HathiTrust’s model of large-scale collaboration was also an inspiration for DPLA.
Paul has long been a vocal and effective advocate for open access and for sharing the holdings of our cultural heritage institutions as widely as possible with the global public. His shrewd vision of the national and international landscape for libraries was tremendously influential as we formed, launched, and expanded over the last six years. Paul’s very good humor will also be greatly missed.
Paul N. Courant previously served as the University Librarian and Dean of Libraries, Harold T. Shapiro Collegiate Professor of Public Policy, Arthur F. Thurnau Professor, Professor of Economics and Professor of Information at the University of Michigan. From 2002-2005 he served as Provost and Executive Vice-President for Academic Affairs, the chief academic officer and the chief budget officer of the University. He has also served as the Associate Provost for Academic and Budgetary Affairs, Chair of the Department of Economics and Director of the Institute of Public Policy Studies (which is now the Gerald R. Ford School of Public Policy). In 1979 and 1980 he was a Senior Staff Economist at the Council of Economic Advisers. Paul has authored half a dozen books, and over seventy papers covering a broad range of topics in economics and public policy, including tax policy, state and local economic development, gender differences in pay, housing, radon and public health, relationships between economic growth and environmental policy, and university budgeting systems. More recently, his academic work has considered the economics of universities, the economics of libraries and archives, and the effects of new information technologies and other disruptions on scholarship, scholarly publication, and academic libraries. Paul holds a BA in History from Swarthmore College, an MA in Economics from Princeton University, and a PhD in Economics from Princeton University.
Laura DeBonis
Laura DeBonis’s background is very different from Paul’s, but she brought an equal measure of economic and business expertise, and a similar passion to seeing how technology can help the general public. Her early and leading involvement with Google Books, and her ability to establish partnerships across multiple domains, was incredibly helpful to DPLA. Laura’s knowledge of digitization and sense of the power of computational technology—as well as her understanding of where its limits lie and where human activity and collaboration must step in—were enormously useful as we set up DPLA’s distributed national system. In recent years, her savvy understanding of the ebook ecosystem has helped us plan our work in this area, and impacted the Open eBook Initiative. Laura was constantly available to staff, and always ready with well-considered, thoughtful advice. We wish her well and plan to stay in touch.
Laura DeBonis currently works as a consultant to education companies and non-profits. In addition to the DPLA, she also serves on the Public Interest Declassification Board at the National Archives. Laura previously worked at Google in a variety of positions including Director of Library Partnerships for Book Search, Google’s initiative to make all the world’s books discoverable and searchable online. Laura started her career in documentary film and multimedia and in strategy consulting for internet businesses. She is a graduate of Harvard College and has an MBA from Harvard Business School.
Denise Stephens
Denise Stephens, the University Librarian at the University of California, Santa Barbara, begins her first term on the board this month. We have been particularly impressed with the way that Denise has combined a deep understanding of libraries, physical and digital, with a public spirit and sense of community. The recently renovated library at UCSB, with both analog and digital resources oriented toward the many needs of students, teachers, and the public, is itself a model for DPLA. Her many years of experience and passion for libraries and public service will be greatly appreciated at DPLA.
Denise Stephens has served as University Librarian at UCSB since 2011. Her background includes a broad range of leadership and management roles related to the intersection of evolving information resource strategies and scholarship in the academic environment. She has actively participated in implementing digital library initiatives and service programs in research university libraries for 20 years. In addition to her current position, she has held campus-wide library and information technology executive leadership roles at Syracuse University (as Associate and Acting University Librarian) and the University of Kansas, where she served as Vice Provost and Chief Information Officer. Early in her career, she helped to launch transformative spatial data services among emerging digital library programs at the University of Virginia. Ms. Stephens has also contributed to efforts promoting transformed scholarly communications and persistent accessibility of information resources as a member of the BioOne Board of Directors and the Depository Library Council of the Federal Depository Library Program. Ms. Stephens has a BA in Political Science and a Master of Library and Information Studies from the University of Oklahoma.
Mary Minow
Mary Minow is one of the foremost legal scholars on issues that impact libraries, including copyright and fair use. She has been very active in the library community, serving on boards and committees that span a range of interests and communities. Her thoughtful discourses on the nature and role of libraries, the importance of access to culture, and the need for intellectual freedom fit beautifully into our work, and we look forward to her inspiring words and advice. She has worked as both a librarian and a lawyer, and will help us bridge these worlds as well.
Mary Minow is an advanced leadership initiative fellow at Harvard University and is a Presidential Appointee to the board of the Institute of Museum and Library Services. She has also worked as a consultant with libraries in California and across the country on copyright, privacy, free speech and related legal issues. She most recently was counsel to Califa, a consortium of California libraries that set up its own statewide ebook lending service. Previously she was the Follett Chair at Dominican University’s School of Library and Information Science. Current and past board memberships include the Electronic Privacy Information Center, the Freedom to Read Foundation and the California Association of Trustees and Commissioners (Past Chair). She is the recipient of the first Zoia Horn Intellectual Freedom award and also received a WISE (Web-based Information Science Education) award for excellence in online education when she taught part time at San Jose State University.
Inc.’s John Brandon recently wrote about The Slow, Sad, and Ultimately Predictable Decline of 3D Printing. Uh, not so fast.
3D Printing is just getting started. For libraries whose adopted mission is to introduce people to emerging technologies, this is a fantastic opportunity to do so. But it has to be done right.

Another dead end?
Brandon cites a few reasons for his pessimism:
- 3D printed objects are low quality and the printers are finicky
- 3D printing growth is falling behind initial estimates
- people in manufacturing are not impressed
- and the costs are too high
I won’t get into all that’s wrong with this analysis; most of it is either incorrect or, at worst, a temporary problem typical of any new technology. Instead, I’d like to discuss this in the library maker context. And in fact, you can apply these ideas to any tech project.

How to make failure a win—no matter what
Libraries are quick to jump on tech. Remember those QR Codes that would revolutionize mobile access? Did your library consider a Second Life branch? How about those Chromebooks!
Inevitably, these experiments are going to fail. But that’s okay.
As this blog often suggests, failure is a win when it teaches you something. Experimenting is the first step in the process of discovery. And that’s really what all of these kinds of projects need to be.
In the case of a 3D Printing project at your library, it’s important to keep this notion front and center. A 3D Printing pilot with the goal of introducing the public to the technology can be successful if people simply try it out. That seems easy enough. But to be really successful, even this kind of basic 3D Printing project needs to have a fair amount of up-front planning attached to it.
Chicago Public Library created a successful Maker Lab. Their program was pretty simple: hold regular classes showing people how to use the 3D printers, then allow those who completed the introductory course to use the printers during open studio lab times. When I tried this out at CPL, it was quite difficult to get a spot in a class because of the program’s popularity. The grant-funded project was so successful, based on the number of attendees, that it was extended and continues to this day.
As a grant-funded endeavor, CPL likely wrote out the specifics before any money was handed over. But even an internally funded project should do this. Keep the goals simple and clear so expectations on the front line match those up the chain of command. Figure out what your measurements of success are before you even purchase the first printer. Be realistic. Always document everything. And return to that documentation throughout the project’s timeline.

Taking it to the next level
San Diego Public Library offers an example of a maker project taken to the next level. Uyen Tran saw an opportunity to merge startup seminars with the maker tools at her library. She brought aspiring entrepreneurs into the library for a Startup Weekend event where budding innovators learned how the library could be a resource for them as they launched their companies. 3D printers were part of this successful program.
It’s important to note that Uyen already had the maker lab in place before she launched this project. And it would be risky for a library to skip the establishment of a rudimentary 3D printer program before trying for this more ambitious program.
But it could be done if the library were well organized, with solid project managers and deep roots in its target community. That’s a tall order to fill.

What’s the worst thing that could go wrong?
The worst thing that could go wrong is doubling down on failure: repeating one failed project after another without changing the flawed approach behind it.
I’d also add that libraries are often out ahead of the public on these technologies, so dead ends are inevitable. To address this, I would also add one more tactic to your tech projects: listening.
The public has lots of concerns about a variety of things. If you ask them, they’ll tell you all about them. Not all of their concerns are directly related to libraries, but we can often help anyway. We have permission to do so. People trust us. It’s a great position to be in.
But we have to ask them to tell us what’s on their mind. We have to listen. And then we need to think creatively.
Listening and thinking outside the box was how San Diego took their 3D printers to the next level.

The Long Future of 3D Printing
The Wright brothers’ first flight covered only 120 feet. Two years later, they flew 24 miles. These initial attempts looked nothing like the jet age, and yet the technology of flight was born from those humble experiments.
Already, 3D printing is being adopted in multiple industries. Artists are using it to prototype their designs. Astronauts are using it to print parts aboard the International Space Station. Bio-engineers are now looking at printing stem-cell structures to replace organs and bones. We’re decades away from the jet age of 3D printing, but this tech is here to stay.
John Brandon’s read is incorrect simply because he’s looking at the current state and missing the long-term promise. When he asks a Ford engineer for his take on 3D printing in the assembly process, he gets a smirk; a legacy assembly line is hardly a hotbed of innovation. What kind of reaction would he have gotten from an engineer at Tesla? At Apple? Fundamentally, he’s approaching 3D printers from the wrong perspective, and that’s why the technology looks doomed to him.
Libraries should not make this mistake. The world is changing ever more quickly and the public needs us to help them navigate the new frontier. We need to do this methodically, with careful planning and a good dose of optimism.
Starting in 2012, the British Library replaced its interlibrary loan service with a licensed document delivery agreement with the International Association of Scientific, Technical & Medical Publishers (STM) and the Publishers Association. Perhaps to improve turnaround time and provide better service, perhaps to save money by outsourcing, or perhaps out of fear of infringement, the British Library agreed to switch to the International Non-Commercial Document Supply (INCD) service. Its previous interlibrary loan service was extremely popular and apparently lawful, because UK copyright law has an interlibrary loan exception similar to the one in US copyright law: libraries may send journal articles to other libraries to meet the request of a user. But did it cover international ILL?
The abandoned interlibrary loan service provided resources to 59 countries whose libraries did not hold the materials requested by their faculty, researchers, and students. Because the British Library holds one of the largest research collections in the world, its interlibrary loan service was naturally heavily relied upon. After the move to the INCD service, however, the popular interlibrary loan service deteriorated in spectacular fashion, as detailed by Teresa Hackett of Electronic Information for Libraries (EIFL). In her blog post entitled “Licensed to Fail,” Hackett describes the swift demise of the INCD service and, through a freedom of information request, has the data to bolster her argument. You must read it, although you likely will not be surprised.
Back in 2012, when announcing the INCD partnership, Michael Mabe, CEO of the STM, said that “the British Library framework license (INCD) will give publishers, including our members, contractual control over the international cross-border delivery of copies from their material via an established and respected document supply service. It will also allow the British Library to improve the service, and delivery times, available to its authorized users.” Alas, the British Library cancelled the service this month. It did not fit the bill, dramatically reducing access to research materials (while delivering on publisher contractual control).
One wonders. Maybe this explains the popularity of Sci-Hub.
For the past three weeks, I’ve been doing a lot of work on MarcEdit. These initial changes affect just the Windows and Linux versions of MarcEdit; I’ll be taking some time tomorrow and Wednesday to update the Mac version. The current changes are as follows:
* Enhancement: Language files have been updated
* Enhancement: Command-line tool: -task option added to support tasks being run via the command-line.
* Enhancement: Command-line tool: -clean and -validate options updated to support structure validation.
* Enhancement: Alma integration: Updated version numbers and cleaned up some windowing in the initial release.
* Enhancement: Small update to the validation rules file.
* Enhancement: Update to the linked data rules file around music headings processing.
* Enhancement: Linked Data Platform: collections information has been moved into the configuration file. This will allow local indexes to be added so long as they support a JSON return.
* Enhancement: Merge Records — 001 matching now looks at the 035 and included OCLC numbers by default.
* Enhancement: MarcEngine: Updated the engine to accommodate invalid data in the LDR (leader).
* Enhancement: MARC SQL Explorer — added an option to allow the MySQL database to be created as UTF-8.
* Enhancement: Handful of odd UI changes.
You can get the update from the downloads page (http://marcedit.reeset.net/downloads) or via the automated update tools.
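The Merge Records change above (001 matching now falling back to 035/OCLC numbers) depends on normalizing the several forms an OCLC number can take in an 035 $a, such as “(OCoLC)ocm00012345”. Here is a minimal sketch of that kind of normalization in Python; `oclc_number` is a hypothetical helper for illustration, not MarcEdit’s own code:

```python
import re

# OCLC numbers in 035 $a carry an "(OCoLC)" prefix, an optional
# ocm/ocn/on marker, and often leading zeros; capture just the digits.
OCLC_PAT = re.compile(r"\(OCoLC\)\s*(?:ocm|ocn|on)?0*(\d+)")

def oclc_number(field_035a: str):
    """Return the normalized OCLC number from an 035 $a, or None."""
    m = OCLC_PAT.search(field_035a)
    return m.group(1) if m else None
```

With this in place, two records whose 001 fields differ can still be matched when their normalized 035 OCLC numbers agree, e.g. `oclc_number("(OCoLC)ocm00012345")` and `oclc_number("(OCoLC)12345")` both yield `"12345"`.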
MarcEdit’s command-line tool has always had the ability to run validation tasks against the MarcEdit rules file. However, the program hasn’t included access to the cleaning functions of the validator. As of the last update, this has changed. If the -validate command is invoked without a rules file defined, the program will validate the structure of the data. If the -clean option is passed, the program will remove invalid structural data from the file.
Here’s an example of the command:
>> cmarcedit.exe -s "C:\Users\rees\Desktop\CLA_UCB 2016\Data File\sample data\bad_sample_records.mrc" -validate
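To illustrate what structure validation means for binary MARC: in ISO 2709, leader bytes 0–4 declare the record’s total length (including the terminator), and every record ends with the terminator byte 0x1D. The toy sketch below shows validate/clean-style logic in Python; it is illustrative only and not MarcEdit’s implementation:

```python
RT = b"\x1d"  # ISO 2709 record terminator

def is_structurally_valid(record: bytes) -> bool:
    """A record is structurally valid if the length declared in
    leader bytes 0-4 matches the record's actual size."""
    if len(record) < 24:           # the leader alone is 24 bytes
        return False
    try:
        declared = int(record[:5])  # leader bytes 0-4: record length
    except ValueError:              # non-numeric length field
        return False
    return declared == len(record)

def clean(data: bytes) -> bytes:
    """Keep only structurally valid records (like the -clean option)."""
    records = [r + RT for r in data.split(RT) if r]
    return b"".join(r for r in records if is_structurally_valid(r))
```

A file of concatenated records passed through `clean` comes back with any record whose declared and actual lengths disagree removed, which is the spirit of what the -clean option does to invalid structural data.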