
Feed aggregator

Terry Reese: MarcEdit Update

planet code4lib - Tue, 2016-05-24 01:39

Yesterday, I posted a significant update to the Windows/Linux builds and a maintenance update to the Mac build that includes a lot of prep work to get it ready for a number of changes that I hope to complete this week. Unfortunately, I’ve been doing a lot of travelling, which means that my access to my Mac setup has been pretty limited, and I didn’t want to take another week getting everything synched together.

So what are the specific changes:

ILS Integrations
I’ve been spending a lot of time over the past three weeks head down working on ILS integrations. Right now, I’m managing two ILS integration scenarios – one is with Alma and their API. I’m probably 80% finished with that work. All the code is written; I’m just not getting the expected responses back from their bibliographic update API. Once I sort out that issue, I’ll integrate this change into MarcEdit and provide a YouTube video demonstrating the functionality.

The other ILS integration I’ve been accommodating works with MarcEdit’s MARC SQL Explorer and the internal database structure. This work builds on some work being done with the Validate Headings tool to close the authority control loop. I’ll likely be posting more about that later this week, as I currently have a couple of libraries testing this functionality to make sure I’ve not missed anything. Once they give me the thumbs up, this will make its way into the MarcEditor as well.

But as part of this work, I needed to create a way for users to edit and search the local database structure in a more friendly way. So, leveraging the ILS platform, I’ve included the ability for users to work with the local database format directly within the MarcEditor. You can see how this works here: Integrating the MarcEditor with a local SQL store. I’m not sure what the ideal use case is for this functionality – but over the past couple of weeks, it had been requested by a couple of power users currently using the MARC SQL Explorer for some data edits, but hoping for an easier-to-use interface. This work will be integrated into the Mac MarcEdit version at the end of this week. All the prep work (window/control development) has been completed. At this point, it’s just migrating the code so that it works within the Mac’s Objective-C codebase.

Edit Shortcuts
I created two new edit shortcuts in the MarcEditor. The first, Find Records With Duplicate Tags, was created to help users look for records that may have multiple tags or a tag/subfield combination within a set of records. This is work that can be done in the Extract Selected Records tool, but it requires a bit of trickery and knowledge of how MarcEdit formats data.

How does this work? Say you wanted to know which records had multiple call number (050) fields in a record. You would select this option, enter 050 in the prompt, and the tool would create a jump list showing all the records that met your criteria.
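To make the idea concrete, here is a minimal sketch (my own illustration, not MarcEdit’s actual code) of finding records with a repeated tag in MarcEdit’s mnemonic .mrk format, where each field begins a line with =TAG and records are separated by blank lines; the file name is hypothetical:

    from collections import Counter

    def records_with_duplicate_tag(mrk_path, tag="050"):
        """Yield 0-based positions of records containing `tag` more than once."""
        def has_duplicate(record):
            counts = Counter(line[1:4] for line in record if line.startswith("="))
            return counts[tag] > 1

        with open(mrk_path, encoding="utf-8") as f:
            record, rec_no = [], 0
            for line in f:
                if line.strip():
                    record.append(line)
                elif record:  # a blank line ends the current record
                    if has_duplicate(record):
                        yield rec_no
                    rec_no, record = rec_no + 1, []
            if record and has_duplicate(record):  # file may lack a trailing blank line
                yield rec_no

    print(list(records_with_duplicate_tag("records.mrk", tag="050")))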

Convert To Decimal Degrees
The second Edit Shortcut function is the first math function (I’ll be adding two more, specifically around finding records with dates greater than or less than a specific value); it targets the conversion of degrees/minutes/seconds to decimal degrees. The process has been created to be MARC agnostic, so users can specify the field and subfields to process. To run this function, select it from the Edit Shortcuts menu as demonstrated in the screenshot below:

When selected, you will get the following prompt:

This documents the format for defining the field/subfields to be processed. Please note, it is important to define all four potential values for conversion – even if they are not used within the record set.

Using this function, you can now convert a value like:
=034  1\$aa$b1450000$dW1250000$eW1163500$fN0461500$gN0420000
into:
=034  1\$aa$b1450000$d+125.0000$e+116.5800$f+046.2500$g+042.0000

This function should allow users to transition their cartographic data to a format that is much more friendly to geographic interpretation if desired.
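For the curious, the underlying math is straightforward. Below is a rough sketch of the standard conversion (my own illustration, not MarcEdit’s code; MarcEdit’s own sign and padding conventions, visible in the sample output above, may differ): an 034 coordinate such as W1250000 is a hemisphere letter followed by DDDMMSS, and converts as degrees + minutes/60 + seconds/3600, conventionally negated for west and south:

    def dms_to_decimal(value):
        """Convert an 034-style coordinate like 'W1250000' (hemisphere + DDDMMSS)
        to signed decimal degrees."""
        hemisphere, digits = value[0].upper(), value[1:]
        degrees = int(digits[:-4])
        minutes = int(digits[-4:-2])
        seconds = int(digits[-2:])
        decimal = degrees + minutes / 60 + seconds / 3600
        return -decimal if hemisphere in ("W", "S") else decimal

    print(dms_to_decimal("W1250000"))  # -125.0
    print(dms_to_decimal("N0461500"))  # 46.25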

Bug Fixes:
This update also addressed a bug in the Build New Field parser. If you have multiple arguments side-by-side within the same field grouping (e.g., {100$a}{100$b}{100$c}), the parser could become confused. This has been corrected.
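As a toy illustration of this class of parsing bug (purely hypothetical; this is not MarcEdit’s parser), a greedy pattern swallows adjacent placeholders as one token, while a non-greedy pattern separates them:

    import re

    template = "{100$a}{100$b}{100$c}"

    # Greedy: '.+' runs to the last '}', producing one bogus token.
    print(re.findall(r"\{(.+)\}", template))   # ['100$a}{100$b}{100$c']

    # Non-greedy: each placeholder is matched on its own.
    print(re.findall(r"\{(.+?)\}", template))  # ['100$a', '100$b', '100$c']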

Also included is an update to the linked data rules file, updating the 7xx fields to include the $t in processing, and an update to the UNIMARC translation to include a 1:1 translation for 9xx data.

Over the next week, I hope to complete the Alma integration, but I will focus my free development time on getting the Mac version synched with these changes.


DuraSpace News: Sandy Payette to Speak at 2016 VIVO Conference

planet code4lib - Tue, 2016-05-24 00:00

From the VIVO 2016 Planning Committee

Register today to attend the 2016 VIVO conference and hear from leading experts within our community.

DuraSpace News: Find out What’s Inside Hydra-in-a-Box at Open Repositories 2016: PCDM, Design, Emerging Architecture, Repository Tooling

planet code4lib - Tue, 2016-05-24 00:00

Austin, TX  It’s only three weeks away! If you will attend the 11th Annual International Conference on Open Repositories (#OR2016), here are the sessions that will be of interest if you want to learn more about the Hydra-in-a-Box project.

Workshop: Modeling your Repository Objects with the Portland Common Data Model (PCDM)

Monday, June 13, 1:30-3:30 PM; 4:00-6:00 PM

Eric Hellman: 97% of Research Library Searches Leak Privacy... and Other Disappointing Statistics.

planet code4lib - Mon, 2016-05-23 20:18

...But first, some good news. Among the 123 members of the Association of Research Libraries, there are four libraries with almost secure search services that don't send clickstream data to Amazon, Google, or any advertising network. Let's now sing the praises of libraries at Southern Illinois University, University of Louisville, University of Maryland, and University of New Mexico for their commendable attention to the privacy of their users. And it's no fault of their own that they're not fully secure. SIU fails to earn a green lock badge because of mixed content issues in the CARLI service; while Louisville, Maryland and New Mexico miss out on green locks because of the weak cipher suite used by OCLC on their Worldcat Local installations. These are relatively minor issues that are likely to get addressed without much drama.

Over the weekend, I decided to try to quantify the extent of privacy leakage in public-facing library services by studying the search services of the 123 ARL libraries. These are the best funded and most prestigious libraries in North America, and we should expect them to positively represent libraries. I went to each library's on-line search facility and did a search for a book whose title might suggest to an advertiser that I might be pregnant. (I'm not!) I checked to see whether the default search linked to by the library's home page (as listed on the ARL website) was delivered over a secure connection (HTTPS). I checked for privacy leakage of referer headers from cover images by using Chrome developer tools (the sources tab). I used Ghostery to see if the library's online search used Google Analytics or not. I also noted whether advertising network "web beacons" were placed by the search session.
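For anyone who wants to replicate part of this survey, here is a rough sketch (my own illustration, not the method described above, which used browser tools and Ghostery) of checking whether a search page is served over HTTPS after redirects and whether its HTML references well-known tracker hosts; the URL is hypothetical:

    import re
    import urllib.request

    TRACKER_HOSTS = ["google-analytics.com", "doubleclick.net",
                     "facebook.net", "addthis.com", "sharethis.com"]

    def check_search_page(url):
        """Return (served_over_https, tracker_hosts_found_in_page)."""
        with urllib.request.urlopen(url) as resp:
            html = resp.read().decode("utf-8", errors="replace")
            final_url = resp.geturl()  # the URL after any redirects
        secure = final_url.startswith("https://")
        found = [h for h in TRACKER_HOSTS if re.search(re.escape(h), html, re.I)]
        return secure, found

    print(check_search_page("https://library.example.edu/search?q=pregnancy+guide"))

A check like this misses subtler problems the survey caught, such as mixed content, weak cipher suites, and referer leakage from embedded cover images.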

72% of the ARL libraries let Google look over the shoulder of every click by every user, by virtue of the pervasive use of Google Analytics. Given the commitment to reader privacy embodied by the American Library Association's code of ethics, I'm surprised this is not more controversial. ALA even sponsors workshops on "Getting Started with Google Analytics". To paraphrase privacy advocate and educator Dorothea Salo, the code of ethics does not say:
We protect each library user’s right to privacy and confidentiality with respect to information sought or received and resources consulted, borrowed, acquired or transmitted, except for Google Analytics.

While it’s true that Google has a huge stake in maintaining the trust of users in their handling of personal information, and people seem to trust Google with their most intimate secrets, it’s also true that Google’s privacy policy puts almost no constraints on what Google (itself) can do with the information they collect. They offer strong commitments not to share personally identifiable information with other entities, but they are free to keep and use personally identifiable information. Google can associate Analytics-tracked library searches with personally identifiable information for any user that has a Google account; libraries cannot be under the illusion that they are uninvolved with this data collection if they benefit from Google Analytics. (Full disclosure: many of the web sites I administer also use Google Analytics.)

80% of the ARL libraries provide their default discovery tools to users without the benefit of a secure connection. This means that any network provider in the path between the library and the user can read and alter the query, and the results returned to the user. It also means that when a user accesses the library over public wifi, such as in a coffee shop, the user's clicks are available for everyone else in the coffee shop to look at, and potentially to tamper with. (The Digital Library Privacy Pledge is not having the effect we had hoped for, at least not yet.)

28% of ARL libraries enrich their catalog displays with cover images sourced from Amazon. Because of privacy leakage in referer headers, this means that a user’s searches for library books are available for use by Amazon when Amazon wants to sell that user something. It’s not clear whether libraries realize this is happening, or whether they just don’t realize that their catalog enrichment service uses cover images sourced by Amazon.

13% of ARL libraries help advertisers (other than Google) target their ads by allowing web beacons to be placed on their catalog web pages. Whether the beacons are from Facebook, DoubleClick, AddThis or Sharethis, advertisers track individual users, often in a personally identifiable way. Searches on these library catalogs are available to the ad networks to maximize the value of advertising placed throughout their networks.

Much of the privacy leakage I found in my survey occurs beyond the control of librarians. There are IT departments, vendor-provided services, and incumbent bureaucracies involved. Important library services appear to be unavailable in secure versions. But specific, serious privacy leakage problems that I’ve discussed with product managers and CTOs of library automation vendors have gone unfixed for more than a year. I’m getting tired of it.

The results of my quick survey for each of the 123 ARL libraries are available as a Google Sheet. There are bound to be a few errors, and I'd love to be able to make changes as privacy leaks get plugged and websites become secure, so feel free to leave a comment.

LITA: LITA announces the Top Tech Trends panel at ALA Annual 2016

planet code4lib - Mon, 2016-05-23 19:34

Kicking off the celebration of LITA’s 50th year, the Top Technology Trends Committee announces the panel for its highly popular session at 2016 ALA Annual in Orlando, FL.

Top Tech Trends
starts Sunday June 26, 2016, 1:00 pm – 2:30 pm, in the
Orange County Convention Center, Room W109B
and kicks off Sunday Afternoon with LITA.

This program features the ongoing roundtable discussion about trends and advances in library technology by a panel of LITA technology experts. The panelists will describe changes and advances in technology that they see having an impact on the library world, and suggest what libraries might do to take advantage of these trends. This year’s panelist lineup is:

  • Maurice Coleman, Session Moderator, Technical Trainer, Harford County Public Library, @baldgeekinmd
  • Blake Carver, Systems Administrator, LYRASIS, @blakesterz
  • Lauren Comito, Job and Business Academy Manager, Queens Library, @librariancraftr
  • Laura Costello, Head of Research & Emerging Technologies, Stony Brook University, @lacreads
  • Carolyn Coulter, Director, PrairieCat Library Consortium, Reaching Across Illinois Library System (RAILS), @ccoulter
  • Nick Grove, Digital Services Librarian, Meridian Library District – unBound, @nickgrove15

Check out the Top Tech Trends web site for more information and panelist biographies.


Followed by the LITA Awards Presentation & LITA President’s Program with Dr. Safiya Noble
presenting: Toward an Ethic of Social Justice in Information
at 3:00 pm – 4:00 pm, in the same location

Dr. Noble is an Assistant Professor in the Department of Information Studies in the Graduate School of Education and Information Studies at UCLA. She conducts research in socio-cultural informatics, including feminist, historical and political-economic perspectives on computing platforms and software in the public interest. Her research is at the intersection of culture and technology in the design and use of applications on the Internet.

Concluding with the LITA Happy Hour
from 5:30 pm – 8:00 pm
location to be determined

This year marks a special LITA Happy Hour as we kick off the celebration of LITA’s 50th anniversary. Make sure you join the LITA Membership Development Committee and LITA members from around the country for networking, good cheer, and great fun! Expect lively conversation and excellent drinks; cash bar. Help us cheer for 50 years of library technology.


Open Knowledge Foundation: Open Knowledge International – our new name!

planet code4lib - Mon, 2016-05-23 15:55

Notice something a little different? We have had a change of name!

As of today, we officially move from being called “Open Knowledge” to “Open Knowledge International (OKI)”.

“Open Knowledge International” is the name by which the community groups have referred to us for a couple of years, conveying our role in supporting the groups around the world, as well as our role within the broader open knowledge movement globally. We are excited to announce our new name that reflects this.

Open Knowledge International is registered in the UK, and this has sometimes led to assumptions that we operate in and for the benefit of this region. However, the UK is no more of a priority to Open Knowledge International than other areas of the world; in fact, we want to look more closely at ways we can be engaged at a global level, where efforts to push open knowledge are already happening and where we can make a difference by joining alongside the people making it happen. This is evident in our efforts to support the associated Open Knowledge Network, with a presence in more than 40 countries and cross-border Working Groups, as well as our support of international projects, such as the Global Open Data Index, that both blend open knowledge expertise and draw upon the global open data community. Finally, we are an international team, with staff based in nearly every region, collaborating virtually to promote openness online and on the ground.

By formalising Open Knowledge International as our name beyond the community groups associated with us and to the broader open knowledge movement, we are reflecting the direction we are striving to undertake, now and increasingly so in the future. We are grateful to have such a strong community behind us as we undertake a name change that better reflects our priorities and as we continue to seek new opportunities on a global scale.

We are also planning to transition from the domain name for brand consistency and will begin that transition in the coming months. If you would like to discuss this change of name, and what it means, please join in on our forum –

For revised logos please see and please contact if you have any questions about the use of this brand.

David Rosenthal: Improving e-Journal Ingest (among other things)

planet code4lib - Mon, 2016-05-23 15:00
Herbert Van de Sompel, Michael Nelson and I have a new paper entitled Web Infrastructure to Support e-Journal Preservation (and More) that:
  • describes the ways archives ingest e-journal articles,
  • shows the areas in which these processes use heuristics, which makes them fallible and expensive to maintain,
  • and shows how the use of DOIs, ResourceSync, and Herbert and Michael's "Signposting" proposal could greatly improve these and other processes that need to access e-journal content.
It concludes with a set of recommendations for CrossRef and the e-journal publishers that would be easy to adopt and would not merely improve these processes but also help remedy the deficiencies in the way DOIs are used in practice, deficiencies identified in Martin Klein et al.'s PLoS One paper Scholarly Context Not Found: One in Five Articles Suffers from Reference Rot, and in Persistent URIs Must Be Used To Be Persistent, presented by Herbert and co-authors to the 25th International World Wide Web Conference.
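To give a flavor of what this looks like on the wire (an illustrative sketch under my own assumptions, not code from the paper), a Signposting-aware client can resolve a DOI and inspect the HTTP Link header of the landing page for typed links such as rel="cite-as" or rel="item"; the DOI below is a placeholder:

    import urllib.request

    def typed_links(doi):
        """Resolve a DOI and return the final landing page's raw HTTP Link
        header, where Signposting exposes its typed links (if the publisher
        emits them)."""
        req = urllib.request.Request("https://doi.org/" + doi, method="HEAD")
        with urllib.request.urlopen(req) as resp:  # follows the DOI redirect chain
            return resp.headers.get("Link")

    print(typed_links("10.1234/example-doi"))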

LITA: Getting your color on: maybe there’s some truth to the trend

planet code4lib - Mon, 2016-05-23 14:00

Coloring was never my thing; even as a young child, the amount of decision-making required in coloring was actually stressful to me. Hence my skepticism of this zen adult coloring trend: how could something so stressful for me be considered a thing of “zen”? I purchased a book and selected coloring tools about a year ago, coloring bits and pieces here and there but not really getting it. Until now.

While reading an article about the psychology behind adult coloring, I found this quote to be exceptionally interesting:

The action involves both logic, by which we color forms, and creativity, when mixing and matching colors. This incorporates the areas of the cerebral cortex involved in vision and fine motor skills [coordination necessary to make small, precise movements]. The relaxation that it provides lowers the activity of the amygdala, a basic part of our brain involved in controlling emotion that is affected by stress. -Gloria Martinez Ayala [quoted in Coloring Isn’t Just For Kids. It Can Actually Help Adults Combat Stress]

A page, colored by Whitni Watkins, from Color Me Stress Free by Lacy Mucklow and Angela Porter

As I was coloring this particular piece [pictured to the left], I started seeing the connection the micro process of coloring has to the macro process of managing a library and/or team building. Each coloring piece has individual parts that contribute to forming the outline of the full work of art. But it goes deeper than that.

For instance, how you color and organize the individual parts can determine how beautiful or harmonious the picture will be. You have so many different color options to choose from to incorporate into your picture, and some will work better than others. For example, did you know that in color theory, orange and blue is a perfect color combination? “According to color theory, harmonious color combinations use any two colors opposite each other on the color wheel.” [7] But that the combination of orange, blue and yellow is not very harmonious?

Our lack of knowledge is a significant hindrance to creating greatness; knowing your options while coloring is incredibly important. Your color selection will determine what experience one has when viewing the picture: bland, chaotic or pleasing, each part working together, contributing to the bigger picture. “Observing the effects colors have on each other is the starting point for understanding the relativity of color. The relationship of values, saturations and the warmth or coolness of respective hues can cause noticeable differences in our perception of color.” [6] Color combinations that may seem unfitting to you may actually complement each other.

Note that some colors will be used more frequently and have a greater presence in the final product due to the qualities those colors hold, but remember that even the parts with only a small presence are crucial to bringing the picture together in the end.

“Be sure to include those who are usually left out of such acknowledgments, such as the receptionist who handled the flood of calls after a successful public relations effort or the information- technology people who installed the complex software you used.”[2]

There may be other times when you don’t use a certain color as much as it should and could have been used. The picture ends up fully colored and completed, but not nearly as beautiful (harmonious) as it could have been. When in the coloring process, ask yourself often, “What else do we need to consider here?” By doing so, “you allow perspectives not yet considered to be put on the table and evaluated.” [2] Constant evaluation of your process will lead to a better final piece.

While coloring, I also noticed that I color individual portions in a similar manner. I color triangles and squares by outlining and shading inwards; I color circular shapes in a circular motion, shading outwards. While coloring, we each find our way to be the most efficient but contained (within the lines) while simultaneously coordinating well with the other parts. It is important to note that the way you found to be efficient in one area may not work in another; you need to adapt, be flexible, and be willing to try other ways. Imagine coloring a circle the way you color a square or a triangle. You can take as many shortcuts as you want to get the job done faster, but you may regret them in the end. Cut carefully.

Remember while coloring: Be flexible. Be adaptable. Be imperturbable.

You can color however you see fit. You can choose which colors you want; the project will get done. You can be sure there will be moments of chaos, and there will be moments that lack innovation. Experiment, try new things, and the more you color the better you’ll get. However, coloring isn’t for everyone, and that’s okay.

Now, go back and read it again, this time substituting the word manage for color.

Maybe there is something to be said about this trend of the adult coloring book. 

1. Coloring Isn’t Just For Kids. It Can Actually Help Adults Combat Stress
2. Twelve Ways to Build an Effective Team
3. COLOURlovers: History Of The Color Wheel
4. Smashing Magazine: Color Theory for Designers, Part 1: The Meaning of Color
5. Some Color History
6. Color Matters: Basic Color Theory
7. lifehacker: Learn the Basics of Color Theory to Know What Looks Good
8. lifehacker: Color Psychology Chart
9. Why Flexible and Adaptive Leadership is Essential

DuraSpace News: VIVO Updates for May 22–VIVO Needs Your Financial Support

planet code4lib - Mon, 2016-05-23 00:00

From Mike Conlon, VIVO project director

DuraSpace News: VIVO Updates for May 15–Conference Posters due May 23, Remember to Register!

planet code4lib - Mon, 2016-05-23 00:00

From Mike Conlon, VIVO project director

Poster deadline extended.  There's still time for you to submit a poster to VIVO 2016!  This is a great opportunity for you to share your work with the VIVO community. The deadline for poster submissions has been extended to May 23.  See

Patrick Hochstenbach: Sktchy portrait

planet code4lib - Sat, 2016-05-21 07:58
Filed under: portaits, Sketchbook Tagged: fountainpen, illustration, ink, Photoshop, portrait, sktchy

District Dispatch: ALA briefs congress on critical impact of rural broadband access

planet code4lib - Fri, 2016-05-20 19:41

Launched in February of this year, the bipartisan Congressional Rural Broadband Caucus was founded “to facilitate discussion, educate Members of Congress and develop policy solutions to close the digital divide in rural America.” At its most recent meeting, Marijke Visser of the ALA’s Office for Information Technology Policy (OITP) and co-panelists from the public and private sectors briefed the Caucus, congressional staff and a general audience at a public session entitled “Strengthening Rural Economics through Broadband Deployment.”

Her presentation highlighted that libraries currently play a pivotal role in providing broadband access in rural communities, addressing the “E’s of Libraries®” across the country: employment and entrepreneurship, education, individual empowerment, and civic engagement. Noting broadly that “Libraries strengthen local economies through supporting small business development and entrepreneurship,” Marijke went on to provide specific examples of how libraries have helped small businesses develop business plans, conduct market research, foster employee certification, use 3D printers, and even use library software programs to design and print creative menus for a restaurant.

Source: Consumer Affairs

She also spotlighted the growing importance of video conferencing availability to rural residents and communities, telling of how a new mother in Alaska received needed healthcare training via video conference at her local public library, thus avoiding a lengthy trip to Seattle, and how a business in Texas was able to secure a contract by using video conferencing through a local public library to expedite OSHA certification of 40 workers.

Marijke also emphasized that today’s libraries clearly are much more than book-lending facilities and places for children’s story time; they are one-stop community hubs, replete with maker spaces, digital production studios, video-conferencing capacity and more. In response to questions from Congressional staff, Marijke also highlighted various services libraries provide to veterans, including resume building, job application assistance, benefit application filing, and financial literacy training.

Marijke and her fellow panelists were welcomed to the Caucus’ meeting by Co-Chairs Reps. Kevin Cramer (R-ND) and Mark Pocan (D-WI2), and Rep. Dave Loebsack (D-IA2). Membership in the Caucus currently stands at 34 Representatives. Its mission, as explained upon its launch by Rep. Bob Latta (R-OH5), is to “bring greater attention to the need for high-speed broadband in rural America, and help encourage and spur innovative solutions to address this growing consumer demand.”

ALA thanks the Caucus for the opportunity to participate in its event, and both the Office of Government Relations and OITP look forward to continuing to work with its members to boost broadband capacity in libraries and homes across rural America.

The post ALA briefs congress on critical impact of rural broadband access appeared first on District Dispatch.

Library of Congress: The Signal: UNESCO PERSIST: A Global Exchange on Digital Preservation

planet code4lib - Fri, 2016-05-20 15:57

This is a guest post by Robert R. Buckley, Technical Adviser at the National Archives of the UAE in Abu Dhabi and the Coordinator for the PERSIST Policy Working Group.

UNESCO PERSIST meeting in Abu Dhabi. Photo courtesy of National Archives of the UAE.

Readers of this blog may have first seen mention of the UNESCO PERSIST project in The Signal last January, in a guest post on intellectual property rights related to software emulation. Dealing with IP rights is one of the known challenges of digital preservation. Dealing with the volume of digital content being generated is another, requiring decisions on what content to select and preserve for the benefit of society. These and other digital preservation activities typically depend on policies that influence decision-making and planning processes with a view to enabling sustainability. All these issues fall within the scope of the PERSIST project and were addressed at its recent meeting, held March 14-16 in Abu Dhabi.

The meeting was hosted by Dr. Abdulla El Reyes, Director General of the National Archives of the UAE and Chair of the Memory of the World Program. PERSIST is part of the Memory of the World Program and a partnership between UNESCO, the International Council on Archives and the International Federation of Library Associations and Institutions. (If it were an acronym, PERSIST would stand for Platform to Enhance and Reinforce the Sustainability of the Information Society Trans-globally.) It is a response to the UNESCO/UBC Vancouver Declaration, adopted at the 2012 Memory of the World in the Digital Age: Digitization and Preservation conference in Vancouver, where participants agreed on the pressing need to establish a road map proposing solutions, agreements and policies for implementation by all stakeholders, in particular governments and industry.

The focus of the PERSIST project is on providing these stakeholders, as well as heritage institutions, with resources to address the challenges of long-term digital preservation and the risks of losing access to part of our digital heritage through technology obsolescence. Fostering a high-level dialogue and joint action on digital preservation issues among all relevant stakeholders is a core objective of PERSIST. For example, during the UNESCO General Conference last November in Paris, PERSIST hosted an event that included Microsoft, Google and the ACM. This is the kind of thing UNESCO is well positioned to do and where it can add value on a global scale in the very active and fertile field of digital preservation.

The Abu Dhabi meeting was attended by over 30 experts, representing heritage institutions, universities and governmental, non-governmental and commercial organizations from a dozen countries spread across five continents. The meeting had an ambitious agenda that included formulating an operating plan for 2016-2017. The major outcomes of the meeting were organized around the work of the three task forces into which PERSIST was divided: Content, Technology and Policy.

Official launch of the UNESCO/PERSIST Selection Guidelines. Photo courtesy of National Archives of the UAE

First was the launch of the UNESCO/PERSIST Guidelines for the selection of digital heritage for long-term preservation, drafted by the Content Task Force. The selection process, in the form of a decision tree, takes a risk-assessment approach to evaluating significance, assessing sustainability and considering availability in dealing with the overwhelming volume of digital information now being created and shared. Written by a team of seven experts from the library, archives, and museum community, the Guidelines aim to provide an overarching starting point for heritage institutions when drafting their own policies on the selection of digital heritage for long-term sustainable digital preservation.

Second was the progress by the Technology Task Force on defining the PERSIST technology strategy and finding an organizational home that would maintain, manage and make available the legacy software platform for future access to digital heritage at risk due to software obsolescence. (PERSIST is in contact with the Software Preservation Network and will be presenting at the SPN Forum in August.)

The diagram illustrates the role of the UNESCO PERSIST project in the digital preservation ecosystem, including access to legacy software licenses. The organizational home, which we have been calling the UNESCO PERSIST Organization or UPO, would complement the work of the UNESCO PERSIST project. It would be a non-profit that would be able to enter into legal agreements with software vendors—a significant capability. Conversations are underway with a candidate organization about hosting the UPO.

Role of UNESCO PERSIST in the Digital Preservation Ecosystem. Diagram by Natasa Milic-Frayling.

Third was the formal creation of the Policy Task Force. In one way or another its initial outputs are all related to the Recommendation concerning the preservation of, and access to, documentary heritage including in digital form, which was approved at the UNESCO General Conference in November 2015 and which requires action by UNESCO Member States. Besides contributing directly to the guidelines for implementing the digital part of the Recommendation, the task force also plans to take a community-based approach to develop supporting tools such as a Model National Digital Preservation Strategy and a Starter’s Guide for policymakers. Already the Selection guidelines provide a tool for the identification of documentary heritage called for by the Recommendation. The Policy team will also work with the other task forces on strategic policy questions.

From here, there is still much to be done in disseminating the selection guidelines that would make the challenges of digital preservation more manageable, in developing and putting on a firm foundation the software technology platform that would enable access to legacy documents, and in establishing policy guidelines that would provide institutional and national frameworks where they are most needed for the preservation of digital documentary heritage.

You can hear more about PERSIST at the IFLA WLIC 2016 and the SPN Forum in August, the ICA Congress in September and iPRES 2016 in October. You can also read about PERSIST online, watch an introductory video and follow it on Twitter at #unescopersist.

District Dispatch: Senate committee approves legislative branch funding without fireworks

planet code4lib - Fri, 2016-05-20 15:49

In stark contrast to Tuesday’s full House Appropriations Committee markup – which, as previously reported, featured almost 30 minutes of hot debate over legislative report language intended to bar the Library of Congress (LC) from retiring the subject headings “Aliens” and “Illegal aliens” – the Senate Appropriations Committee took scarcely 3 minutes on Thursday to call up, “debate” and pass its version of the “Leg Approps” bill devoid of the House’s controversial provision. As in the earlier debate, the presidents of ALA and ALCTS wrote to key Members of the Senate Committee prior to the vote asking them not to incorporate language like that adopted by the House.

Photo source: Daisuke Tashiro

Both bills are now in line to be considered on the floors of their respective chambers but no timetable has yet been set . . . and that could take some time. Even if passed by both bodies, House and Senate negotiators will then need to reconcile differences between the bills, hot button LC subject heading report text included. Congressional insiders forecast that, if necessary, no such negotiations are anticipated until after November’s elections.  ALA and ALCTS will continue to educate all Members of Congress in the intervening months, however, about the House’s folly in countermanding the Library of Congress’ solidly reasoned, professional cataloging judgement.

The post Senate committee approves legislative branch funding without fireworks appeared first on District Dispatch.

Jonathan Rochkind: Really slow rspec suite? Use the fuubar formatter!

planet code4lib - Fri, 2016-05-20 15:19

I am working on a ‘legacy’-ish app that unfortunately has a pretty slow test suite (10 minutes+).

I am working on some major upgrades to some dependencies, that require running the full test suite or a major portion of it iteratively lots of times. I’m starting with a bunch of broken tests, and whittling them down.

It was painful. I was getting really frustrated with the built-in rspec formatters — I’d see an ‘f’ in the output, but wouldn’t know what test had failed until the whole suite finished. Or I could control-c, or run with --fail-fast, to see the first failures as they happen, but that interrupts the suite, so I’d never see other, later failures.

Then I found the fuubar rspec formatter.  Perfect!

  • A progress bar makes the suite seem faster psychologically even though it isn’t. There’s reasons a progress bar is considered good UI for a long-running task!
  • Outputs failed specs as they happen, but keeps running the whole suite. For a long-running suite, this lets me start investigating a failure as it occurs without waiting for the suite to finish, while still letting the suite run to completion so I can see the total picture of how I’m doing and what other sorts of failures I’m getting.

I recommend fuubar; it’s especially helpful for slow suites. I had been wanting something like this for a couple of months, and wondering why it wasn’t a built-in formatter in rspec — I just ran across it now in a reddit thread (started by someone else considering writing such a formatter who didn’t know fuubar already existed!). So I’m writing this blog post to hopefully increase exposure!
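For anyone who wants to try it, setup is the usual gem-plus-flag routine (per the fuubar README; check it for specifics on your rspec version):

    # Gemfile
    group :test do
      gem 'fuubar'
    end

    # .rspec -- make fuubar the default formatter
    --format Fuubar

    # or use it for a single run:
    bundle exec rspec --format Fuubar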

Filed under: General

FOSS4Lib Upcoming Events: Hydra Virtual Connect 2016

planet code4lib - Fri, 2016-05-20 15:12
Date: Thursday, July 7, 2016 - 11:00 to 14:00
Supports: Hydra

Last updated May 20, 2016. Created by Peter Murray on May 20, 2016.

Hydra Virtual Connect is a new opportunity for the Hydra community to ‘connect’ online in between face-to-face meetings, complementing the annual fall Hydra Connect conference and regional Hydra events. Presentations will include some of the Hydra talks given at Open Repositories 2016 for those who were unable to attend, plus reports from partners, community members, and interest groups.

For more information, see the Hydra Virtual Connect wiki page at

OCLC Dev Network: Retrofitting an Existing API with JSON-LD

planet code4lib - Fri, 2016-05-20 13:00

Learn how to use JSON-LD to add linked data support to an existing JSON API.
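The core move in retrofitting JSON-LD (a generic illustration with hypothetical data, not OCLC’s actual API response) is adding an @context that maps the existing JSON keys to vocabulary URIs, so the same payload parses as both plain JSON and linked data:

    {
      "@context": {
        "name": "http://schema.org/name",
        "author": "http://schema.org/author",
        "datePublished": "http://schema.org/datePublished"
      },
      "@id": "http://example.org/bib/12345",
      "name": "An Example Title",
      "author": "An Example Author",
      "datePublished": "1999"
    }

Existing consumers keep reading name and author as ordinary keys; JSON-LD processors additionally resolve them to globally unique property URIs.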

Peter Murray: Happy to Announce that I’m Joining Index Data

planet code4lib - Thu, 2016-05-19 21:25

Index Data posted an announcement on their blog about how I will be joining them next month. Confirmed! I'll be working on the open source library service platform that was announced by EBSCO last month, and more specifically in a role as an organizer and advocate for people participating in the project. It feels like my career has been building to this role. And it also means getting re-engaged in the OLE project; I was part of the design effort in 2008-2009 and then drifted away as professional responsibilities took me in other directions. In the executive overview of the OLE design report, we said:

…the project planners produced an OLE design framework that embeds libraries directly in the key processes of scholarship generation, knowledge management, teaching and learning by utilizing existing enterprise systems where appropriate and by delivering new services built on connections between the library’s business systems and other technology systems.

That vision is as important to libraries today as it was then — even as the state of technology has advanced to make this vision harder (in some ways) and easier (in others) to achieve.

I will miss my colleagues and work at Cherry Hill. Cary, Jungleen, Justin and everyone else are great to work with, which is really important with a virtual organization. The client work has also been interesting and challenging with the opportunity to learn more about Drupal and relearn how to do operations in a cloud computing environment.

This job change also means that I'll be moving away from the Islandora and CollectionSpace open source communities. I've learned a lot from these groups that I'll be taking forward into my next adventure. (Anyone interested in creating a virtual gathering point for library open source project community managers?) I have a soft spot in my heart for these projects, and I'll be watching them as they grow.

DuraSpace News: OpenVIVO at OR2016

planet code4lib - Thu, 2016-05-19 00:00

Austin, TX  Join Michael Conlon, VIVO project director, and Graham Triggs, VIVO technical lead from DuraSpace, for an OpenVIVO poster presentation at Open Repositories 2016 next month in Dublin. OpenVIVO is a hosted VIVO for representing scholarly work that anyone with an ORCID can use.

FOSS4Lib Recent Releases: pycounter - 0.14.0

planet code4lib - Wed, 2016-05-18 19:43

Last updated May 18, 2016. Created by wooble on May 18, 2016.

Package: pycounter
Release Date: Wednesday, May 18, 2016

