The state of the work is that it is in progress. Like Fedora 4, CLAW is a complete rewrite of the entire Islandora stack. It is a collaborative community effort, and it needs the resources of the community. An update on the project was included in the most recent Islandora Community Newsletter. You can check that out here.
- We have weekly CLAW Calls that you are more than welcome to join, and you can add items to the agenda.
- We send updates to the list each week after each call, and you can view them all here.
- We have monthly sprints, held during the last two weeks of the month. If you (or your colleagues) are in a position to take part, you are more than welcome there too.
- We also have weekly CLAW lessons which are led by Diego Pino Navarro. You can find more information on them here.
- Wondering about the data model and Hydra interoperability? We're working on implementing the Portland Common Data Model (PCDM). More information on that is available here and here.
If you want to see CLAW completed faster, you can help!
- Contribute developer time. Either your own, or some developer time from your institution. Not comfortable with the stack? That's what CLAW lessons are for!
- Contribute funds: The Islandora Foundation is very close to having the necessary membership funding to hire a Technical Lead, who could devote a lot more time to coordinating the development of CLAW than our current volunteer team has available. Joining the Islandora Foundation has many benefits, but adding a Technical Lead to the project will be a big one in the CLAW corner.
- Contribute opinions: We need to know how you want CLAW to work. You are welcome to attend the weekly CLAW Call detailed above. Please also watch the listserv for questions about features and use cases.
Equinox is pleased to welcome Rogan Hamby to the team! He will join us as a Project and Data Analyst. Rogan got his undergraduate degree in English Literature with minors in Computer Science and Sociology, and then received his MLIS from the University of South Carolina. By repeatedly graduating he proved to be really bad at being the professional student he had aspired to be. Deciding that reference librarian was the next best thing, he went off and over twenty years has done nearly every job you can do in a public library, many simultaneously.
In 2009 Rogan was involved with the creation of an Evergreen based public library consortium that he supported for eight years, overseeing operations, migrations and special projects. Believing in the philosophical principles of open source software and the cultural mission of libraries he found Evergreen to be the tool he needed and the community to be people he shared values with. When the time came for his next adventure Rogan realized that he wanted to remain in the Evergreen community and find new ways to aid libraries in their missions. Fortunately, he found a place at Equinox to do that.
Outside of work Rogan has managed to be married, have three children, learn archery from Buddhist monks, and once play D&D with Dave Arneson. He doesn’t have a favorite book but has re-read works by and about J. R. R. Tolkien more than is probably healthy.
Grace Dunbar, Equinox Vice President, had this to say about the hire: “We’re delighted to be adding Rogan to our team of librarians and techies… and techie-librarians. Many of us have known Rogan for years and we deeply value his experience both in libraries and in the Evergreen community. We also respect his deep and abiding love of Tolkien.”
Open Data is a new and still not very well understood concept in Guyana, as is probably the case in other countries as well. The GIS Collective, a group of volunteers, each highly skilled and experienced in Geographic Information Systems (GIS), knows the value of available data in helping a country develop, and the hurdles posed by unavailable or outdated data.
Secondary school teachers are well placed to pass knowledge of the subject on to the next generation. The GIS Collective therefore offered a short seminar on open data for secondary school Geography and IT teachers based in and around the capital city, Georgetown, working through the office of the Guyana Chief Education Officer (CEO) and with the support of the Assistant CEO for Secondary Schools. The event was held on 11 March 2016 at the National Centre for Educational Resource Development (NCERD), located in the Kingston ward of Georgetown.
The idea of open data was briefly presented and discussed, addressing ‘What is Open Data?’ and ‘What Open Data does for National Development’. However, the main part of the seminar involved the teachers learning by doing: producing open data themselves.
The teachers were introduced to a source of open spatial data – OpenStreetMap (OSM) – and taught to use and edit it themselves. The teachers were organised into groups of 4-6 people and, using Field Papers to make notes, walked and surveyed various parts of the surrounding area. Using laptops and the OSM iD editor, the teachers then transferred their observations to OSM, digitizing building outlines, naming and describing landmarks, and so on.
The group enriched OSM by adding information on Government Ministries, Embassies, private companies and other buildings, and historic structures such as the Georgetown Lighthouse (built 1830), the Umana Yana (a national landmark built by indigenous peoples) and the Georgetown Seawall Roundhouse (built 1860).
The teachers were enthusiastic participants, and enjoyed the hands-on approach of the seminar. Some have apparently already continued to edit OSM in other areas of Guyana in the days following the seminar. The organisers are grateful for the support of the Guyana Ministry of Education and Open Knowledge International.
The following is a guest post by Andrea Goethals, Digital Preservation and Repository Manager at Harvard Library.
It’s St. Patrick’s Day, so I wanted to have a catchy Irish saying for the title but, believe it or not, Irish sayings about web archiving or even the web are hard to find. I did find some great phrases though, especially “You must take the little potato with the big potato.” Potatoes seem to be a common theme in Irish sayings, along with rain.
In the last couple of years within Harvard Library, when we haven’t been thinking about our own frequently inclement weather, we have been thinking a lot about web archiving and what our strategy should be for scaling up our web archiving activities. We wanted to know more about the current practices, needs and expectations of other institutions that are either actively engaged in web archiving or would like to be, and whether our institutions had common needs that might be addressed by collaborative efforts.
With the generous support of the Arcadia Fund, my colleague Abigail Bordeaux and I worked closely with Gail Truman of Truman Technologies to conduct a five-month environmental scan of web archiving programs, practices, tools and research. The final report is now available from Harvard’s open access repository, DASH.
The heart of the study was a series of interviews with web archiving practitioners from archives, museums and libraries worldwide; web archiving service providers; and researchers who use web archives. The interviewees were selected from the membership of the International Internet Preservation Consortium, the Web Archiving Roundtable at the Society of American Archivists, the Internet Archive’s Archive-It Partner Community, the Ivy Plus institutions, Working with Internet archives for REsearch (Rutgers/WIRE Group), and the Research infrastructure for the Study of Archived Web materials (RESAW).
The interviews of web archiving practitioners covered a wide range of areas, everything from how institutions maintain their web archiving infrastructure (e.g. outsourcing, staffing, location in the organization) to how they are (or aren’t) integrating their web archives with their other collections. From this data, profiles were created for 23 institutions, and the data was aggregated and analyzed to look for common themes, challenges and opportunities.
In the end, the environmental scan revealed 22 opportunities for future research and development. These opportunities are listed in Table 1 (below) and described in more detail in the report. At a high level, these opportunities fall under four themes: (1) increase communication and collaboration, (2) focus on “smart” technical development, (3) focus on training and skills development and (4) build local capacity.
One of the biggest takeaways is that the first theme, the need to radically increase communication and collaboration among all individuals and organizations involved in some way in web archiving, was the most prevalent. Thirteen of the 22 opportunities fell under this theme. Clearly much more communication and collaboration is needed among those collecting web content but also between those who are collecting it and researchers who would like to use it.
This environmental scan has given us a great deal of insight into how other institutions are approaching web archiving, which will inform our own web archiving strategy at Harvard Library in the coming years. We hope that it has also highlighted key areas for research and development that need to be addressed if we are to build efficient and sustainable web archiving programs that result in complementary and rich collections that are truly useful to researchers.
New vacancy listings are posted weekly on Wednesday at approximately 12 noon Central Time. They appear under New This Week and under the appropriate regional listing. Postings remain on the LITA Job Site for a minimum of four weeks.
New This Week:
Visit the LITA Job Site for more available jobs and for information on submitting a job posting.
I left there thinking I had basically got my Capybara JS tests reliable enough… but after that, things degraded again.
But now I think I really have fixed it for real, with some block/wait rack middleware based on the original concept by Joel Turkel, which I’ve released as RackRequestBlocker. This is middleware to keep track of ‘outstanding’ requests in your app that were triggered by a feature spec that has finished, and let the main test thread wait until they are complete before DatabaseCleaning and moving on to the next spec.
My RackRequestBlocker implementation is based on the new hotness concurrent-ruby (a Rails5 dependency, great collection of ruby concurrency primitives) instead of Turkel’s use of the older `atomic` gem, and using actual signal/wait logic instead of polling, and refactored to have IMO a more convenient packaged API. Influenced by Dan Dorman’s unfinished attempts to gemify Turkel’s design.
It’s only a few dozen lines of code, check it out for an example of using concurrent-ruby’s primitives to build something concurrent.
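To make the idea concrete, here is a minimal sketch of the request-blocking pattern. This is illustrative only, not the gem's actual code: RackRequestBlocker uses concurrent-ruby primitives, while this version uses stdlib Mutex/ConditionVariable to show the same signal/wait (rather than polling) logic, and the class names here are invented for the example.

```ruby
# Track in-flight requests so a test thread can wait until all complete.
class RequestTracker
  def initialize
    @outstanding = 0
    @mutex = Mutex.new
    @idle  = ConditionVariable.new
  end

  # Called when a request enters the app.
  def started
    @mutex.synchronize { @outstanding += 1 }
  end

  # Called when a request finishes; wakes any waiting thread
  # once no requests remain in flight.
  def finished
    @mutex.synchronize do
      @outstanding -= 1
      @idle.broadcast if @outstanding.zero?
    end
  end

  # The test thread blocks here until all outstanding requests complete,
  # instead of polling a counter in a sleep loop.
  def wait_until_idle
    @mutex.synchronize do
      @idle.wait(@mutex) until @outstanding.zero?
    end
  end
end

# Rack middleware that wraps every request with the tracker.
class BlockingMiddleware
  TRACKER = RequestTracker.new

  def initialize(app)
    @app = app
  end

  def call(env)
    TRACKER.started
    begin
      @app.call(env)
    ensure
      TRACKER.finished
    end
  end
end
```

In a feature-spec suite, an `after` hook would call something like `BlockingMiddleware::TRACKER.wait_until_idle` before DatabaseCleaner runs, so no request triggered by the finished spec is still writing to the database.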
And my Capybara JS feature tests now appear to be very very reliable, and I expect them to stay that way. Woot.
To be clear, I also had to turn off the DatabaseCleaner transactional strategy entirely, even for non-JS tests. RackRequestBlocker alone wasn’t enough, and neither was just turning off the transactional strategy. With either one by itself I still had crazy race conditions — including pg:deadlocks… and actual segfaults!
Why? I honestly am not sure. There’s no reason the transactional fixture strategy shouldn’t work when used only for non-JS tests, even with RackRequestBlocker. The segfaults suggest a bug in something written in C; MRI, pg, poltergeist? (poltergeist was very unpopular in the reddit thread on my original post, but I still think it’s less bad than the other options for my situation.) A bug of some kind in the test_after_commit gem we were using to make things work even with the transactional fixture strategy? Honestly, I have no idea — I just accepted it, and was happy to have tests that were working.
Try out RackRequestBlocker, see if it helps with your JS Capybara race condition problems, let me know in comments if you want, I’m curious. I can’t support this super well, I just provide the code as a public service, because I fantasize of the day nobody has to go through as many hours as I have fighting with JS feature tests.
Filed under: General
Library advocates! There is still time to apply for this year’s White House Conference on Library and Information Services Taskforce (WHCLIST) award. Applications are due on April 1, 2016 (no joke!).
This award is granted to a non-librarian participant in National Library Legislative Day (NLLD). The winner receives a stipend of $300 and two free nights at the Liaison hotel.
The criteria for the WHCLIST Award are:
- The recipient should be a library supporter (trustee, friend, general supporter) and not a professional librarian.
- Recipient should be a first-time attendee of NLLD.
- Representatives of WHCLIST and the ALA Washington office will choose the recipient. The ALA Washington Office will contact the recipient’s senators and representatives to announce the award. The winner of the WHCLIST Award will be announced at NLLD.
- The deadline for applications is April 1, 2016.
To apply for the WHCLIST award, please submit a completed NLLD registration form; a letter explaining why you should receive the award; and a letter of reference from a library director, school librarian, library board chair, Friend’s group chair, or other library representative to:
Grassroots Communications Coordinator
American Library Association
1615 New Hampshire Ave., NW
Washington, DC 20009
Note: Applicants must register for NLLD and pay all associated costs. Applicants must make their own travel arrangements. The winner will be reimbursed for two nights in the NLLD hotel in D.C. and receive the $300 stipend to defray the costs of attending the event.
Library of Congress: The Signal: Advancing Institutional Progress through Digital Repository Assessment
The following is a guest post by Jessica Tieman.
Three quarters of the way into my twelve-month National Digital Stewardship Residency at the U.S. Government Publishing Office, I reflect on the successes and challenges of my project. I also recognize how the outcome of my work will impact the future of the GPO, its business units, the communities within the Federal Government and the general public that are all invested in the success of GPO’s audit and certification of the govinfo (formerly FDsys) repository.
In this post, I’ll give a brief update on my assigned role at GPO: to prepare govinfo for ISO 16363 Trustworthy Digital Repository Audit and Certification. I’ll also explain how preparing for the audit has served as a way for GPO to advance its strategic plan to transition from a print-centric model to a content-centric digital agency.
I have been collecting and evaluating all existing documentation relating to GPO’s govinfo to satisfy the requirements of the 109 criteria outlined in ISO 16363. Where necessary, I have also had the opportunity to participate in, and sometimes offer guidance on, writing and developing documentation for procedures and processes based on digital preservation best practices that were not yet fully captured in existing documentation when I arrived at GPO.
In preparation for evaluating GPO’s documentation and readiness for an ISO 16363 audit, I interviewed TRAC-certified and OAIS-compliant digital repository managers to gather feedback about repository assessment to share with GPO internal staff. I am beginning the internal audit process.
GPO implemented a SharePoint folder-based system to organize all of its documentation and evidence by criterion. Documentation includes workflows, roles and responsibilities, organizational charts, strategic plans, technical documentation, project specifications, meeting notes, planning documents and vision statements, data management definitions, risk registries, standard operating procedures, policies, gathered statistics on systems and users, and more. (The repository’s Architecture System Design document is available online.)
For me to evaluate GPO against each criterion, I will assess the content I have gathered within the SharePoint system for:
• Adequacy: how well it satisfies the specific criteria requirements;
• Transparency: whether the documentation truly captures the repository’s activities and practices, and is written with clarity;
• Measurability: whether procedures are documented in a manner that can be substantiated through measurable outcomes (for example, not only listing end-user requirements but also providing the data collected on users to validate them);
• Sustainability: whether current processes are scalable and will remain effective over time as systems, people and funding change.
This project has been time-consuming and highly detailed. What surprised me most about it has been the dynamic nature of evaluating and creating “good” documentation. Many times I found that a document seemed to perfectly meet the expectations of a criterion, but later I realized the many ways in which it wasn’t actually enough.
There is another dynamic aspect to the project: advancing institutional change through assessment. The U.S. Government Publishing Office was the U.S. Government Printing Office for over 150 years until its recent name change in 2014. In addition to the intentional name change, GPO has increasingly been engaging in business and fostering internal developments to further its commitment to authenticating, preserving and distributing Federal information that remains critically important to American democracy in the digital age.
This is a unique opportunity for an ISO 16363 Audit and Certification of the govinfo repository to support GPO’s efforts. In many ways, the govinfo repository plays a critical role in this transformation as it exists as the primary source of Federal information products for all of GPO’s stakeholders and its user community, including all branches of Federal government, the depository and non-depository library community, local and state government, private industries, non-profit organizations, transparency organizations, legal professionals, researchers, data consumers, and the general public.
The value of the govinfo repository to the Federal Depository Library Program is changing due to GPO’s overall digital transformation. Indeed, the present-day FDLP program was initially codified in Title 44, Chapter 19 of the U.S. Code to mandate availability of government publications for public access. In 1993, Title 44 was expanded to include a mandate for electronic access to government publications in an online facility managed by GPO.
Since this time, GPO has fulfilled this responsibility by providing an open, free, publicly accessible preservation repository, with the goal of functioning as the official resource for government information products. In order to meet this goal, however, collection development for govinfo is essential to increase the variety of digital content within the repository, including digitized content submitted by FDLP libraries.
The repository itself is both impacting and reacting to this “digital stimulus.” This has an effect on how I determine the sustainability of the repository’s documentation in the context of the audit. The repository’s underlying technology will need to be flexible enough to anticipate content that may be arriving from new library partnerships. Staff must be agile enough to develop the workflows for handling producer-archive agreements, digitization guidelines and ingest processes.
For GPO, pursuing certification impels cross-functional decision-making across business units, receptiveness to new policies and procedures centered on digital publication and preservation standards, and a strong commitment to communicate these new commitments and values to its user community, which includes its Federal stakeholders, the depository library community and the American public at large.
The Open Syllabus Project aggregates college syllabi and analyzes the data to provide a variety of ways to explore what they contain. Their stated goal is to provide “a platform for curricular exploration and research.” Through their Open Syllabus Explorer you can search and browse to discover which books appear in these syllabi. They also rank the texts based on how frequently they are assigned, giving a “teaching score” to each one. There is only one text that has received their highest score — not surprisingly, The Elements of Style. Here are the top twenty, with links to WorldCat:
- The Elements of Style by William Strunk and E.B. White
- Republic by Plato
- The Communist Manifesto by Karl Marx
- Biology by Neil Campbell
- Frankenstein by Mary Wollstonecraft Shelley
- Ethics by Aristotle
- Leviathan by Thomas Hobbes
- The Prince by Niccolo Machiavelli
- Oedipus by Sophocles
- Hamlet by William Shakespeare
- The Odyssey by Homer
- Orientalism by Edward Said
- A Manual for Writers of Term Papers, Theses, and Dissertations by Kate Turabian
- The Iliad by Homer
- Heart of Darkness by Joseph Conrad
- Canterbury Tales by Geoffrey Chaucer
- Antigone by Sophocles
- Letter from Birmingham Jail by Martin Luther King, Jr.
- On Liberty by John Stuart Mill
- The Structure of Scientific Revolutions by Thomas Kuhn
Thanks to Thom Hickey for bringing this project to my attention.
About Roy Tennant
Roy Tennant works on projects related to improving the technological infrastructure of libraries, museums, and archives.
I have been very fortunate to address librarians nationally and internationally as a speaker. I love sharing my ideas, experiences, and things I’ve learned and meeting other librarians. I have gotten research ideas, column ideas, and made friends through my travels. I have visited places I’d always wanted to visit. I am not nearly as prolific as some library speakers and my productivity dropped precipitously in 2009 when I had my son, but I still get great pleasure out of sharing knowledge and contributing to the profession.
I’ve had many wonderful experiences as a speaker in terms of how I was treated by conference organizers. We tend to focus so much on the negative that I wanted to highlight some of the amazingly warm and considerate experiences I’ve had where people extended themselves to make me feel welcome. In Puerto Rico, I was taken around the country, taken on hikes, and taken to dinner by some amazingly kind librarians. In New Zealand, with my son tagging along, my family was invited to dinner at the house of one of the librarians who brought out toys for Reed and gave him two books (that we still read and treasure). There was an amazing Haka to kick off the conference and at the end of my keynote, the whole audience SANG to me! In Hawaii, I was given a beautiful lei after each of my talks and the conference organizers took my family out to breakfast. In Iceland, I came to my hotel room to find a giant basket of yummies from the conference organizers.
Those definitely stand out as the most wonderful acts of kindness I have experienced as a speaker. I keep those people and experiences in my heart and would speak again for any of those organizations without hesitation. There were also many other conference experiences where I was treated with consideration by the conference organizers and received a speaker’s gift, was invited out for a meal, got a thank-you card, or received some other recognition of their gratitude for my giving my time to their organization.
Then there are the not-so-nice experiences. Like when I was dropped off at my motel after giving a half-day workshop only to find that there was nowhere to eat within walking distance and I had to eat my dinner from a vending machine. Or the countless state/provincial library conferences I’ve attended (that are not my own) where I ate lonely meals by myself because no one considered the fact that I wouldn’t know anyone (and, FYI, not all speakers are extroverts). Or the ones where I was told at the last minute that the technology I was planning to use wouldn’t work because they never provided me with any information, leaving me to stress needlessly in the minutes before I gave my presentation.
Conference speakers are likely spending a significant amount of time creating content and traveling to be at your conference. Whether you pay them or not, there are certain common courtesies you should extend to speakers who are not a part of your organization:
1. Give the speaker all of the relevant background on the conference – let them know who will be there (what sorts of positions, from what types of libraries, etc.), what the organization does, and anything else that will give the speaker a sense of how to tailor their presentation.
2. Let the speaker know how long they will have to speak – this seems obvious, but I once was told I had an hour and a half to present and take questions and designed my talk around that. Five minutes before I was to start my keynote, I was told that they were going to take 20 minutes or so to do introductions. I took it in stride because I’m an instruction librarian and this happens to us all the time, but it was no less insensitive.
3. Ask the speaker about their technological needs/preferences and any limitations on your side – Some presenters are really particular about using their computer or have to play videos or show stuff from the Internet. Ask them about their needs and preferences beforehand. And let them know about any limitations on your side: if you need them to use your computer, if they have to use PowerPoint, etc. I often make PDFs of my Keynote presentations so that if I can’t use my own computer, I can just use the PDF version (the conversion from Keynote to PowerPoint is not always smooth). One time, I was told 15 minutes before my presentation that I had to use the organizers’ computer and have my presentation in PPT format because they were using some fancy recording software. I was completely flummoxed and stressed and, in my hasty conversion of my slides, the formatting got screwed up. If the computer is not going to be at the podium, provide the speaker with a clicker. The speaker should not have to worry about any of this just before they speak – it’s your job as the organizer to smooth the way for them. The incomparable Jenica Rogers wrote about this issue as well.
4. Invite them out for a meal – they may or may not want to go, especially if they’ve had a long flight, but extending the invitation shows a level of consideration for their needs and the fact that they traveled all this way to be with you.
5. Give them all of the information they need about travel logistics and reimbursement – There shouldn’t be any guesswork on the part of the speaker. I once showed up at the conference hotel where the organizer was supposed to have made me a hotel reservation only to find that I didn’t have one. I had to frantically call them and get it sorted out, spending an hour sitting in the lobby fretting. Turns out they’d made the reservation, but in their own name, and I was supposed to have psychically known that. I once had to argue about my honorarium with an organizer 10 minutes before my keynote. She said that I’d agreed to accept $x for my keynote and that it included my travel reimbursement. Luckily I had the emails to prove that she’d agreed to $x + travel expenses, because $x didn’t even cover my travel expenses! The kicker with that one was that I’d more than halved my keynote speaking fees because I had a soft spot for the consortium (I’d used my first dial-up internet through them back in the day).
All that brings up another point –
6. Give them your cell number or the number of someone who can help them if something comes up – shit happens. Flights get delayed or canceled. Hotel reservations may not exist. Speakers get sick. The speaker might need to contact you and they should not have to do that via email.
7. Tell them about the area – if a person is coming from out of town, they more than likely aren’t familiar with the area in which they’re staying. Tell them about local restaurants you like, attractions they might be interested in, hotel amenities, etc. Yes, we live in the age of Yelp and Google Maps, but it’s still nice to get personal recommendations. And if you have a car and they don’t, see if there’s anything they need. I recently had to walk through a really sketchy neighborhood early in the morning to get to a drug store to buy decongestants. It would have been really nice if someone could have just picked them up for me (or warned me about what I’d have to walk through to get there!).
8. Give them a gift – it doesn’t have to be much – some chocolates, a pen, some stationery, even just a thank-you card. Spending $10 on a speaker will make them feel significantly more appreciated than if you do nothing, and it’s such a small effort. I periodically am asked to speak to LIS school students in the Portland cohort of Emporia University. They always give me a little tchotchke, and though it’s not much, it illustrates the fact that they have thought of me and appreciate my time. They also always send me a thank-you card signed by all the students in the class. That means even more. Even at a large-scale conference like Internet Librarian, speakers were given a gift to show appreciation.
9. Make them feel welcome – not all speakers are extroverts. I’m not at all, and going to a place where I don’t know anyone can be intimidating. A good organizer should introduce the speaker to people and make sure they have folks to talk to.
A lot of these things are small things that take just a little effort or a tiny bit of money, but what they do is say to the speaker “I appreciate you and your effort.” You notice I didn’t put “pay them” here, though I do believe that speakers should be paid as what we do provides value. There are all sorts of reasons to accept or not to accept money for a talk (some may have to do with the policies of the institution where you work) and I think it’s up to each person to do that calculus on their own. Whether you are paid or not, most of these things are still common courtesies worth following. They don’t always scale up at a massive conference, but they’re worth keeping in mind.
I don’t think people who treat speakers badly or indifferently are doing so out of malice. I think it has more to do with a lack of consideration and empathy, but the impact on the speaker is the same. I once spent at least 18 hours preparing for a keynote presentation, missing time with my family to do so. I missed work and had to take three flights to get there and back from a conference I had no connection to. I was not invited out for a meal. I was not given basic information I needed to give a solid presentation and had to stress about tech before the talk instead of eating lunch. I was not given a gift or thank-you card of any kind nor any recognition other than “thanks, have a good flight.” I felt used. I felt angry. And I felt stupid for saying yes in the first place. When I have experiences like that, I honestly feel like I never want to speak at another conference again. But then I remember the wonderful experiences I’ve had. The trouble is that you rarely know which kind of experience you’re going to have before you say yes.
Speakers are the lifeblood of any conference. Without them, what do you have? If they are investing their time in traveling and providing content for your conference, the least you can do is treat them with consideration. This summer, I’m doing a preconference workshop for the Association of Christian Librarians, a group I presented for back in 2008. I remember the organizers and attendees being so gracious and easy to work with last time that I was happy to do it again. If you treat your speakers badly, eventually word will spread about it. The library community isn’t as big as you think and a lot of people who give talks around the country know and talk to each other.
But don’t do it just because of that. Do it because you appreciate your speakers.
If you’re interested in reading other librarian speakers’ perspectives, check out this 2007 Cites and Insights article by Walt Crawford that summarizes posts from a number of blogs with advice for both conference speakers and organizers.
Help shape the future of LITA by voting and then staying in touch with your elected officials to make your voice heard.
The 2016 election will be open March 15 – April 22, and results will be announced on April 29. Eligible members will be sent their voting credentials via email between March 15 and 18, 2016. If you have not yet received your voting email, you can initiate the process at this ALA Elections Page.
Candidates for LITA Vice-President/President-Elect
Candidates for LITA Directors-at-large (two elected for three-year terms)
Candidates for LITA Councilor
LITA Members Running for ALA President
LITA Members Running for ALA Council
Ana Elisa de Campos Salles
Mario M. Gonzalez
Jennifer Rushton Jamison
Colby Mariva Riggs
Edward L. Sanchez
LITA Nominating Committee:
Michelle Frisque, Chair
For questions about your membership status for voting, please contact ALA’s Member and Customer Service (MaCS) at 1-800-545-2433, press 5 (International members should call +1-312-944-6780) or firstname.lastname@example.org. Visit the ALA Election page for more information about this year’s vote and to view candidates running for ALA offices.
It’s taken just over a year for the Senate to vote on S. 337, the FOIA Improvement Act, but its unanimous approval yesterday is a wonderful way to celebrate Sunshine Week 2016! ALA and many other advocates’ attention will now be focused on clearing the final hurdles to marking FOIA’s 50th anniversary (fittingly on July 4th) with a White House signing ceremony.
Before that can happen, however, Senate and House negotiators first must reconcile S. 337 with the House’s own version of FOIA reform, H.R. 653, passed unanimously in that chamber in January of this year. While similar, the bills differ in several substantive ways, as this excellent Congressional Research Service history and side-by-side comparison details. With an extra-long summer recess looming to accommodate the major parties’ political conventions, and a legislative calendar further truncated by the 2016 elections themselves, time will be tight if Congress and the public are to avoid the sad situation we were left in at the end of the 113th Congress, when time simply ran out to enact FOIA reform in 2014!
As just passed by the Senate, key provisions of the FOIA Improvement Act would: strengthen the Office of Government Information Services (OGIS); “require the Director of the Office of Management and Budget to ensure the operation of a consolidated online request portal that allows a member of the public to submit a request for records to any agency from a single website;” and codify the President’s “presumption of openness” policy instituted for all federal agencies at the very start of this Administration.
ALA sincerely thanks Senator John Cornyn (R-TX), Senate Judiciary Committee Chairman Charles Grassley (R-IA) and Judiciary Ranking Member Patrick Leahy (D-VT) not only for introducing and supporting S. 337 in the current Congress, but for their longstanding commitment to meaningful FOIA reform over many years and multiple Congresses. With their continued leadership, ALA will continue to push with our allies for the House and Senate to quickly “conference” their two bills so that both chambers of Congress can vote again before time runs out to send broad FOIA reform to the President for the first time in many years.
Stay tuned for more on how you can help support that effort, and secure the President’s signature, soon.
The post FOIA reform unanimously passed by Senate faces final hurdle appeared first on District Dispatch.
The Digital Public Library of America has an opening for the position of DPLA Network Manager.
The Digital Public Library of America is growing our Hubs Network. DPLA Hubs include Content and Service Hubs, and represent almost 2,000 cultural heritage institutions throughout the country. Over the next several years, our goal is for cultural heritage institutions in every state to have an on-ramp to DPLA. This position will play a critical role in helping to grow, document, and coordinate activities for the Hubs Network.
Reporting to the DPLA Director for Content, the DPLA Network Manager will perform the following job duties:
- assist the DPLA Director for Content in building the DPLA Network by working with potential Hubs to assure their success as members of the network
- manage communications with the Hub network
- coordinate the Hubs application process
- provide documentation for the network on various activities
- oversee website updates related to Hubs network information needs, such as materials designed to help plan new Hubs, information about the Hubs network, and application materials
- field inquiries about joining the network
- keep statistics related to the network and network activities
- provide education and training to the Hub network, including assisting with the implementation of rightsstatements.org
- assist in curation activities, such as building Exhibitions, Primary Source Sets or Network-owned ebook collections
Experience required: The ideal candidate will have 5+ years of experience working in digital libraries or a related setting, preferably in a collaborative environment. The Network Manager will understand the operations of the DPLA Hubs, including digitization, aggregation, metadata standards and normalization, rights status determination, and the human resources required to carry out these activities. The ideal candidate will also possess excellent written and verbal communication skills and a strong customer service orientation. The ability to travel to Hub locations and/or other locations to deliver education, training, or project presentations is required.
Experience preferred: Direct experience with aggregation of metadata; knowledge of metadata aggregation tools; knowledge of the resources required to build and maintain a DPLA Hub; prior experience with project management and/or personnel management.
Education Required: MLS or related degree
Like its collection, DPLA is strongly committed to diversity in all of its forms. We provide a full set of benefits, including health care, life and disability insurance, and a retirement plan. Starting salary is commensurate with experience.
This position is full-time. DPLA is a geographically-distributed organization, with roughly half of its employees in its headquarters in Boston, Massachusetts, and most of the rest in the Northeast corridor between Washington and Boston. Given the significant travel and collaboration associated with this position, proximity to the majority of DPLA’s staff is helpful, and easy access to a major airport is essential.
The Digital Public Library of America strives to contain the full breadth of human expression, from the written word, to works of art and culture, to records of America’s heritage, to the efforts and data of science. Since launching in April 2013, it has aggregated over 11 million items from nearly 2,000 institutions. DPLA is a registered 501(c)(3) non-profit.
To apply, send a letter of interest detailing your qualifications, resume and a list of 3 references in a single PDF to email@example.com. First review of applications will begin April 15, 2016 and will continue until the position is filled.
Journal of Web Librarianship: Libraries and Faculty Collaboration: Four Digital Scholarship Examples
The Access 2016 Program Committee invites proposals for participation in this year’s Access Conference, which will be held on the beautiful campus of the University of New Brunswick in the hip city of Fredericton, New Brunswick from 4-7 October.
There’s no special theme to this year’s conference, but — in case you didn’t know — Access is Canada’s annual library technology conference, so … we’re looking for presentations about cutting-edge library technologies that would appeal to librarians, technicians, developers, programmers, and managers.
Access is a single-stream conference that will feature:
• 45-minute sessions,
• lightning talks (speakers have five minutes to talk while slides—20 in total—automatically advance every 15 seconds),
• a half-day workshop on the last day of the conference,
• and maybe a surprise or two: if you have a bright idea for something different (panel, puppet show, etc.), we’d love to hear it
To submit your proposal, please fill out the form by 15 April.
Please take a look at the Code of Conduct too.
We’re looking forward to hearing from you!
Ticket prices have been set for 2016.
Full conference tickets include admission to hackfest, two and a half days of our amazing single-stream conference, and a half-day workshop on the last day. It is all you can eat for one amazingly low price. All prices are in Canadian dollars.
Ticket options:
1. Early Bird – $350: A limited number of tickets will be available and should go on sale in June. Don’t miss out on this amazing deal.
2. Regular – $450: Standard ticket rates are still unbeatable. You can’t go wrong for four days at this price.
3. Speaker Rate – $300: Want to pitch in? Have you got a great project to share? Get your proposal approved and we’ll cut you a great deal. Speakers are provided discounted tickets and can register once approved. As with other tickets, this includes hackfest, the two-and-a-half-day conference, and the half-day workshop.
4. Student Rate – $200 (limited): Limited to 25 tickets, these should go on sale in June as well. The student rate includes hackfest, the two-and-a-half-day conference, and the half-day workshop. Educational identification will be required.
5. One-Day Pass – $225: Only interested in, or only have time for, one great day? We’ve got you covered.
We’ll let you know when tickets are on sale!
In 2001 the World Health Organization worked with the major publishers to set up Hinari, a system whereby researchers in developing countries could get free or very-low-cost access to health journals. There are similar systems for agriculture, the environment and technology. Why would the publishers give access to their journals to researchers at institutions that hadn't paid anything?
The answer is that the publishers were not losing money by doing so. There was no possibility that institutions in developing countries could pay the subscription. Depriving them of access would not motivate them to pay; they couldn't possibly afford to pay. Cross-subsidizing their access cost almost nothing and had indirect benefits, such as cementing the publishers' role as gatekeepers for research, and discouraging the use of open access.
Similarly, peer-to-peer sharing of papers didn't actually lose the major publishers significant amounts of money. Institutions that could afford to subscribe were not going to drop their subscriptions and encourage their researchers to use these flaky and apparently illegal alternatives. The majority usage of these mechanisms was from researchers whose institutions would never subscribe, and who could not afford the extortionate pay-per-view charges. Effective techniques to suppress them would be self-defeating. As I wrote in The Maginot Paywall:
Copyright maximalists, such as the major academic publishers, are in a similar position. The more effective and thus intrusive the mechanisms they implement to prevent unauthorized access, the more they incentivize "guerilla open access".
Then last June Elsevier filed a case in New York trying to shut down Library Genesis and Sci-Hub. Both are apparently based in Russia, which is not highly motivated to send more of its foreign reserves to Western publishers. So the case was not effective at shutting them down. It turned out, however, to be a classic case of the Streisand Effect, in which attempting to suppress information on the Web causes it to attract far more attention.
The Streisand Effect started slowly, with pieces at Quartz and BBC News in October. The EFF weighed in on the topic in December with What If Elsevier and Researchers Quit Playing Hide-and-Seek?:
Sci-Hub and LibGen have now moved to new domains, and Sci-Hub has set up a .onion address; this allows users to access the service anonymously through Tor. How quickly the sites have gotten back on their feet after the injunction underscores that these services can't really be stopped. Elsevier can't kill unauthorized sharing of its papers; at best, it can only make sharing incrementally less convenient.
But the Streisand Effect really kicked in early last month with Simon Oxenham's Meet the Robin Hood of Science, which led to Fiona MacDonald's piece at Science Alert, Kaveh Waddell's The Research Pirates of the Dark Web and Kieran McCarthy's Free science journal library gains notoriety, lands injunctions. Mike Masnick's Using Copyright To Shut Down 'The Pirate Bay' Of Scientific Research Is 100% Against The Purpose Of Copyright went back to the Constitution:
Article 1, Section 8, Clause 8 famously says that Congress has the following power:
To promote the progress of science and useful arts, by securing for limited times to authors and inventors the exclusive right to their respective writings and discoveries.
Masnick also invokes the 1790 Copyright Act, which was subtitled "An Act for the Encouragement of Learning." Encouragement of learning is what Sci-Hub is for. Mike Taylor's Barbra Streisand, Elsevier, and Sci-Hub was AFAIK the first to point out that Elsevier had triggered the Streisand Effect. Simon Oxenham followed up with The Robin Hood of Science: The Missing Chapter, making the connection with the work of the late Aaron Swartz.
Barbara Fister made the very good point that Universities don't just supply the publishers with free labor in the form of authoring and reviewing:
Because it is labor - lots of labor - to maintain link resolvers, keep license agreements in order, and deal with constant changes in subscription contents. We have to work a lot harder to be publishers' border guards than people realize.
And she clearly lays out the impossible situation librarians are in:
We feel we are virtually required to provide access to whatever researchers in our local community ask for while restricting access from anyone outside that narrowly-defined community of users. Instead of curators, we're personal shoppers who moonlight as border guards. This isn't working out well for anyone. Unaffiliated researchers have to find illegal work-arounds, and faculty who actually have access through libraries are turning to the black market for articles because it seems more efficient than contacting their personal shopper, particularly when the library itself doesn't figure in their work flow. In the meantime, all that money we spend on big bundles of articles (or on purchasing access to articles one at a time when we can't afford the bundle anymore) is just a really high annual rent. We can't preserve what we don't own, and we don't curate because our function is to get what is asked for.
The Library Loon has a series of posts that are worth reading (together with some of their comments). She links to A Short History of The Russian Digital Shadow Libraries by Balázs Bodó, a must-read analysis starting in Soviet times and showing that Sci-Hub is but one product of a long history of resistance to censorship. Bodó has a more reflective piece, In the Name of Humanity, in Limn's Total Archive issue, where he makes the LOCKSS argument:
This is the paradox of the total piratical archive: they collect enormous wealth, but they do not own or control any of it. As an insurance policy against copyright enforcement, they have already given everything away: they release their source code, their databases, and their catalogs; they put up the metadata and the digitalized files on file-sharing networks. They realize that exclusive ownership/control over any aspects of the library could be a point of failure, so in the best traditions of archiving, they make sure everything is duplicated and redundant, and that many of the copies are under completely independent control.
The Loon's analysis of the PR responses from the publishers is acute:
Why point this effluent at librarians specifically rather than academe generally? Because publishers are not stupid; libraries are their gravy train and they know that. The more they can convince librarians that it is somehow against the rules (whether “rules” means “law” or “norms” or even merely “etiquette,” and this does vary across publisher sallies) to cross or question them, the longer that gravy train keeps rolling. Researchers, you simply do not matter to publishers in the least until you credibly threaten a labor boycott or (heaven forfend) actually support librarian budget-reallocation decisions. The money is coming from librarians.
Last weekend the Streisand Effect reached the opinion pages of the New York Times with Kate Murphy's Should All Research Papers Be Free?, replete with quotes from Michael Eisen, Alicia Wise, Peter Suber and David Crotty. Alas, Murphy starts by writing "Her protest against scholarly journals’ paywalls". Sci-Hub isn't a protest. Calling something a protest is a way of labelling it ineffectual. Sci-Hub is a tool that implements a paywall-free world. Occupy Wall Street was a protest, but had it actually built a functioning alternative financial system no-one would be describing it that way.
The result of the Streisand Effect has been, among other things, to sensitize the public to the issue of open access. Oxenham writes:
vast numbers of people who read the story thought researchers or universities received a portion of the fees paid by the public to read the journals, which contain academic research funded by taxpayers.
This clearly isn't in Elsevier's interest. So, having failed to shut down the services and instead garnered them a lot of free publicity, where does Elsevier go from here? I see four possible paths:
- They can try to bribe the Russians to clamp down on the services, for example by offering Russian institutions very cheap subscriptions as a quid pro quo. But they only control a minority of the content, and they would be showing other countries how to reduce their subscription costs by hosting the services.
- They can try to punish the Russians for not clamping down, for example by cutting the country off from Elsevier content. But this would increase the incentive to host the services.
- They can sue their customers, the institutions whose networks are being used to access new content. In 2008 publishers sued Georgia State for "pervasive, flagrant and ongoing unauthorized distribution of copyrighted materials". Eight years later the case is still being argued on appeal. But in the meantime the landscape has changed. Many research funders now require open access. Many institutions now require (but fail to enforce) deposit of papers in institutional repositories. Institutions facing publisher lawsuits would have a powerful incentive to enforce deposit, because their network isn't needed to leak open access content to Sci-Hub.
- They can sue the sources of their content, the individual researchers who they may be able to trace as the source of Sci-Hub materials. This would be a lot easier if the publishers stopped authenticating via IP address and moved to a system based on individual logins. Although this would make life difficult for Sci-Hub-like services if they used malware-based on-campus proxies, it would also make using subscription journals miserable for the vast majority of researchers and thus greatly increase the attractiveness of open access journals. But the Library Loon correctly points out that Sci-Hub's database of credentials is a tempting target for the publishers and others to attempt to compromise.
“Prices are very high, and that made it impossible to obtain papers by purchasing. You need to read many papers for research, and when each paper costs about 30 dollars, that is impossible.”
It seems I was somewhat prophetic in pointing to the risk pay-per-view poses for the publishers in my 2010 JCDL keynote:
Libraries implementing PPV have two unattractive choices:
- Hide the cost of access from readers. This replicates the subscription model but leads to overuse and loss of budget control.
- Make the cost of access visible to readers. This causes severe administrative burdens, discourages use of the materials, and places a premium on readers finding the free versions of content.
On the Internet, we obviously need websites like Sci-Hub where people can access and read research literature. The problem is, such websites oftenly cannot operate without interruptions, because current system does not allow it.
The system has to be changed so that websites like Sci-Hub can work without running into problems. Sci-Hub is a goal, changing the system is one of the methods to achieve it.
Sci-Hub is as close as anyone has come to providing what the readers want. None of the big publishers can provide it, not merely because doing so would destroy their business model, but also because none of them individually control enough of the content. And the publishers' customers don't want them to provide it, because doing so would reduce even further the libraries' role in their institutions. No-one would need "personal shoppers who moonlight as border guards".