Feed aggregator

FOSS4Lib Recent Releases: Vivo - 1.8

planet code4lib - Wed, 2015-05-06 12:00

Last updated May 6, 2015. Created by Peter Murray on May 6, 2015.

Package: Vivo
Release Date: Tuesday, May 5, 2015

FOSS4Lib Upcoming Events: 2015 Koha North American Users Group Conference

planet code4lib - Wed, 2015-05-06 11:54
Date: Wednesday, August 5, 2015 - 08:30 to Saturday, August 8, 2015 - 15:00
Supports: Koha

Last updated May 6, 2015. Created by David Nind on May 6, 2015.

The second annual meeting of the Koha North American Users Group will be held at the Bayfront Convention Center in Erie, Pennsylvania, overlooking one of America's greatest natural freshwater harbors.

Whether you are new to Koha, or are just thinking about migrating, come meet some friendly and dedicated people already using Koha at their libraries. Friendships and connections made at a conference can be a benefit for a lifetime.

FOSS4Lib Recent Releases: Koha - Maintenance releases v 3.14.15, 3.16.10 and 3.18.6

planet code4lib - Wed, 2015-05-06 09:52
Package: Koha
Release Date: Thursday, April 23, 2015

Last updated May 6, 2015. Created by David Nind on May 6, 2015.

Monthly maintenance releases for Koha. See the release announcements for the details.

LibX: Documentation of LibApps

planet code4lib - Wed, 2015-05-06 06:22

We have added a page under the Documentation tab that contains user documentation for the LibApps in the LibX Core Package.  Here is a link to that page.

The documentation includes information on the following packages:

  • The Book Vendors package of LibApps includes LibApps that work on the Amazon and Barnes and Noble sites. On an item's page, these LibApps link the user to the library catalog's search results for that item. If the edition uses Summon as its primary search catalog, it can also use LibApps that display the item's availability in the edition's library directly on the page.
  • The LibX Core Autolinking LibApps package includes LibApps that link a page's DOIs, ISBNs, ISSNs, PMIDs, and RFCs to the library's search result for the corresponding resource, provided that the primary search catalog is Summon.
  • The Full-Text Linker package of LibApps inserts full-text URLs into the web content of journals in ACM Digital Library, IEEExplore, and Nature.com.

The Process COinS LibApp, which finds OpenURL COinS on a page and inserts a cue linking users to the library's search for that COinS' resource, is also documented.

Library Tech Talk (U of Michigan): Using Connection Security Rules in the Library

planet code4lib - Wed, 2015-05-06 00:00

How to secure connections regardless of which network the clients are on.

David Rosenthal: Max Planck Digital Library on Open Access

planet code4lib - Tue, 2015-05-05 21:00
Ralf Schimmer of the Max Planck Society's Digital Library gave a fascinating presentation (PPT) as part of a panel entitled What Price Open Access at the recent CNI meeting. He and co-authors Kai Karin Geschuhn and Andreas Vogler have now posted the paper on which it was based, Disrupting the subscription journals' business model for the necessary large-scale transformation to open access. Their argument is:

    All the indications are that the money already invested in the research publishing system is sufficient to enable a transformation that will be sustainable for the future. There needs to be a shared understanding that the money currently locked in the journal subscription system must be withdrawn and re-purposed for open access publishing services. The current library acquisition budgets are the ultimate reservoir for enabling the transformation without financial or other risks.

They present:

    generic calculations we have made on the basis of available publication data and revenue values at global, national and institutional levels.

These include detailed data as to their own spending on open access article processing charges (APCs), which they have made available on-line, and data from many other sources including the Wellcome Trust and the Austrian Science Fund. They show that APCs are less than €2.0K/article while subscription costs are €3.8-5.0K/article, so the claim that sufficient funds are available is credible. It is important to note that they exclude hybrid APCs such as those resulting from the stupid double-dipping deals the UK made; these are "widely considered not to reflect a true market value". As an Englishman, I appreciate understatement. Thus they support my and Andrew Odlyzko's contention that margins in the academic publishing business are extortionate.

Below the fold, I look at some of the details in the paper.

Having established the global picture that there is more than enough money, they look at the breakdown by country, and point out that:
    For good reasons, multi-authored papers will be captured in the bibliographies or institutional repositories of all their home institutions, but in terms of cost-relevance each of these multi-counted papers needs to be paid for only once. For the time being, the dominant model is that the corresponding author is responsible for picking up and settling the invoice. Therefore the various national and institutional publishing lists must be de-duplicated to reflect the corresponding author papers only, in order to make accurate budget forecasts.

They show a consistent pattern whereby only 65-70% of a country's output has a corresponding author from that country. Countries with a higher output tend to have a higher share, and vice versa.

At an institutional level, the share is lower: between 40% and 60% of an institution's output typically has a corresponding author from that institution. Their own experience shows this:
    For instance, as the current annual journal article output of all Max Planck Institutes is in the region of 10,000 papers, we anticipate the APC-relevant share to be a maximum of 6,000. Hence our projected costs for a complete transformation scenario would be no more than EUR 12 million. The Max Planck Society is a heavily output-oriented research organization and, at the same time, a big buyer of research information. Our current spending on journal subscriptions is already substantial enough to make the big open access transformation possible without having to ask for extra money.

I don't doubt that there is enough money in the system to support a wholly open-access system of scholarly communication paid for by APCs. My concern is that the transition from the current system to this nirvana is difficult precisely because there is in fact way more than enough money in the system.

Back in 2008 I served as a judge for Elsevier's Grand Challenge. Even then, conversations with Elsevier management indicated that they regarded open access as inevitable, but that they would do whatever they could to delay its onset. So my expectation would be that during the transition Schimmer et al. propose, Elsevier would continue their successful strategy; they would be the last publisher to switch.

There is way more than enough money in the system, so that as each publisher switches, money in library budgets is freed up. Elsevier has a long history of knowing exactly how much libraries can afford to pay for access to their journals, and charging it. So my expectation would be that as other publishers switched, Elsevier would raise prices just enough to absorb the funds freed up. The end-point would be a system in which Elsevier would be the only remaining subscription publisher, and would be vastly more profitable than it is today. It would have both more resources to delay open access, and more motivation to deploy them.

Before embarking on the suggested transition, some means of preventing this disaster needs to be developed.

FOSS4Lib Recent Releases: Evergreen - 2.8.1, 2.7.5, and 2.6.8

planet code4lib - Tue, 2015-05-05 20:00

Last updated May 5, 2015. Created by Peter Murray on May 5, 2015.

Package: Evergreen
Release Date: Friday, May 1, 2015

FOSS4Lib Recent Releases: Piwik - 2.13.0

planet code4lib - Tue, 2015-05-05 19:56

Last updated May 5, 2015. Created by Peter Murray on May 5, 2015.

Package: Piwik
Release Date: Thursday, April 30, 2015

LITA: Learn to Teach Coding and Mentor Technology Newbies – in Your Library or Anywhere!

planet code4lib - Tue, 2015-05-05 19:39

Attend a free one-hour webinar to discover what learning to teach coding is all about, and then register for and attend the LITA preconference at ALA Annual. This opportunity follows up on the 2014 LITA President's Program at ALA Annual, where then-LITA President Cindi Trainor Blyberg welcomed Kimberly Bryant, founder of Black Girls CODE.

The informational webinar is free and open to the first 100 log-ins:
Tuesday May 26, 2015 at 1:00 pm Central Time
http://ala.adobeconnect.com/teachcoding/
Enter as guest. The webinar will be recorded and the link to the recording will be posted to these same resource spaces.

Register online for the ALA Annual Conference and add a LITA Preconference 

Black Girls CODE (BGC) is devoted to showing the world that black girls can code, and to growing the number of women of color working in technology. LITA is devoted to putting on programs that promote, develop, and aid in the implementation of library and information technology. Together, BGC and LITA offer this full-day preconference workshop, designed to turn reasonably tech-savvy librarians into master technology teachers. The workshop will help attendees develop effective lesson plans and design projects their students can complete successfully in their own coding workshops. The schedule will feature presentations in the morning followed by afternoon breakout workgroups, in which attendees can experiment with programming languages such as Scratch, Ruby on Rails, and more.

Presenters:

Kimberly Bryant, Founder and Executive Director Black Girls CODE

Lake Raymond, Program Coordinator Black Girls CODE

The Black Girls CODE vision: To increase the number of women of color in the digital space by empowering girls of color ages 7 to 17 to become innovators in STEM fields, leaders in their communities, and builders of their own futures through exposure to computer science and technology.

Kimberly Bryant:
That, really, is the Black Girls Code mission: to introduce programming and technology to a new generation of coders, coders who will become builders of technological innovation and of their own futures. Imagine the impact that these curious, creative minds could have on the world with the guidance and encouragement others take for granted.

REGISTRATION:

Cost

  • LITA Member $235 (coupon code: LITA2015)
  • ALA Member $350
  • Non-Member $380

How-to

To register for any of these events, you can include them with your initial conference registration or add them later using the unique link in your email confirmation. If you don't have your registration confirmation handy, you can request a copy by emailing alaannual@compusystems.com. You also have the option of registering for a preconference only. To receive the LITA member pricing, enter the discount promotional code LITA2015 on the Personal Information page during the registration process.

Register online for the ALA Annual Conference and add a LITA Preconference
Call ALA Registration at 1-800-974-3084
Onsite registration will also be accepted in San Francisco.

Questions or Comments?

For all other questions or comments related to the course, contact LITA at (312) 280-4269 or Mark Beatty, mbeatty@ala.org

Islandora: Islandora at Open Repositories 2015

planet code4lib - Tue, 2015-05-05 19:14

The full schedule for OR2015 is now up and we wanted to call your attention to all of the Islandora goodness that you can take in if you are planning to attend. The following sessions are either explicitly about Islandora, or are from Islandora users talking about other issues in managing collections and digital assets:

Workshops:
General Sessions:
Fedora Interest Group:
Posters:
  • There is Life After Grant Funding: How Islandora Struck Out On Its Own - Islandora Foundation

Ed Summers: VirtualEnv Builds in Sublime Text 3

planet code4lib - Tue, 2015-05-05 18:52

Back in 1999 I was a relatively happy Emacs user, beginning work at a startup where I was one of the first employees after the founders. As at many startups, the founders were hackers as well as owners, and were routinely working on the servers. When I asked if Emacs could be installed on one of the machines I was told to learn Vi … which I proceeded to do. I needed the job.

Here I am 15 years later, and I'm finally starting to use Sublime Text 3 a bit more in my work. I'm not a cool kid anymore, but I can still pretend to be one, eh? The Vintageous plugin lets my fingers feel like they are in Vim, while being able to take advantage of other packages for editing Markdown, interacting with Git, and the lovely eye-pleasing themes that are available. I still feel a bit dirty because, unlike Vim, Sublime is not open source; but at the same time it does feel good to support a small software publisher who is doing good work. Maybe I'll end up switching back to Vim and supporting it.

Anyway, as a Python developer one thing I immediately wanted to be able to do was to use my project's VirtualEnv during development, and to run the test suite from inside Sublime. The Virtualenv package makes creating, activating, deactivating, and deleting a virtualenv a snap. But I couldn't seem to get the build to work properly with the virtualenv, even after setting the Build System to Python - Virtualenv.

After what felt like a lot of googling around (it was probably just 20 minutes) I didn’t seem to find an answer until I discovered in the Project documentation that I could save my Project, and then go to Project -> Edit Project and add a build_systems stanza like this:

{ "folders": [ { "path": "." } ], "virtualenv": "/Users/ed/.virtualenvs/curio", "build_systems": [ { "name": "Test", "shell_cmd": "/Users/ed/.virtualenvs/curio/bin/python setup.py test" } ] }

Notice how the shell_cmd is using the Python executable in my VirtualEnv? After saving that I was able to go into Tools -> Build System and set the build system to Test, which matches the name of the build system added in the JSON. Now a command-B will run my test suite with the VirtualEnv.
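
One thing worth noting: build_systems is a plain JSON array, so you can keep several entries in it, and each one shows up under Tools -> Build System. As a minimal, untested sketch (assuming the same virtualenv path as above, and a hypothetical project that uses pytest), a second entry could run pytest through the virtualenv's interpreter:

    "build_systems": [
        {
            "name": "Test",
            "shell_cmd": "/Users/ed/.virtualenvs/curio/bin/python setup.py test"
        },
        {
            // hypothetical entry: runs pytest from the same virtualenv;
            // Sublime's lenient settings parser tolerates comments like this
            "name": "Pytest",
            "shell_cmd": "/Users/ed/.virtualenvs/curio/bin/python -m pytest",
            "working_dir": "$project_path"
        }
    ]

Selecting Pytest under Tools -> Build System would then make command-B run pytest instead.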

I guess it would be nice if the VirtualEnv plugin for Sublime did something to make this easier. But rather than go down that rabbit hole I decided to write it down here for the benefit of my future self (and perhaps you).

If you know of a better way to do this please let me know.

Access Conference: AccessYYZ Registration Update

planet code4lib - Tue, 2015-05-05 17:57

Thanks to all of you who have been eagerly waiting for registration to open! We can tell there are a lot of you (because we can read your minds, but also because our site gives us nifty analytics). You can register by clicking over to the registration page. There's a lot of useful information about diversity scholarships, social events, hotels, and travel discounts, so please make sure you read the whole page before completing your registration.

EDIT (Wed, May 6th, 2015 at 8:18 ET) For those of you who are waiting to hear about proposals, please note that responses are in the process of getting sent out. If you haven’t gotten a response from us yet, don’t despair! Something will be coming your way tomorrow or Friday.

DPLA: Apply to host DPLAfest 2016!

planet code4lib - Tue, 2015-05-05 14:00

Hot on the heels of the successful second DPLAfest, we’re looking for the next great site to host next year’s interactive, productive, and exciting event. DPLAfest is an annual event that brings together hundreds of people to celebrate the Digital Public Library of America, our many partners across the country, and our large and growing community of practitioners and members of the public who contribute to, and benefit from, DPLA.

SCENES FROM DPLAfest 2015 IN INDIANAPOLIS

DPLAfest 2015 was co-hosted by the Indianapolis Public Library, Indiana State Library, Indiana Historical Society, and the IUPUI University Library. Those great institutions were proud to host well over 300 attendees from across the world for two days of discussions, workshops, hands-on activities, and fun events.

DPLAfest host organizations are essential contributors to one of the most prominent gatherings in the country involving librarians, archivists, and museum professionals, developers and technologists, publishers and authors, teachers and students, and many others who work together to further the mission of providing maximal access to our shared cultural heritage. For colleges and universities, DPLAfest is the perfect opportunity to directly engage your students, educators, archivists, librarians and other information professionals in the work of a diverse national community of information and technology leaders. For public libraries, hosting DPLAfest brings the excitement and enthusiasm of our community right to your hometown, enriching your patrons’ understanding of library services through free and open workshops, conversations, and more. It’s also a chance to promote your institution nationally and internationally, given the widespread media coverage of DPLAfest and the energy around the event.

If this opportunity sounds right for you and your organization, let us know! We are calling on universities and colleges, public libraries, archives, museums, historical societies, and others to submit expressions of interest to serve as hosts or co-hosts for DPLAfest 2016, which will take place in mid-April 2016.

To apply, review the information below and submit an expression of interest on behalf of your organization via the form at the bottom of this page. The deadline to apply is Wednesday, July 15, 2015. We will follow up with the most promising proposals shortly following the deadline.

Collaborative applications (such as between a university and a nearby public library) are encouraged. Preference will be given to applicants who can provide venue spaces which are closely located to one another, or in the same building complex or campus. Please note that some host partners can contribute staffing or other day-of support in lieu of venue space.

You can learn more about DPLAfest here. Questions? Email info@dp.la.

Requirements of a DPLAfest 2016 Hosting Site

  • Willingness to make local arrangements and coordinate with DPLA staff and any/all staff at the host institution.
  • An auditorium or similar space suitable for a keynote presentation (minimum 250 people).
  • 10 or more smaller rooms for “breakout” sessions (30 – 50 people).
    • Preference will be given to hosts that can provide breakout rooms equipped with projection/display capabilities.
  • Availability of wireless network for all attendees, potentially in excess of 300 simultaneous clients, for free or via conference sponsorship.
  • An organizational commitment to donate use of all venue spaces. (As a small non-profit with limited funds, and a strong desire to keep DPLAfest maximally open to the public, we're unable to pursue host proposals that cannot offer free or deeply discounted use of venue spaces.)
  • Ability to provide at least one staff person for every venue space to help with day-of AV support, logistical support, etc.
  • Commitment to diversity, inclusion, and openness to all.

Additional Desirable Qualities

  • Proximity to a major airport and hotels.
  • Co-location of proposed event spaces (i.e., same building or nearby buildings).
  • Location outside of the Midwest or Boston, MA area (we're rotating the location of DPLAfest each year; we celebrated DPLAfest 2013 in Boston and DPLAfest 2015 in Indianapolis).

Eric Lease Morgan: Loyola Marymount University

planet code4lib - Tue, 2015-05-05 13:43

Twenty new EAD files have been added to the “Catholic Portal” from Loyola Marymount University — http://bit.ly/1DQkBa7

Library of Congress: The Signal: Insights Interview: Josh Sternfeld on Funding Digital Stewardship Research and Development

planet code4lib - Tue, 2015-05-05 13:33

The 2015 iteration of the National Agenda for Digital Stewardship identifies high-level recommendations, directed at funders, researchers, and organizational leaders, that will advance the community's capacity for digital preservation. As part of our Insights Interview series we're pleased to talk with Josh Sternfeld, a Senior Program Officer in the Division of Preservation and Access at the National Endowment for the Humanities.

The NEH has consistently funded research that addresses the most pertinent issues related to digital stewardship. Its recently revised Research and Development grant program seeks to address major challenges in preserving and providing access to humanities collections and resources, and Josh will help us understand the new application guidelines and NEH's perspective on digital stewardship. The deadline for submitting an application is June 25, 2015.

Josh has posted several times on the Signal and we interviewed him about his background and NEH’s digital stewardship interests back in March 2012.

Butch: This year NEH has decided to break its funding for Research and Development into two tiers: Tier I for short-term and Tier II for longer-term projects. Talk about why NEH wanted to split the funding up this way.

Funding by user howardlake on Flickr.

Josh: First of all, thank you for this opportunity to discuss the exciting changes to our grant program! Last year, my colleagues and I in the Division of Preservation and Access undertook an intensive year-long review of our Research and Development program. We reached out to the field, including participants in the 2014 NDSA Digital Preservation Conference, to listen to practitioners’ needs. We discovered that the landscape of research has changed dramatically in very short order. For starters, new content formats (especially in the digital space) are emerging and changing the way we understand the humanities. Yes, tools and platforms are critical for the work of humanities scholars, educators, curators, archivists, librarians and students, but just as important is the need to establish standards, practices, methodologies and workflows to promote resource sharing, evaluation, and collaboration.

By introducing the Tier I grant, we believe we can seed projects at all stages of development, from early conceptualization to advanced implementation. In addition, we want to support discrete research and development projects. Sometimes, a small team of humanities practitioners and scientists can assemble rapidly to collect critical data for the field. Altogether, the combination of short- and longer-term projects is intended to capture the fluid dynamic that we see arising from within cultural heritage research and development.

Butch: Give us a little more detail on each of the funding Tiers and examples of the kinds of projects you’d like to see under each.

Josh: We see Tier I as a promising entry point for a wide variety of project types, from the planning of large, multi-year collaborative projects to standalone projects such as basic research experiments, case studies, or tool development. Tier I projects, therefore, may be used to accomplish an expansive range of tasks. For example, a team creating an open source digital asset management system wants to include additional functionality that takes the platform out of its "beta" phase. A group of information scientists, working with humanities scholars, wants to investigate the efficacy of a new linked open data model. Or a group of computer scientists wants to test a new approach to search and discovery within a large humanities data corpus.

At the Tier II level, NEH continues to support projects at an advanced implementation stage. Projects at this level must investigate the development of standards, practices, methodologies or workflows that could be shared and adopted by a wider community of practitioners.

For both tiers, we encourage collaboration across the humanities and sciences, whether information, computer, or natural. We believe pairing people from disparate backgrounds poses the best opportunity to accomplish positive outcomes for cultural heritage. We have included possible research topics and areas in our guidelines (pdf) that may provide some guidance, although please bear in mind the list is not intended to be comprehensive.

Butch: Do you foresee that projects originally funded under Tier I will return for Tier II funding down the road?

Josh: Yes, but it is not a prerequisite to apply. After reviewing many successful R&D projects over the years, we learned that the keys to a successful project begin with considerable planning, preparation, preliminary research and in some instances, prototyping, all of which would be eligible for Tier I support. Even if a project team does not continue into a formal implementation stage, a Tier I project can still provide a tremendous benefit to the field.

Butch: The digital stewardship community has often been challenged in securing stewards and funding support for tools and services that have grown to become part of the community infrastructure, such as web archiving tools. How does NEH see itself in terms of helping to develop and sustain a long-term digital stewardship infrastructure?

Josh: We envision the digital stewardship community, along with the wider cultural heritage R&D community, as building on an expanding scaffolding of data, tools, platforms, standards and practices. Each element has its role in advancing knowledge, forming professional connections and advancing the cause of providing better long-term preservation and access to humanities collections. One of the most gratifying parts of our job is to see how a standard under development and supported by R&D funding is eventually used in projects supported through our other grant programs. We think R&D can have the greatest impact by supporting the development of the elements that serve as the practical and theoretical glue binding the work of the humanities. For this reason, the grants do not support direct infrastructural development, per se, but rather applied research that leads to fundamental changes in our approach to stewardship.

Butch: Starting in 2016, the NEH will host an annual Research and Development Project Directors’ Meeting. Tell us about this meeting and how it will help publicize digital stewardship projects and research.

Josh: Compared to the sciences, the cultural heritage community perhaps has fewer opportunities to reflect upon major preservation and access-related challenges in the field in a public forum. Whether we are considering open access of humanities content, the crisis in audiovisual preservation and access, or a host of other topics, these challenges are clearly complex and demand creative thinking. Starting next spring, NEH will host an open forum that will not only provide recently awarded Project Directors the opportunity to showcase their innovative work, but will also encourage participants to think beyond their own projects and offer expert perspective on a single pre-selected issue. I don’t have much more to share at this stage, but I encourage everyone to stay tuned as information becomes available!

Butch: The revised NEH funding approach seems designed to help build connections across the digital stewardship community. How concerned is NEH and organizations like it about the “silo-ing” of digital stewardship research?

Josh: Maintaining active and productive research connections is essential for the success of digital cultural heritage research and development. It is the reason why, starting this year, we are requiring Tier II applicants to supply a separate dissemination proposal describing how research findings on standards, practices, methodologies and workflows will reach a representative audience. Research in digital stewardship has matured in recent years. Project teams can no longer rely on uploading a set of code and expecting a community to form magically around its sustainability. Thankfully, there are so many resourceful ways in which researchers can reach their constituency from holding in-person and virtual workshops, to code sprints, to developing online tutorials, to name just a few possibilities.

Butch: The 2015 National Agenda published last fall included a number of solid recommendations for research and development in the area of digital stewardship. In addition to applying for funds from NEH, what can NDSA member organizations concentrate on that will benefit the community as a whole?

Josh: NDSA has done a wonderful job crystallizing the R&D needs of specific areas and drawing attention to new ones. My recommendation, therefore, comes from social, rather than technical, considerations. I think first and foremost NDSA members should not be afraid to self-identify with the cultural heritage research and development community. All too often during our internal review we found that humanities practitioners were content working with the "status quo" as far as tools, platforms, standards, practices and methodologies are concerned. As a consequence, a lot of time and energy is spent adapting commercial or open source tools that were produced with entirely different audiences in mind. As soon as those in cultural heritage realize that their needs are distinct from those of other disciplines, they can begin to form the necessary partnerships, collaborations, programming, and project focus.

Hydra Project: OR2015 program announcement, registration reminder

planet code4lib - Tue, 2015-05-05 08:06

Of interest to Hydranauts

OR2015 NEWS: Full Program Available; Early Registration Deadline Friday; Sign Up for Workshops

Dear Colleagues,

We are pleased to announce that full program and schedule details for Open Repositories 2015, taking place in Indianapolis on June 8-11, are now available on the conference website at http://www.or2015.net/

The program for this 10th Open Repositories conference includes:

– keynote talks from Kaitlin Thaney of Mozilla Science Lab and Anurag Acharya of Google Scholar

– a mix of workshops, tutorials, papers, panels, 24×7 presentations, posters, and “repository rants and raves” addressing a wide variety of topics related to digital repositories and the roles they play in supporting open scholarship, open science, online cultural heritage, and research data

– a Developer Track that includes informal presentations and demonstrations showcasing community expertise and progress

– interest group sessions focused on the open source DSpace, EPrints, and Fedora (including Hydra and Islandora) repository platforms

– an Ideas Challenge enabling small teams to collaborate on proposing new ideas for moving repositories forward (with prizes)

Coupled with a variety of social activities to help support networking with colleagues from across the globe, along with exhibit tables from conference sponsors, OR2015 should make for a rewarding experience for anyone working in the repositories space.

** Reminder: Discounted Early Registration Ends Friday, May 8 **

Online registration for OR2015 is open, and participants can save $50 by registering by this Friday, May 8. Special negotiated room rates are available at the conference hotel until May 16. For more information, please visit the conference website: http://www.or2015.net/

All conference participants, including those with accepted presentations, need to register in order to attend the conference.

** Sign Up for Workshops and Tutorials **

If you have already registered for OR2015 and are planning to participate in workshops or tutorials on the first day of the conference, Monday, June 8, please visit http://www.or2015.net/workshops to sign up for the sessions you plan to attend. Workshops and tutorials are included in the registration fee, but separate signup is required in order to guarantee a seat.

We look forward to seeing you at OR2015!

Holly Mercer, William Nixon, and Imma Subirats

OR2015 Program Co-Chairs

Jon Dunn, Beth Namachchivaya, Julie Speer, and Sarah Shreeves

OR2015 Conference Organizing Committee

DuraSpace News: VIVO v1.8 is Now Available

planet code4lib - Tue, 2015-05-05 00:00

The VIVO team has announced that VIVO v1.8 is now available with key features and improvements. The VIVO Project is an open source, open ontology, open process platform for hosting information about the interests, activities and accomplishments of faculty and students, providing an integrated view of the scholarly work of an organization.

District Dispatch: The hierarchy of creative people

planet code4lib - Mon, 2015-05-04 21:25

Photo by David Lapetina

A coalition formed "to combat copyright piracy and demonstrate the value of creativity," Creative America, has changed its name to CreativeFuture. Major motion picture and television companies initially formed this group, which is now considering the future, the present day, and, no doubt, the past as well.

CreativeFuture now has individual members as well as industry and trade groups. These new members call themselves "the creatives." Apparently, by calling themselves the creatives, they are a specially placed group, distinct from other people who create.

Who are the CreativeFuture creatives? Television and film executives, producers, screenwriters, actors and others in the entertainment industry. They argue that "copyright should protect creatives from those who would use the internet to undermine creativity." In case you were wondering, "those" are people who use the internet to allegedly infringe copyright by copying and distributing protected content.

They are incorrectly called “pirates,” because, well, it sounds more creative. The icing on the cake is the compelling narrative that goes along with the label. The story goes that if piracy [sic] is unchecked, the entertainment industries will go bankrupt, thousands of people who work for the industry will lose their jobs, and the world will miss out on the fantabulous creative works that the United States provides. And if the creatives grow disillusioned, there will come a time when the creatives will have no reason to create anymore. Only people who are creative—but not as creative as the creatives—will create their subpar content. The world will suffer.

Other than the creatives, who else should be protected by copyright law? The public. The grand feature of the copyright law is that it serves not only the interests of creators and rights holders but also the information-seeking (and consumer-buying) needs of the public. Free expression and learning should be protected as well because they in turn advance knowledge and create new works. This is how the progress of science and the useful arts happens.

Re:create, a new copyright coalition, wants to direct more attention to the public, people who create, and new and emerging creators. Piracy [sic] is bad, but making extreme attempts to control it with laws like SOPA is overkill, and ultimately only favors the creatives, the companies they work for, and their legacy business models.

In closing, I will end my tongue-in-cheek rant with a plea. The creatives say that they "must be part of the conversation and stand up for creativity." We all support creativity already, but the creatives, always craving attention, want to stand up higher and be seen (or heard at a Congressional hearing). I say that the concerns of the public need more attention. Let's move forward while preserving a balanced copyright law. Don't miss your chance to be heard.

Coral Sheldon-Hess: Recipe – Sweet potato and black bean hash

planet code4lib - Mon, 2015-05-04 19:58

This is delicious by itself, with rice, as a taco or quesadilla filling, or, if you want to combine it with some scrambled eggs and a little cheese, in a breakfast burrito. Sour cream goes nicely with it, especially if you get it too spicy. ;) Although it’s very good (and rich and filling) with the bacon, I don’t think you strictly need it. If you prefer to go vegetarian, just increase the other oil and leave the bacon out; it’ll still be good.

This, much more than the last recipe, will give you a peek into how I generally cook. (Spoiler: Makin’ it all up as I go.) I started out to make basically this, but I didn’t bother to look it back up (or I’d have known I had WAY too many sweet potatoes :)); also, I knew I was going to substitute some coconut oil in place of some of the bacon fat.* If olive oil is your thing, do that instead; just keep an eye on the temperature so it doesn’t smoke.

About halfway through cooking it, I changed my mind and decided to make something spicier, a little more like Mexican food and a little less like Southern food; hence, beans and all the spices past the sage. I was working from my recollection of something I liked to order back when I lived in Pittsburgh, at a (now sadly closed) restaurant called The Quiet Storm, and I think I got the spice combo right. But I wish I’d measured, so that I could share exact amounts with you. Below are my estimates.

Ingredients:

  • 32 oz sweet potatoes, minus a few weird-looking chunks
  • 1 lb bacon (minus a few strips that became breakfast), drained, but reserve the fat
  • 15 oz can of black beans, rinsed
  • ~3 Tbsp coconut oil + ~1 Tbsp bacon fat; you can add more if it starts sticking to the pan too badly
  • sage – fresh is pretty great, dried is fine; I used 4 fresh leaves plus probably a teaspoon of dried
  • chili powder – at least a teaspoon, probably more like 2
  • onion powder – just a dash
  • garlic powder – a dash
  • oregano – about a teaspoon?
  • cayenne – maybe 1/4-1/2 tsp, depending on how spicy you want to go
  • salt – to taste
  • a little water

Method:

If you didn't buy pre-cut sweet potatoes, peel and chop yours. It will cook faster if you shred them, rather than cutting them into cubes. I like having them cubed, but I think I watched three episodes of a TV show on my cookbook (er, iPad) while this was cooking, just so you know.

Cook the bacon however you like to cook bacon. I used a skillet, patted the cooked bacon dry with a paper towel, and then poured all of the fat from the skillet into a measuring cup. I gave the bacon time to cool and ate some breakfast. :) You will eventually want to chop the bacon into little pieces, but you’ll have time for that while the sweet potatoes cook.

Put a little bit of the bacon grease (maybe 1 Tbsp, maybe a smidge more) back into your skillet with the coconut oil. It's going to look like too much oil, but 32 oz of sweet potatoes will eat a LOT of oil while they cook. Let the oil get good and hot (I kept the burner on medium the whole time), and if you're using fresh sage, drop the leaves in and let them sizzle for just a bit before you dump in the sweet potatoes. Dried sage can go in a little later.

Get the sweet potatoes covered in oil, and then let them heat. You’ll want to stir them up every so often, maybe every 5 or so minutes if you’re an antsy cook like me. For something more like hashbrowns, you want to be more patient.

Chop up your bacon. Once the sweet potatoes are hot — not even cooked through, just hot — it’s cool to add the spices and throw the bacon back in.

About 15 minutes after you add the spices and the bacon, go ahead and add the (rinsed and drained) beans. You’ll want to add water (maybe about a third of a cup?) from time to time, after the beans go in, because they’re prone to drying out.

When everything’s all cooked through, or you’re bored and just want to finish it in the microwave, it’s done. :)

* If you use locally grown bacon from happy pigs that aren't eating corn, probably using all the bacon fat is a fine choice, but I wasn't. I'm sorry. One of the perks of having a full-time job again is going to be a return to buying more locally and more ethically, just in general; but for now, I just do what I can.

Peter Murray: From NISO: Invitation to NISO Patron Privacy Virtual Meetings

planet code4lib - Mon, 2015-05-04 19:44

Over the next couple of months, NISO is managing a project to "develop a Consensus Framework to Support Patron Privacy in Digital Library and Information Systems."1 I'm honored and excited to be on the panel exploring this topic and creating the recommendations, as this is a topic I've written about extensively on this blog. In May and June, NISO is conducting virtual meetings on four topics that will lead up to a day and a half of in-person discussion at the ALA annual meeting at the end of June in San Francisco. Reproduced below is the invitation for people to listen in on the virtual meeting discussions. I hope (and expect) that there will be a Twitter hashtag for those participating in the call (whether on the panel or in the audience) to add their thoughts; the #nisoprivacy hashtag will be used to gather the discussion online.

As announced last month, NISO, the National Information Standards Organization, has launched an initiative to develop a Consensus Framework to Support Patron Privacy in Digital Library and Information Systems with generous support from the Andrew W. Mellon Foundation. The project involves a series of community discussions on how libraries, publishers and information systems providers can build better privacy protection into their operations, and the subsequent formulation of a framework document on the privacy of patron data in these systems.

We are pleased to announce the availability of a limited number of listen-only “seats” to the virtual meetings that comprise the first phase of the project. The virtual meetings will involve a range of industry participants including librarians, publishers, system vendors, legal experts and general non-profit participants, discussing various ‘lenses’ of patron privacy. The dates and times of these events are scheduled as follows:

  • Patron privacy in internal library systems: Thursday, May 7, 10:00 am-1:00 pm ET
  • Patron privacy in vendor systems: Thursday, May 21, 10:00 am-1:00 pm ET
  • Patron privacy in publisher systems: Friday, May 22, 9:00 am-12 noon ET
  • Legal frameworks influencing data sharing and policies: Friday, June 19, 1:00-4:00 pm ET

If you would like to attend any of these meetings as a listen-only guest, please fill out the RSVP form at https://www.surveymonkey.com/s/niso-patron-privacy. [Registration for each meeting will close at noon Eastern time the day before the meeting.]

Each of these virtual meetings will be a three-hour web-based session designed to lay the groundwork for an in-person meeting at the conclusion of the American Library Association meeting in San Francisco, CA in June. We plan to make a live stream of that meeting available to the community. More information about that video stream of the meeting will be distributed next month.

Following the in-person meeting, a Framework document will be completed detailing the privacy principles and recommendations agreed to by the participants, and then circulated for public comment and finalization. More information, including a version of the project proposal, is available on the NISO website at: http://www.niso.org/topics/tl/patron_privacy/.

Thank you for your interest in this important topic that faces the library and information communities.

Footnotes
  1. From NISO’s March 11, 2015, press release about the project.