Feed aggregator

HangingTogether: The end of an era — goodbye to Jim Michalko

planet code4lib - Tue, 2016-03-01 00:19

Today is the day when we say goodbye to our leader and colleague Jim Michalko. Rather than wallowing in our loss, we’d like this post to celebrate Jim’s accomplishments and acknowledge his many wonderful qualities.

Photo: Jim Michalko, February 2016

Before OCLC, Jim was the president of the Research Libraries Group (RLG). He came to RLG from the administration team at the University of Pennsylvania Libraries in 1980. In those relatively early days of library automation, RLG was very much a chaotic start-up. Jim, who held both an MLS and an MBA, came on as the business manager and, as part of the senior administrative team, helped get the organization on more stable footing. He was named RLG president in 1989.

In 2006, Jim once again played a key role in a time of uncertainty, helping to bring RLG into the OCLC fold. This included both integrating RLG data assets into OCLC services and bringing forward programmatic activities into OCLC Research. A key part of those programmatic activities is collaboration with the research library community, and the OCLC Research Library Partnership is a key component in driving our work agenda. Under Jim’s leadership, the Partnership has grown from 110 institutions in 2006 to over 170 today, including libraries at 25 of the top 30 universities in the Times Higher Education World University Rankings.

Jim is a wise and gentle leader with a sardonic sense of humor. We’ve appreciated his ability to foster experimentation (and his patience while those experiments played out), his willingness to get obstacles out of our way so that we can get our work done, his tolerance of our quirks and other personal qualities, and his ability to maximize our strengths.

Jim’s retirement is part of a larger story playing out across the research library community, as those who have overseen generations of change in technology, education, and policy move on. We will honor these leaders by following in their footsteps, while reminding ourselves that the path they set was marked by innovation.

 

About Merrilee Proffitt


LibUX: 034 – How “UX as a Measurement” Leads to “Service Design”

planet code4lib - Tue, 2016-03-01 00:07

You might remember that in our 2016 Design Predictions episode, my number one was that we are going to see an explosion of “Service Design” in writeups, job descriptions, and the like. I hadn’t really heard about Service Design until winter 2015, but as I was editing this episode (a recut of a talk from the previous June) my spiel about conceptualizing the user experience as a measurement led into a totally unintended talk about service design. This makes sense: when we think about UX as a measurement, we are thinking about holistic experiences that transcend the screen and reflect back at us the quality of the services we provide.

Every service design decision you make has a performance payoff.

On the user experience as a measurement

Also, the slides from the above.

If you like, you can download the MP3.

As usual, you support us by helping us get the word out: share a link and take a moment to leave a nice review. Thanks!

You can subscribe to LibUX on Stitcher, iTunes, or plug our feed right into your podcatcher of choice. Help us out and say something nice. You can find every podcast on www.libux.co.


DuraSpace News: Osmania University Offers "Live DVD" for DSpace and Joomla Installation

planet code4lib - Tue, 2016-03-01 00:00

From P. Ramesh, Senior Technical Officer and Asst. Professor, Department of Library and Information Science, Osmania University

Hyderabad, India: A team at Osmania University has developed a live DVD, formally released in 2015, for installing DSpace 5.2 and Joomla 3.4.5. The live DVD is very useful for library and information science students, teachers and professionals. More than 650 people from 40 countries have downloaded and are using this file.

District Dispatch: CopyTalk cancelled for March

planet code4lib - Mon, 2016-02-29 23:22

The CopyTalk webinar is cancelled for March.

IMPORTANT ANNOUNCEMENT:

Due to circumstances beyond any normal human’s control, we will not have a CopyTalk webinar for the month of March. We will be back on schedule April 7, 2016, the first Thursday of the month!

Upcoming webinars will focus on music, video and best practices for fair use regarding art resources.

Stay tuned!


District Dispatch: Free webinar: LibERate the Telecommunications Act of 1996! — Making E-Rate Make Sense

planet code4lib - Mon, 2016-02-29 21:15

Patrons using Wi-Fi at the MLK Digital Commons in Washington D.C.

WHAT:  Free PLA webinar! Presented in partnership with the ALA Office for Information Technology Policy (OITP).

WHEN:  Thursday, 3/3/2016

  • 2:00 PM-3:00 PM (Eastern)
  • 1:00 PM-2:00 PM (Central)
  • 12:00 PM-1:00 PM (Mountain)
  • 11:00 AM-12:00 PM (Pacific)

We’ve all heard about the massive changes to E-rate over the last couple of years. As we’re in the midst of filing for the 2016–2017 year, there are some changes that you don’t want to miss. You have probably heard there is less money for telephone services this year. Don’t let that get you down! It just means there’s even more money to support broadband access and connectivity. Libraries have been providing the public with access to information since the earliest days of our national independence, which means many of us occupy historic, beautiful buildings that were never built with today’s cabling and WiFi needs in mind. The FCC wants to help us get past those and other obstacles, and improve our ability to keep our citizenry informed. In this free webinar, OITP staff and guests will touch on E-rate as a program, but really delve into some tools you want to have handy when you’re filing this year and in years to come!

Learning Outcomes

At the conclusion of this webinar, participants will:

  • Know about changes to the E-rate program that can help you improve internet access and WiFi connectivity in your library;
  • Discover new resources to support you as you navigate the E-rate application process and set yourself up for success; and
  • Understand what you need to have on hand to start filing for FY16.

Who Should Attend

Representatives from any public library planning to file for E-rate funding. The webinar is appropriate for those new to E-rate as well as those with previous E-rate experience.

Instructors:

Emily Almond is Director of IT for the Georgia Public Library Service. After starting her career at CNN, she worked at Emory University as a systems librarian and then at the Atlanta Journal-Constitution as an archive manager and a project manager for ajc.com. She has experienced the ways in which technology can transform an organization and, further, the ways in which quality leadership and smart management can apply the right technology in the right instances to achieve strategic goals. Emily holds a B.S. in Journalism from Kennesaw State University and an MLIS from Florida State University.

Amber Gregory has worked with the E-rate program since 2010 as the coordinator of E-rate Services at the Arkansas State Library where she helps public libraries navigate the program. Amber is currently a member of the American Library Association’s E-Rate Task Force.

Wendy Knapp is the associate director of Statewide Services at the Indiana State Library.

Marijke Visser is associate director of ALA’s OITP where she is responsible for broadband adoption and all of ALA’s work on E-rate issues. She came to OITP in 2009 to support a grant project funded by the Bill & Melinda Gates Foundation looking at broadband capacity in public libraries. She is also program director for OITP’s emerging portfolio on children, youth, and technology. She co-chairs the EdLiNC coalition, which promotes E-rate policy for libraries and schools at the national level. In addition to E-rate, Marijke supports the Program on Networks, focusing on broadband adoption issues for diverse populations.

Registration

Cost:  THIS WEBINAR IS FREE, BUT REGISTRATION IS REQUIRED AND SPACE IS LIMITED.

You can register for this webinar until it begins, or until space is no longer available, whichever comes first. Please do not register unless you are sincere about attending the live webinar. Space is limited, and signing up and not attending may deprive someone else of the opportunity. Thank you for your cooperation.

How to Register

REGISTER NOW!  Click Register to continue the online registration process.

 

If you have a physical or communication need that may affect your participation in this webinar, please contact us at plawebinars@ala.org or 800-545-2433 ext. 5PLA (5752) at least one week prior to the registration deadline above. Without prior notification of need, we cannot attempt to provide appropriate accommodations.


Tech Requirements

This webinar will be presented using the WebEx platform. You may listen to the audio portion of the webinar via your computer’s speakers, headphones plugged into your computer’s audio jack or USB port; or by dialing in with your telephone (your carrier’s charges may apply) or Skype (by following the process outlined by Skype to place calls to land lines). We suggest that groups, especially larger groups, plan ahead to use an LCD/LED projector in the room to project the webinar. Groups will also want to have speakers or a sound system capable of amplifying the webinar audio for the entire room. No microphone is required.

PLEASE NOTE: PLA provides its webinar audio through voice over IP (VoIP), which means the sound comes through speakers or headphones plugged into your computer. PLA works with its webinar platform provider to assure the highest quality audio is being delivered to attendees. However, variables over which PLA has no control—such as the speed of your Internet connection or traffic on your local network—can affect the end quality of the webinar audio delivered by your computer. Each webinar’s audio is also available by teleconference via a toll number, so we recommend you have access to a long-distance enabled phone as a backup in case you experience audio issues with VoIP. If you do encounter any problems during the webinar, you will receive a link to its archived recording within a week of the live event and can review anything you missed.

Contact

Questions about this webinar? Please contact us at plawebinars@ala.org or 800-545-2433 ext. 5PLA (5752). For questions about webinar registration, please call 800-545-2433 ext. 5.


District Dispatch: Libraries recognized at House Energy and Commerce hearing on 3D printing

planet code4lib - Mon, 2016-02-29 15:18

Photo: U.S. Capitol, by Jonathon Colman via Flickr

Rep. Cardenas credits libraries as leaders of the maker movement

On Friday, the House Energy and Commerce Committee’s Subcommittee on Commerce, Manufacturing and Trade held a hearing exploring the implications of the rapid takeoff of 3D printing in this country and beyond. Witnesses included Alan Amling of UPS, Edward Herderick of General Electric, Ed Morris of the National Additive Manufacturing Innovation Institute (NAMII), also known as America Makes, and Neal Orringer of 3D Systems.

The hearing touched on a myriad of topics, including the emerging field of bioprinting (the printing of human organs), the impact of 3D printing on the supply chain, and the consequences of the rise of 3D-printed prosthetics for the public as well as the medical device industry. However, it didn’t start getting good for libraries until the issue of public access to 3D printing and its benefits to students and the workforce came to the fore. Rep. Yvette Clark (NY-9) raised the topic. Not long after, the Chairman handed the floor to Rep. Tony Cardenas (CA-29).

Rep. Cardenas led with a nod to libraries as leaders of the maker movement, followed by an inquiry into the witnesses’ commitment to supporting the learning, innovation and workforce development the library community facilitates through 3D printing:

We’ve noticed that in America’s libraries, we’ve had an increase of opportunities…Libraries are investing in 3D printers – now to the tune of over 400 libraries, at little-to-no cost to individuals going to the library. For me, this is a very important issue for making sure we [provide] access to as many minds, as many inquisitive folks [as possible], so that they can get turned on to how wonderful it is, and to the potential of getting a job in the industry. How committed is the industry to advancing that kind of effort?

Neal Orringer of 3D Systems responded by trumpeting his company’s recent partnership with the Young Adult Library Services Association (YALSA) on the MakerLab Club initiative. “We need to do more like this (the MakerLab Club); it’s going to pay back dividends,” Orringer said.

Orringer also underscored the importance of helping libraries answer practical set-up and management questions so that they can connect their patrons to all of the benefits their 3D printing services have to offer. Ed Morris echoed this sentiment, emphasizing the need for organizations like his to ensure library professionals have the knowledge and the training they need to keep their 3D printers operating over the long-term. Rep. Cardenas concluded the thread on libraries by exhorting the industry leaders in attendance to view partnerships with libraries and other anchor institutions around 3D printing as “an investment in human capital.”

ALA is deeply grateful to Rep. Cardenas for his eloquent acknowledgement of the library community’s efforts to democratize and build skills through 3D printing technology. We hope that the discussion his questions sparked yields fruitful collaboration between libraries and 3D printing leaders across the public, private and non-profit sectors. For a video of Rep. Cardenas’ comments on libraries, click here. For a full video of the hearing, click here.


Open Knowledge Foundation: Sloan Foundation Funds Frictionless Data Tooling and Engagement at Open Knowledge

planet code4lib - Mon, 2016-02-29 12:58

We are excited to announce that Open Knowledge International has received $700,000 in funding from The Alfred P. Sloan Foundation over two years to work on a broad range of activities to enable better research and more effective civic tech through our Frictionless Data initiative. The funding will target standards work, tooling, and infrastructure around “data packages” as well as piloting and outreach activities to support researchers and civic technologists in addressing real problems encountered when working with data.

The Alfred P. Sloan Foundation is a philanthropic, not-for-profit grant-making institution based in New York City. Established in 1934 by Alfred Pritchard Sloan Jr., then-President and Chief Executive Officer of the General Motors Corporation, the Foundation makes grants in support of original research and education in science, technology, engineering, mathematics and economic performance.  

“Analyzing and working with data is a significant (and growing) source of pain for researchers of all types”, says Josh Greenberg, Program Director at the Alfred P. Sloan Foundation. “We are excited to support Open Knowledge International in this critical area. This support will help data-intensive researchers to be more efficient and effective.”

What is being funded?

The funding will support three key streams of work around data packages: (a) the further development of the data package suite of standards, (b) the creation and enhancement of a suite of tools and integrations around these standards, and (c) broad outreach and engagement to educate researchers about the benefits of this approach.

Standards

The Data Package standard is a simple, lightweight specification for packaging all types of data, with a special emphasis on tabular (e.g. CSV) data. As the sources of useful data grow, effective data-driven research is becoming more and more critical. Such research often depends on cleaning and validating data, as well as combining data from multiple sources, processes that are still frequently manual, tedious, and error-prone. Data packages allow for greater automation of these processes, thereby eliminating the “friction” involved.
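To make the idea concrete, here is a minimal sketch of a Tabular Data Package descriptor (datapackage.json), written out with plain Python; the dataset name, file path, and field definitions are hypothetical examples, not part of the Sloan announcement.

    import json

    # A minimal Tabular Data Package descriptor (datapackage.json).
    # The dataset name, path, and fields below are hypothetical examples.
    descriptor = {
        "name": "example-dataset",
        "resources": [
            {
                "name": "observations",
                "path": "observations.csv",
                "schema": {
                    "fields": [
                        {"name": "date", "type": "date"},
                        {"name": "site", "type": "string"},
                        {"name": "reading", "type": "number"},
                    ]
                },
            }
        ],
    }

    # Write the descriptor alongside the CSV it describes.
    with open("datapackage.json", "w") as f:
        json.dump(descriptor, f, indent=2)

Because the schema travels with the CSV, a validating or loading tool can check the data against its declared types instead of guessing them.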

Tooling and Integration

A key aspect of this work is that it aligns with researchers’ usual tools and will require few or no changes to existing data and data structures. To do this, we are seeking to build and support integrations with popular tools for research, for example R, STATA, LibreOffice, etc. In addition, we are looking to define ways of seamlessly translating datasets to and from typical file formats used across various research communities, such as HDF5, NetCDF, etc.
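As a sketch of what such an integration might feel like from the researcher’s side, the snippet below reads the first tabular resource listed in a datapackage.json into a pandas DataFrame. This is a hand-rolled illustration rather than one of the official Frictionless Data libraries, and it reuses the hypothetical file names from the example above.

    import json
    import pandas as pd

    # Load the descriptor and read its first tabular resource into pandas.
    with open("datapackage.json") as f:
        descriptor = json.load(f)

    resource = descriptor["resources"][0]
    df = pd.read_csv(resource["path"])

    # The schema travels with the data, so declared column types can be
    # compared against the DataFrame rather than guessed from the CSV.
    declared = {field["name"]: field["type"] for field in resource["schema"]["fields"]}
    print(declared)
    print(df.head())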

Community Outreach

While our core mission is to design a well-defined set of specifications and build a rich and vibrant ecosystem of tooling around them, none of this is possible without also building broad awareness of data packages, where to use them and their utility, and a sustainable group of engaged users to support this. To make our work in this area as effective as possible, we are building partnerships with organizations in research, civic tech, and government.

Be a part of the Frictionless Data future

We are looking to discover much more about the needs of different research groups and to identify the problems they might currently have.  To do this, we are running targeted pilots to trial these tools and specifications on real data.

Are you a researcher looking for better tooling to manage your data?  

Do you work at or represent an organization working on issues related to research data like DataCite, DataONE, RDA, or CODATA and would like to work with us on complementary issues for which data packages are suited?

Are you a developer and have an idea for something we can build together?

Are you a student looking to learn more about data wrangling, managing research data, or open data in general?

If any of the above apply to you, email us at frictionless@okfn.org. We’d love to hear from you. If you have any other questions or comments about this initiative, please visit this topic in our forum: https://discuss.okfn.org/t/sloan-foundation-funds-frictionless-data-at-open-knowledge/1928 or use the hashtag #frictionlessdata.

Stuart Yeates: Prep notes for NDF2011 demonstration

planet code4lib - Mon, 2016-02-29 06:56
I didn't really have a presentation for my demonstration at the NDF, but the event team have asked for presentations, so here are the notes for my practice demonstration that I did within the library. The notes served as an advert to attract punters to the demo; as a conversation starter in the actual demo; and as a set of bookmarks of the URLs I wanted to open.



Depending on what people are interested in, I'll be doing three things:

*) Demonstrating basic editing, perhaps by creating a page from the requested articles at http://en.wikipedia.org/wiki/Wikipedia:WikiProject_New_Zealand/Requested_articles

*) Discussing some of the quality control processes I've been involved with (http://en.wikipedia.org/wiki/Wikipedia:Articles_for_deletion and http://en.wikipedia.org/wiki/New_pages_patrol)

*) Discussing how wikipedia handles authority control issues using redirects (https://secure.wikimedia.org/wikipedia/en/wiki/Wikipedia:Redirect ) and disambiguation (https://secure.wikimedia.org/wikipedia/en/wiki/Wikipedia:Disambiguation )

I'm also open to suggestions of other things to talk about.

Stuart Yeates: Thoughts on the NDFNZ wikipedia panel

planet code4lib - Mon, 2016-02-29 06:55





Last week I was on an NDFNZ wikipedia panel with Courtney Johnston, Sara Barham and Mike Dickison. Having reflected a little and watched the YouTube recording at https://www.youtube.com/watch?v=3b8X2SQO1UA I've got some comments to make (or to repeat, as the case may be).

Many people, apparently including Courtney, seemed to get the most enjoyment out of writing the ‘body text’ of articles. This is fine, because the body text (the core textual content of the article) is the core of what the encyclopaedia is about. If you can’t be bothered with wikiprojects, categories, infoboxes, common names and wikidata, you’re not alone and there’s no reason you need to delve into them to any extent. If you start an article with body text and references, that’s fine; other people will, to a greater or lesser extent, do that work for you over time. If you’re starting a non-trivial number of similar articles, get yourself a prototype which does most of the stuff for you (I still use https://en.wikipedia.org/wiki/User:Stuartyeates/sandbox/academicbio which I wrote for doing New Zealand women academics). If you need a prototype like this, feel free to ask me.

If you have a list of things (people, public art works, exhibitions) in some machine-readable format (Excel, CSV, etc.) it’s pretty straightforward to turn them into a table like https://en.wikipedia.org/wiki/Wikipedia:WikiProject_New_Zealand/Requested_articles/Craft#Proposed_artists or https://en.wikipedia.org/wiki/Enjoy_Public_Art_Gallery (see the sketch below). Send me your data and what kind of direction you want to take it.
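As a rough illustration of how straightforward the conversion is, here is a small Python sketch that turns a CSV file (header row first) into MediaWiki table markup; the input file name and columns are hypothetical, and a real list would still want human checking before posting.

    import csv

    def csv_to_wikitable(path):
        """Convert a CSV file (header row first) into MediaWiki table markup."""
        with open(path, newline="", encoding="utf-8") as f:
            rows = list(csv.reader(f))
        header, body = rows[0], rows[1:]
        lines = ['{| class="wikitable sortable"']
        lines.append("! " + " !! ".join(header))   # header cells
        for row in body:
            lines.append("|-")                     # start a new table row
            lines.append("| " + " || ".join(row))  # data cells
        lines.append("|}")
        return "\n".join(lines)

    # Hypothetical input: a spreadsheet of artists exported as artists.csv
    print(csv_to_wikitable("artists.csv"))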

If you have a random thing that you think needs a Wikipedia article, add it to https://en.wikipedia.org/wiki/Wikipedia:WikiProject_New_Zealand/Requested_articles; if you have a hundred things that you think need articles, start a subpage, a la https://en.wikipedia.org/wiki/Wikipedia:WikiProject_New_Zealand/Requested_articles/Craft and https://en.wikipedia.org/wiki/Wikipedia:WikiProject_New_Zealand/Requested_articles/New_Zealand_academic_biographies, both completed projects of mine.

Sara mentioned that they were thinking of getting subject matter experts to contribute to relevant wikipedia articles. In theory this is a great idea, and some famous subject matter experts contributed to Britannica, so this is well-established ground. However, there have been some recent wikipedia failures, particularly in the sciences. People used to ground-breaking writing may have difficulty switching to a genre where no original ideas are permitted and everything needs to be balanced and referenced.

Preparing for the event, I created a list of things the awesome Dowse team could do as follow-ups to their craft artists work, but we never got to that in the session, so I've listed them here:
  1. [[List of public art in Lower Hutt]] Since public art is out of copyright, someone could spend a couple of weeks taking photos of all the public art and creating a table with clickable thumbnail, name, artist, date, notes and GPS coordinates. Could probably steal some logic from somewhere to make the table convertible to a set of points inside a GPS for a tour.
  2. Publish from their archives a complete list of every exhibition ever held at the Dowse since founding. Each exhibition is a shout-out to the artists involved and the list can be used to check for potentially missing wikipedia articles.
  3. Digitise and release photos taken at exhibition openings, capturing the people, fashion and feeling of those eras. The hard part of this, of course, is labelling the people.
  4. Reach out to their broader community to use the Dowse blog to publish community-written obituaries and similar content (i.e. encourage the generation of quality secondary sources).
  5. Engage with your local artists and politicians by taking pictures at Dowse events, uploading them to commons and adding them to the subjects’ wikipedia articles—have attending a Dowse exhibition opening being the easiest way for locals to get a new wikipedia image.
I've not listed the 'digitise the collections' option, since at the end of the day, the value of this (to wikipedia) declines over time (because there are more and more alternative sources) and the price of putting them online declines. I'd much rather people tried new innovative things when they had the agility and leadership that lets them do it, because that's how the community as a whole moves forward.

Journal of Web Librarianship: A Review of "The Complete Guide to Using Google in Libraries: Research, User Applications, and Networking, Vol. 2"

planet code4lib - Mon, 2016-02-29 05:37
Volume 10, Issue 1, January-March 2016, page 45
10.1080/19322909.2016.1123991
Dena L. Luce

Terry Reese: MarcEdit Update

planet code4lib - Sun, 2016-02-28 14:49

The update was posted Feb. 27 to all versions. It contains the following changes:

6.2.85

  • Enhancement: Characterset Detection: MarcEdit now includes a tool that performs a heuristic analysis of a file to provide best-guess characterset detection; see the sketch after this list. (http://blog.reeset.net/archives/1897)
  • Enhancement: Build New Tool Function: Adding a find macro to the function so that users can now identify specific fields when building new fields from data in a MARC record. (http://blog.reeset.net/archives/1902)
  • Update: Build Links — improved handling of MESH data
  • Update: Build Links — improved handling of AAT data
  • Update: Build Links — improved handling of ULAN data
  • Update: Build Links — added work around to character escaping issues found in .NET 4.0. Issue impacts URIs with trailing periods and slashes (/). Apparently, the URI encoding tool doesn’t escape them properly because of how Windows handles file paths.
  • Update: Build Links — Rules file updated to include refined definitions for the 6xx fields.
  • Update: MarcEdit Command-Line: program updated to include the new Build Links functional updates
  • Update: COM object: Updated character encoding switching to simplify streaming functions.
  • Update: Validate Headings: Integrated rules file into checking.
  • Bug Fix: Validate Headings: headings validation was being tripped by the URI escaping issue in .NET 4.0. This has been corrected.
  • Update: RDA Helper: Finished code refinements
  • Update: Build Links — tool is now asynchronous
  • Enhancement: Build Links — Users can now select and build their own rules files
  • Enhancement: Build Links — Tool now includes a function that will track resolution speed from linked services and attempt to provide notification when services are performing poorly. First version won’t identify particular services — just that data isn’t being processed in a timely manner.
  • Bug Fix: Character Conversion — UTF-8 to MARC-8, the {dollar} literal isn’t being converted back to a literal dollar sign. This is related to removing the fall back entity checking in the last update. This has been corrected.
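
MarcEdit’s own detection code isn’t shown here, but the general technique of best-guess characterset detection can be sketched in a few lines of Python with the chardet library. Note that a generic detector like this only guesses among common encodings (it won’t recognize MARC-8, for instance), so the result is a hint, not a fact.

    # pip install chardet -- a generic encoding detector, not MarcEdit's code
    import chardet

    def guess_charset(path, sample_size=64 * 1024):
        """Return a best-guess (encoding, confidence) pair for a file."""
        with open(path, "rb") as f:
            sample = f.read(sample_size)
        result = chardet.detect(sample)  # e.g. {'encoding': 'utf-8', 'confidence': 0.99}
        return result["encoding"], result["confidence"]

    # Hypothetical usage on a file of MARC records:
    encoding, confidence = guess_charset("records.mrc")
    print(f"Best guess: {encoding} (confidence {confidence:.2f})")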

Updates can be picked up through the automated update tools in MarcEdit or via the downloads page: http://marcedit.reeset.net/downloads

 

–tr

FOSS4Lib Recent Releases: DSpace CRIS - 5.4.0

planet code4lib - Sun, 2016-02-28 00:26

Last updated February 27, 2016. Created by David Nind on February 27, 2016.

Package: DSpace CRIS
Release Date: Thursday, February 25, 2016

FOSS4Lib Recent Releases: Koha - 3.22.4

planet code4lib - Sun, 2016-02-28 00:12
Package: Koha
Release Date: Saturday, February 27, 2016

Last updated February 27, 2016. Created by David Nind on February 27, 2016.

Monthly maintenance release for Koha. It includes three enhancements and 32 bug fixes.

See the release announcement for the details:

Koha 3.22.4 is the latest stable release of Koha and is recommended for new installations. The recommended installation method is to use the packages for Debian and Ubuntu, rather than the tar file or git.

Patrick Hochstenbach: Pen and Ink on a mountain

planet code4lib - Sat, 2016-02-27 06:46
Filed under: portraits, Sketchbook. Tagged: art, black, illustration, ink, pen, portrait, rotring, sketchbook, white

District Dispatch: Congress stands still; technology, the courts and fair use marches onwards

planet code4lib - Fri, 2016-02-26 19:27

Guest Blog Post by Tom Lipinski*


As I was preparing the readings for my doctoral seminar in Information Policy the other week, I ran across a Congressional Budget Office report from 2004 [Copyright Issues in Digital Media (A Congressional Budget Office Paper)]. The last part of the report evaluates four courses of action: forbearance (doing nothing), increasing the use of compulsory licensing, revising the law in favor of copyright holders, or revising the law in favor of users. Fast-forward to more recent discussions [Copyright Policy, Creativity, and Innovation in the Digital Economy (The Department of Commerce Internet Policy Task Force), July 2013, and Orphan Works and Mass Digitization: A Report of the Register of Copyrights, June 2015] and the same time-worn discussion of changes in the copyright law occurs without a viable solution.

Discussion of all four options continues; policy makers remain standing still. Thank goodness the wheels of justice, while at times slow moving, still turn. The courts have been addressing new technologies and copyright, often through the framework of fair use.

Reflecting on the intent of fair use, we see its flexibility as well as its predictability. Pamela Samuelson observes in her law review article entitled Unbundling Fair Uses [77 FORDHAM LAW REVIEW 2537 (2009)]: “The copyright fair use case law is more coherent and more predictable than many commentators seem to believe. Fair use cases tend to fall into common patterns…policy-relevant clusters …promoting freedom of speech and of expression, the ongoing progress of authorship, learning, access to information, truth telling or truth seeking, competition, technological innovation, and the privacy and autonomy interests of users. If one analyzes putative fair uses in light of cases previously decided in the same policy cluster, it is generally possible to predict whether a use is likely to be fair or unfair.” She is correct in that fair use cases do point the way forward. And while everyone is talking about the Authors Guild and Google litigations, other cases are reinforcing the public benefit and fair use of efforts to help Internet users corral and harness the vast amounts of information available. These cases are relevant to us in the library world, where a prime function of our service mission is to help our patrons access the ever-deepening sea of information through technological tools such as automated indexing, search and retrieval systems, data mining, etc. Tools like these are at the core of information work.

One recent case from the summer of 2015 underscores these fair uses and differentiates lawful from unlawful uses of the documents or resources these tools locate, identify, tag or retrieve. The case is Fox News Network, LLC v. TVEyes, Inc., 2015 WL 5025274 (S.D.N.Y.). TVEyes, Inc. provides a service that allows its subscribers to track the usage of words or phrases of interest that appear in news broadcasts (actually in the transcripts of such broadcasts, via closed captioning). The service is used by government agencies, law enforcement, businesses, NGOs, members of Congress, etc. It is not available to the general public. The 22,000-plus users of TVEyes search news broadcasts for a variety of reasons: “Government bodies use it to monitor the accuracy of facts reported…. Political campaigns use it to monitor political advertising and appearances of candidates…. Financial firms… public statements made by their employees for regulatory compliance…. White House to evaluate news stories and give feedback to the press corps…. United States Army uses TVEyes to track media coverage of military operations in remote locations…. Elected officials use TVEyes to confirm the accuracy of information reported… TVEyes provides substantial benefit to the public.”

A previous decision in Fox News Network, LLC v. TVEyes, Inc., 43 F.Supp.3d 379 (S.D.N.Y. 2014) concluded that “TVEyes’ copying of Fox News’ broadcast content for indexing and clipping services to its subscribers constitutes fair use.” The court found that the service “provides substantial benefit to the public” by allowing a user to “easily and efficiently text-search… 27,000 hours of programming… most of which is not available online or anywhere else, to track and discover information.” So this is yet another case in which a commercial entity that mined, stored and made retrievable an archive of content protected by copyright was found to be engaged in fair use. Libraries can and do create such tools!

The fact-specific context of any fair use case is crucial. It is important to note that, like other litigated mass digitization efforts that were determined to be fair use, these scenarios do not involve simply making large amounts of protected content available to any and all seekers; rather, snippets by Google, Inc., page numbers by the HathiTrust, and here, news clips rather than entire news broadcasts. Further, the clipping service was available to a confined audience (albeit 22,000-plus subscribers) for a specific purpose other than as the news. TVEyes is not a competing news service but is used for specific purposes such as tracking the frequency and trending of stories (news about the news), error and accuracy reporting, and compliance monitoring.

Further, and like the Google Books case, it was unlikely that a subscriber would “go through the trouble of running countless searches to reconstruct a full broadcast.” Remember, in the Google Books case, its search/retrieval algorithm and blackout patterns rendered that result impossible. Though possible here, the judge found it too improbable to affect the legal analysis.

Fair use triumphs again!

The remaining four services TVEyes, Inc. provides its customers were the subject of concern in the 2015 decision: archiving by users, with the contents stored solely on the TVEyes, Inc. server; sharing a URL link to a video clip via email; downloading and permanently storing video clips on customer devices; and a date-time search feature.

Content remains searchable and available on the TVEyes server for a period of 32 days. A subscriber has the ability, once a clip has been identified within that 32-day window, to “archive” the clip, that is, to tag it so that repeat searches are not needed to locate the clip again. The clip remains available to the subscriber indefinitely. However, the archived clips are not stored on the subscriber’s computer; they remain on TVEyes’ server, but are easily retrievable by the subscriber. The court observed that the ability to archive clips for reviewing at a later date helped promote the public discourse necessary for democracy and the free exchange of ideas. As the archive function complements the service’s main search and index function, the court concluded it was fair use.

The remaining three features did not fare so well in terms of fair use. Subscribers can also share clips by sending a URL link. “The link is public, meaning the recipient does not need to possess TVEyes login credentials in order to access the video.” While this feature can further help users engage in commentary, criticism and other fair uses listed in the fair use statute, the court observed the “substantial potential for abuse.” There is no control over who could eventually access the linked clip or to what use that clip would be put. However, the court was quick to add that once TVEyes, Inc. develops “reasonable and adequate protections” and demonstrates the effectiveness of those measures, this feature might be considered fair use.

The court concluded that allowing subscribers to download and store clips on their own devices was not fair use, as the feature “goes well beyond TVEyes’ transformative services of searching and indexing.” While the ability to “download unlimited clips to keep forever and distribute freely may be attractive,” the court felt it insufficiently related to the transformative public benefit the service provides. A final service at issue was the ability to search by date-time. Date-time searches constitute about 5.5% of all searches on the service. TVEyes argued that date-time searching is a necessary companion to its keyword searching, i.e., clips are located by searching the closed captioning associated with a particular clip. Closed captioning is riddled with errors, so date-time searching increases the chance the user will locate a relevant clip, but this is true only if the user knows the exact date-time (up to a 10-minute window) of the original broadcast airing. The court, however, viewed the feature as a “content delivery tool for users who already know what they seek” and concluded the feature was not fair use. Stay tuned: in late 2015 the judge cleared the way for the parties to appeal the district court decision to the United States Court of Appeals for the Second Circuit, the same circuit that decided the Google Books and HathiTrust cases. The fair use determinations should be upheld on appeal.

Building a library of clips might remain prohibited, but assuming the URL that subscribers share via email points to a lawfully made copy of the broadcast, sharing a URL to a non-infringing source should also be lawful. The URL itself is not protected by copyright, just as a street address is not. The date-time search feature might meet a similar fate, as subscribers arguably use the feature to verify what was reported on a particular date. The court even used the example of Ted Cruz’s staff being unable to locate a particular interview, even though staffers knew the date and broadcast, because the closed captioning mechanism had rendered his name as “ted crews.” Like the other search feature, only “segments” are locatable. If this is the case, the feature should be fair use as well. It is similar to the function of cite-checking that many researchers use Google Books to accomplish.

*Note: Guest blogger Tom Lipinski is Dean and Professor at the School of Information Studies, University of Wisconsin—Milwaukee


LibUX: “So, exactly how goth is the user experience?”

planet code4lib - Fri, 2016-02-26 04:18

The third episode in our new question-and-answer bonus series raises more questions than anything. Meg Ecclestone stumbles into a blooper reel. We hope you enjoy.

  • 0:32 – “I worry about the state of virtual reality for libraries.”
  • 0:45 – “I always tell people that in librarianship we need to dream smaller, but you have dreamed very small, Michael. Very small indeed.”
  • 2:16 – “How goth is the state of library user experience?”
  • 3:35 – Meg and Michael go to camp.

“This is so Hot Topic.” – Meg Ecclestone

  • 5:05 – We are clearly off topic.
  • 7:48 – “On Friday, April 16th, Deathrock came to town.”

As always, you can download the MP3.

You can subscribe to LibUX on Stitcher, iTunes, or plug our feed right into your podcatcher of choice. Help us out and say something nice. You can find every podcast on www.libux.co.


HangingTogether: The MARC Field That Refused to Die

planet code4lib - Thu, 2016-02-25 22:43

Here at OCLC Research we look at a lot of bibliographic data. Usually in the aggregate, after having processed the 350+ million WorldCat records using our Hadoop cluster. One such instance is my “MARC Usage in WorldCat” project, where for the last 4 years I’ve reported on the occurrences of MARC elements, and in some cases, even the contents of particular subfields.

So as I was turning the crank on the 2016 data, I happened to notice an odd anomaly. MARC field 265 was made obsolete for some formats in 1983, and for the remainder in 1993. In 2013 there were 354,628 occurrences out of about 289 million records. In 2014 the number fell to a gratifying 158,465 out of 311 million records, and then disappeared entirely in the 2015 reporting. This was all due to work by our WorldCat Quality Control team, as they completed the conversion that they announced they would do in August 2014.
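
For anyone who wants to poke at their own records the same way, here is a minimal Python sketch that counts occurrences of a field such as 265 with the pymarc library. It runs over a single hypothetical file, nothing like the 350-million-record Hadoop pipeline described above, but the idea is the same.

    # pip install pymarc -- a single-file sketch, not OCLC's Hadoop pipeline
    from pymarc import MARCReader

    TARGET = "265"
    occurrences = 0
    records_with_field = 0

    with open("records.mrc", "rb") as f:  # hypothetical file of MARC records
        for record in MARCReader(f):
            if record is None:  # skip records pymarc could not parse
                continue
            fields = record.get_fields(TARGET)
            occurrences += len(fields)
            if fields:
                records_with_field += 1

    print(f"{occurrences} occurrences of field {TARGET} "
          f"in {records_with_field} records")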

Now here is where it gets good. It’s back. We found 68 occurrences in 2016, with 3,616 holdings attached to these records. As soon as I alerted our Quality Control team, they swung into action, determined who the offending party was, notified them to please stop, and worked to clean up the existing instances. They are also working to put checks into place so these are caught on ingest.

Zombie metadata isn’t nearly as frightening as “The Walking Dead“, although for librarians it’s close. But it’s nice to know that when we find errors like this we have staff and procedures in place to take care of them.

Photo by Mike Mozart, CC-BY 2.0

About Roy Tennant

Roy Tennant works on projects related to improving the technological infrastructure of libraries, museums, and archives.


LITA: Vote for Us! DH Awards 2015

planet code4lib - Thu, 2016-02-25 22:12

We’re honored that the LITA blog has been nominated in the Digital Humanities Awards category “Best DH Blog Post or Series of Posts.” Though the DH Awards don’t point to any specific posts as the basis for the nomination, we’re guessing it’s because of posts like Grace Thomas’ post on using Omeka in digital library services, Bryan Brown’s post musing on what librarianship means, Lindsay Cronk’s exploration of text mining tools, or Nimisha Bhat’s post on scholarly engagement and Twitter. And that’s hardly scratching the surface of the awesome content we strive to produce for LITA blog readers!

We would love to have your vote! But hurry, since voting closes on Saturday, February 26. Vote for LITA blog on the DH Awards form.

Thanks, as always, for reading the LITA blog! As a reminder, if you’re looking for a place to share writing on a library technology topic (including digital humanities!), just let us know.

Open Library: 25,000 emails in three years

planet code4lib - Thu, 2016-02-25 22:11

A slightly more personal note here… it’s been a little over three years since I started working at Open Library and just this past week we hit a milestone of 25,000 emails sent. That’s slightly lower than the number of emails we get because some are just saying “Thank you!” and some we forward to other departments and yes, a few are spam. But the rest–the tech support, the early book returns, the reference questions, the merge requests–have been answered by me and Michelle and Laurel.

It’s been very gratifying to help keep Open Library’s ebook lending library open and thriving, and very interesting to watch the ebook environment change around us since we first opened in a much more limited fashion in 2005. Here’s to ten more years of free ebook lending and a continually improving ebook reader experience!

LITA: Google Cardboard

planet code4lib - Thu, 2016-02-25 21:33

Google Cardboard is getting a lot of press these days. It’s infiltrated fashion shows and classrooms and it’s coming for your Coke can. More importantly, it’s the next big thing for libraries. If you’re new to Cardboard, it’s essentially housing made of cardboard that turns your phone into a virtual reality (VR) viewer. The idea is simple, but the experience is nothing short of magical. I’ve been experimenting with my viewer for almost a year and the novelty still hasn’t worn off. Similar products include Oculus Rift and Samsung’s Gear VR, but they come with a hefty price tag. A Cardboard viewer, on the other hand, will run you about $10 or less; Google even provides the blueprints if you want to create your own from scratch. The low cost, minimal learning curve, and interactivity of Cardboard make it the perfect tool to engage your library patrons.

Here are five ways to start using Cardboard in your library:

1. Get crafty.
Before the VR experience begins, you’ve got a real DIY opportunity on your hands. Bust out the hot glue gun and invite your patrons to decorate your viewers, or better yet, liven up your next staff meeting with a craft session.

2. Create a virtual tour of your library system.
At Indiana University we have over 19 branch libraries and I’ve yet to hoof it to each one. We’re currently creating a tour of these libraries using Google’s photo spheres. We hope that the novelty of a VR tour will entice students to participate and the experience will expand their knowledge of the Libraries’ massive collections and resources.

3. Organize a field trip.
Google is now piloting their Expeditions Pioneer Program that creates virtual field trips for schools. The program is invitation only, but it’s easy to create your own program using Google Street View. Why not enhance your school visits with a trip around the world? How about an armchair travel program for adults?

4. Expand your 3D printing services.
You might not have the resources to reprint every design that comes through your library, but why not preserve it on the web? With Sketchfab you can view 3D objects directly in your web browser. Ask your patrons if you can upload their designs to Sketchfab and create a collection for your library. Place a Cardboard viewer next to your 3D printer to showcase their designs.

5. Host a VR Game Night.
I’ll admit that the number of decent Cardboard apps is limited, especially when it comes to games. That said, I’ve had good luck with Lamper VR Firefly Rescue, Titans of Space, and DinoTrek. These apps are all free from the Play Store and easy to play, but be warned, if you get motion sickness these apps will probably do a number on you.
