Latest crime data shows that the UK is getting significantly more ‘peaceful’. Last month, the Institute for Economics and Peace published the UK Peace Index, revealing UK crime figures have fallen the most of all EU countries in the past decade. Homicide rates, to take one indicator, have halved over the last decade.
But the British public still feels that crime levels are rising. How can opening up crime data play a part in convincing us we are less likely to experience crime than ever before?

The ‘Perception Gap’
The discrepancy between crime data and perceptions of the likelihood of crime is particularly marked in the UK. Although it has been found that a majority of the public broadly trust official statistics, the figures are markedly lower for those relating to crime. In one study, 85% of people agreed that the Census accurately reflects changes in the UK, but only 63% said the same of crime statistics.

Credibility of Police Data
Police forces have been publishing crime statistics in the UK since 2008, using their own web-based crime mapping tools or via the national crime mapping facility (http://maps.police.uk/ and http://www.police.uk). The stated purpose has been to improve engagement with local communities, alongside other policy objectives such as promoting transparency. But allegations of ‘figure fiddling’ on the part of the police have undermined the data’s credibility, and in 2014 the UK Statistics Authority withdrew its gold-standard status from police figures, pointing to ‘accumulating evidence’ of unreliability.
The UK’s open data site for crime figures allows users to download street-level crime and outcome data in CSV format and explore the API containing detailed crime data and information about individual police forces and neighbourhood teams. It also provides Custom CSV download and JSON API helper interfaces so you can more easily access subsets of the data.
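As a sketch of what working with that API looks like, here is a minimal Python example. The `crimes-street/all-crime` endpoint and the `category` field match the site’s documented JSON API at the time of writing, but treat the exact paths and field names as assumptions to verify against the current documentation; the helper names are my own.

```python
import json
import urllib.request
from collections import Counter

API_BASE = "https://data.police.uk/api"  # the site's JSON API root

def street_crimes_url(lat, lng, date):
    """Build a street-level crimes query for a point and a month (YYYY-MM)."""
    return f"{API_BASE}/crimes-street/all-crime?lat={lat}&lng={lng}&date={date}"

def crimes_by_category(records):
    """Tally a list of crime records by their 'category' field."""
    return Counter(r["category"] for r in records)

def fetch_street_crimes(lat, lng, date):
    """Fetch and parse one month of street-level crimes (network required)."""
    with urllib.request.urlopen(street_crimes_url(lat, lng, date)) as resp:
        return json.load(resp)
```

A tally like `crimes_by_category(fetch_street_crimes(52.629, -1.131, "2014-01"))` is the kind of subset the Custom CSV and JSON helper interfaces are meant to make easy.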
But the credibility of the data has been called into question. Just recently, data relating to stop-and-search incidents involving children aged under 12 was shown to be inaccurate. The site itself details many issues which call the accuracy of the data into question: inconsistent geocoding policies across police forces; “six police forces we suspect may be double-reporting certain types of incidents”; ‘siloed systems’ within police records; and differing IT systems from force to force.
In summary, we cannot be sure the ‘data provided is fully accurate or consistent.’

The Role the Media Plays: If it Bleeds, it Leads
In response to persistent and widespread public disbelief, the policies of successive UK governments on crime have toughened: much tougher sentencing, more people in prison, more police on the streets. When the British public were asked why they think there is more crime now than in the past, more than half (57%) said it was because of what they see on television, and almost half (48%) said it was because of what they read in newspapers [Ipsos MORI poll on Closing the Gaps]. One tabloid newspaper exclaimed just recently: “Rape still at record levels and violent crime rises” and “Crime shows biggest rise for a decade”. As the adage goes, If it Bleeds, it Leads.

Crime Data and Mistrust of the Police
Those engaged in making crime figures meaningful to the public face unique challenges. When Stephen Lawrence was murdered in 1993, and the subsequent public inquiry found institutional racism at the heart of the Met police, public trust in the police was shattered. Since then, the police have claimed to have rid their ranks of racism entirely.
But many remain less than convinced. According to official statistics, in 1999-2000, a black person was five times more likely than a white person to be stopped by police. A decade later, they were seven times more likely. One criminologist commented: “Claims that the Lawrence inquiry’s finding of institutional racism no longer apply have a hollow ring when we look at the evidence on police stops.” [Michael Shiner reported in the Guardian].
Equally, the police distrust the public too. The murder of two young, female police officers in Manchester in 2012 ignited the long-rumbling debate over whether the police should be armed. So the divide between the police and the public is a serious one.

A Different Tack?
In 2011, a review was undertaken by the UK Statistics Authority into Crime Data. Its recommendations included:
- Improving the presentation of crime statistics to make them more authoritative
- Reviewing the availability of local crime and criminal justice data on government websites to identify opportunities for consolidation
- Sharing of best practice and improvements in metadata and providing reassurance on the quality of police crime records.
It’s clear that the UK police recognise the importance of improving their publication of data. But it seems that opening data alone won’t fix the shattered trust between the public and the police, even if the proof that Britons are safer than ever before is there in transparent, easily navigable data. We need to go further back in the chain of provenance and scrutinise, for instance, the reporting methods of the police.
But this is about forgiveness too, and the British public might just not be ready for that yet.
Changes in this update (6.1.21):

- Bug Fix: Conditional Delete: when selecting regular expressions, there were times when the process wasn’t being recognized.
- Enhancement: Conditional Delete: this function used to work only with the Regular Expression option; it now works for all options.
- Bug Fix: ValidateISBNs: the process would only handle the first subfield, so if the subfield to be processed wasn’t the first one, it wouldn’t be validated.
- Enhancement: ValidateISSN: uses the mathematical check-digit formula to validate ISSNs.
- Bug Fix: Generate Fast Headings (stand-alone tool): LDR fields could be deleted.
- Enhancement: working to make the global edit functions a little more fault tolerant around record formatting.
- Enhancement: Generate MARC record from URL: the program generates MARC records from webpages; if you pass it an LC URL, it will generate data from the MARCXML.

At this point, only the Windows and Linux downloads were updated. I’ll be replacing the Mac download with the first version of the native OSX build over the July 4th weekend. You can get the updates either via the automated update tool or from the website at: http://marcedit.reeset.net/downloads

--tr
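The ISSN check-digit formula the ValidateISSN enhancement refers to is public (ISO 3297): the first seven digits are weighted 8 down to 2, and the eighth character is a mod-11 check digit, with ‘X’ standing for 10. A sketch of that validation in Python follows; this illustrates the formula, not MarcEdit’s actual implementation.

```python
def issn_is_valid(issn: str) -> bool:
    """Validate an ISSN such as '0378-5955' using its mod-11 check digit."""
    chars = issn.replace("-", "").upper()
    if len(chars) != 8:
        return False
    body, check = chars[:7], chars[7]
    if not body.isdigit() or (check != "X" and not check.isdigit()):
        return False
    # Weight the first seven digits 8, 7, ..., 2 and derive the check digit.
    total = sum(int(d) * w for d, w in zip(body, range(8, 1, -1)))
    expected = (11 - total % 11) % 11
    return check == ("X" if expected == 10 else str(expected))
```

For example, `issn_is_valid("0378-5955")` is true, while transposing or altering any digit makes the check fail.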
Given a citations.xml file, this suite of software — the JSTOR Workset Browser — will cache and index content identified through JSTOR’s Data For Research service. The resulting (and fledgling) reports created by the suite enable the reader to “read distantly” against a collection of journal articles.
The suite requires a hodgepodge of software: Perl, Python, and the Bash shell. Your mileage may vary. Sample usage: cat etc/citations-thoreau.xml | bin/make-everything.sh thoreau
“Release early. Release often”.
Zotero 4.0.27, now available, brings some major new features, as well as many other improvements and bug fixes.
In Zotero for Firefox, it’s now easier than ever to save items from webpages.
Zotero senses information on webpages through bits of code called site translators, which work with most library catalogs, popular websites such as Amazon and the New York Times, and many gated databases.
In the past, there have been two different ways of saving web sources to Zotero:
- If Zotero detected a reference on a webpage, you could click an icon in the address bar — for example, a book icon on Amazon or a journal article icon on a publisher’s site — to save high-quality metadata for the reference to your Zotero library.
- If a site wasn’t supported or a site translator wasn’t working, you could still save any webpage to your Zotero library by clicking the “Create Web Page Item from Current Page” button in the Zotero for Firefox toolbar or by right-clicking on the page background and choosing “Save Page to Zotero”. In such cases, you might need to fill in some details that Zotero couldn’t automatically detect.
In Zotero 4.0.27, we’ve combined the address bar icon and the “Create Web Page Item from Current Page” button into a single save button in the Firefox toolbar, next to the existing Z button for opening the Zotero pane.
The new save button on a New York Times article
(Don’t be confused by the book icon in the address bar in the top left — that’s a new Firefox feature, unrelated to Zotero.)
You can click the new save button on any webpage to create an item in your Zotero library, and Zotero will automatically use the best available method for saving data. If a translator is available, you’ll get high-quality metadata; if not, you’ll get basic info such as title, access date, and URL, and you can edit the saved item to add additional information from the webpage. The icon will still update to show you what Zotero found on the page, and, as before, you can hover over it to see which translator, if any, will be used.
This also means that a single shortcut key — Cmd+Shift+S (Mac) or Ctrl+Shift+S (Windows/Linux) by default — can be used to save from any webpage.
The new save button also features a drop-down menu for accessing additional functionality, such as choosing a non-default translator or looking up a reference in your local (physical) library without even saving it to Zotero.
Additional save options
(This functionality was previously available by right-clicking on the address bar icon, though if you knew that, you surely qualify for some sort of prize.) The new menu will be used for more functionality in the future, so stay tuned.
Prefer another layout? In addition to the new combined toolbar buttons, Zotero provides separate buttons for opening Zotero and saving sources that can be added using Firefox’s Customize mode.
Custom button layout
With the separate buttons, you can hide one or the other button and rely on a keyboard shortcut, move the buttons into the larger Firefox menu panel, or even move the new save button between the address bar and search bar, close to its previous position. (Since the new save button works on every page, it no longer makes sense for it to be within the address bar itself, but by using the separate buttons you can essentially recreate the previous layout.)
While all the above changes apply only to Zotero for Firefox for now, similar changes will come to the Chrome and Safari connectors for Zotero Standalone users in a future version. For now, Zotero Standalone users can continue to use the address bar (Chrome) or toolbar (Safari) icon to save recognized webpages and right-click (control-click on Macs) on the page background and choose “Save Page to Zotero” to save basic info for any other page.
Making Zotero accessible to users around the world has always been a priority. Thanks to a global community of volunteers in the Zotero and Citation Style Language (CSL) projects, you can use the Zotero interface and also generate citations in dozens of different languages.
Previously, Zotero would automatically use the language of the Zotero user interface — generally the language of either Firefox or the operating system — when generating citations. While you’ve always been able to generate citations using a different language, doing so required changing a hidden preference.
You can now set the bibliography language at the same time you choose a citation style, whether you’re using Quick Copy, Create Bibliography from Selected Items, or the word processor plugins.
Choosing a bibliography language for Quick Copy
In the above example, even though the user interface is in English, the default Quick Copy language is being set to French. If an item is then dragged from Zotero into a text field, the resulting citation will be in French, using French terms instead of English ones (e.g., “édité par” instead of “edited by”).
The new language selector is even more powerful when using the word processor plugins. The bibliography language chosen for a document is stored in the document preferences, allowing you to use different languages in different documents — say, U.S. English for a document you’re submitting to an American journal and Japanese for a paper for a conference in Japan.
Note that, of the thousands of CSL styles that Zotero supports, not all can be localized. If a journal or style guide calls for a specific language, the language drop-down will be disabled and citations will always be generated using the required language. For example, selecting the Nature style will cause Zotero to use the “English (UK)” locale in all cases, as is required by Nature’s style guide.

Other changes
Zotero now offers an “Export Library…” option for group libraries, allowing the full collection hierarchy to be easily exported. If you find yourself facing many sync conflicts, you can now choose to resolve all conflicts with changes from one side or the other. For Zotero Standalone users, we’ve improved support for saving attachments from Chrome and Safari on many sites, bringing site compatibility closer to that of Zotero for Firefox. And we’ve resolved various issues that were preventing complete syncs for some people.
There’s too much else to discuss here, but see the changelog for the full list of changes.

Get it now
If you’re already using Zotero, your copy of Zotero should update to the new version automatically, or you can update manually from the Firefox Add-ons pane or by selecting the “Check for Updates” menu option in Zotero Standalone. If you’re not yet using Zotero, try it out today.
Last week’s landmark ruling from the Supreme Court on same sex marriage was routinely published on the Web as a PDF. Given the past history of URL use in Supreme Court opinions, I thought I would take a quick look to see what URLs were present. There are two, both in Justice Alito’s dissenting opinion, and one is broken … just four days after the PDF was published. You can see it yourself at the bottom of page 100 in the PDF.
If you point your browser at
you will get a page not found error:
Sadly even the Internet Archive doesn’t have a snapshot of the page available.
But notice it thinks it can get a copy of it still. That’s because the Centers for Disease Control’s website is responding with a 200 OK instead of a 404 Not Found:

    zen:~ ed$ curl -I http://www.cdc.gov/nchs/data/databrief/db18.pdf
    HTTP/1.1 200 OK
    Content-Type: text/html
    X-Powered-By: ASP.NET
    X-UA-Compatible: IE=edge,chrome=1
    Date: Tue, 30 Jun 2015 16:22:18 GMT
    Connection: keep-alive
At any rate, it’s not Internet Archive’s fault that they haven’t archived the Webpage originally published in 2009, because the URL is actually a typo. Instead it should be
which leads to:
So between the broken URL and the 200 OK for something not found, we’ve got issues of link rot and reference rot all rolled up into a one-character typo. Sigh.
I think a couple lessons for web publishers can be distilled from this little story:
- when publishing on the Web include link checking as part of your editorial process
- if you are going to publish links on the Web use a format that’s easy to check … like HTML.
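The link-checking step in the first lesson can be partly automated, and a good checker should also catch the soft-404 pattern above: a 200 OK whose Content-Type disagrees with what the URL promises (HTML where a PDF should be). A minimal sketch follows; the function names and the `_head` test hook are my own, not an established tool.

```python
import urllib.error
import urllib.request

def check_link(url, expect_type=None, _head=None):
    """Classify a URL as 'ok', 'broken', or 'suspect'.

    'suspect' means the server said 200 OK but the Content-Type does not
    contain `expect_type` -- the soft-404 pattern. `_head` lets callers
    (and tests) inject a (status, content_type) pair instead of making
    a real HEAD request over the network.
    """
    if _head is None:
        req = urllib.request.Request(url, method="HEAD")
        try:
            with urllib.request.urlopen(req) as resp:
                status = resp.status
                ctype = resp.headers.get("Content-Type", "")
        except urllib.error.HTTPError as e:
            status, ctype = e.code, ""
    else:
        status, ctype = _head
    if status != 200:
        return "broken"
    if expect_type and expect_type not in ctype:
        return "suspect"
    return "ok"
```

Run against the CDC URL above with `expect_type="application/pdf"`, this would have flagged the response as suspect rather than reporting it alive.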
The American Library Association is pleased to announce the appointment of Jenny Levine as the Executive Director of the Library and Information Technology Association, a division of the ALA, effective August 3, 2015. Ms. Levine has been at the American Library Association since 2006 as the Strategy Guide in ALA’s Information Technology and Telecommunications Services area, charged with providing vision and leadership regarding emerging technologies, development of services, and integration of those services into association and library environments. In that role she coordinated development of ALA’s collaborative workspace, ALA Connect, and provided ongoing support and documentation. She convened the staff Social Media Working Group and coordinated a team-based approach for strategic posting to ALA’s social media channels. In addition, she has been the staff liaison to ALA’s Games and Gaming Round Table (GameRT) and coordinated a range of activities, including the 2007 & 2008 Gaming, Learning, and Libraries Symposia and International Games Day @ your library. She developed the concept for and manages the Networking Uncommons gathering space at ALA conferences.
Prior to joining the ALA staff, Jenny Levine held positions as Internet Development Specialist and Strategy Guide at the Metropolitan Library System in Burr Ridge (IL), Technology Coordinator at the Grande Prairie Public Library District in Hazel Crest (IL), and Reference Librarian at the Calumet City Public Library in Calumet City (IL). She received the 2004 Illinois Library Association Technical Services Award and a 1999 Illinois Secretary of State Award of Recognition.
Jenny has an M.L.S. from the University of Illinois, Urbana-Champaign, and a B.S. in Journalism/Broadcast News from the University of Kansas, Lawrence. Within ALA, she is a member of LITA, GameRT, the Intellectual Freedom Round Table (IFRT), and the Gay, Lesbian, Bisexual, and Transgender Round Table (GLBTRT). She is also active outside ALA and belongs to the American Civil Liberties Union (ACLU), the Electronic Frontier Foundation (EFF), the ALA-tied Freedom to Read Foundation (FTRF), the Human Rights Campaign (HRC) and the Illinois Library Association (ILA).
Jenny Levine has been an active presenter and writer, including three issues of Library Technology Reports on Gaming & Libraries. Among the early explorers of Library 2.0 technologies, from the Librarians’ Site du Jour (the first librarian blog) to the ongoing The Shifted Librarian, she is active in a wide variety of social media.
Ms. Levine becomes executive director of LITA on the retirement of Mary Taylor, LITA executive director since 2001. Thanks go to the search committee for a thoughtful and successful process: Rachel Vacek, Thomas Dowling, Andromeda Yelton, Isabel Gonzalez-Smith, Keri Cascio, Dan Hoppe and Mary Ghikas.
- A Flaw In The Design, discussing the early history of the Internet and how the difficulty of getting it to work at all and the lack of perceived threats meant inadequate security.
- The Long Life Of A Quick 'Fix', discussing the history of BGP and the consistent failure of attempts to make it less insecure, because those who would need to take action have no incentive to do so.
- A Disaster Foretold - And Ignored, discussing L0pht and how they warned a Senate panel 17 years ago of the dangers of Internet connectivity but were ignored.
More below the fold.
The compromises at OPM and at Sony Pictures have revealed some truly pathetic security practices at both organizations, which certainly made the bad guy's job very easy. Better security practices would undoubtedly have made their job harder. But it is important to understand that in a world where Kaspersky and Cisco cannot keep their systems secure, better security practices would not have made the bad guy's job impossible.
OPM and Sony deserve criticism for their lax security. But blaming the victim is not a constructive way of dealing with the situation in which organizations and individuals find themselves.
Prof. Jean Yang of CMU has a piece in MIT Technology Review entitled The Real Software Security Problem Is Us that, at first glance, appears to make a lot of sense but actually doesn't. Prof. Yang specializes in programming languages and is a "cofounder of Cybersecurity Factory, an accelerator focused on software security". She writes:
we could, in the not-so-distant future, actually live in a world where software doesn’t randomly and catastrophically fail. Our software systems could withstand attacks. Our private social media and health data could be seen only by those with permission to see it. All we need are the right fixes.
A better way would be to use languages that provide the guarantees we need. The Heartbleed vulnerability happened because someone forgot to check that a chunk of memory ended where it was supposed to. This could only happen in a programming language where the programmer is responsible for managing memory. So why not use languages that manage memory automatically? Why not make the programming languages do the heavy lifting?
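The bounds-check failure behind Heartbleed is easy to illustrate in a memory-managed language, where trusting an attacker-supplied length cannot silently read adjacent memory. This toy sketch (my own names, not a model of OpenSSL) shows the heartbeat-style pattern of a payload with a claimed length:

```python
def echo_payload(buffer: bytes, claimed_length: int) -> bytes:
    """Echo back `claimed_length` bytes of a heartbeat-style payload.

    In C, trusting `claimed_length` lets a read run past the buffer into
    adjacent memory (the Heartbleed failure). Here an out-of-range length
    can simply be rejected; even without the check, a Python slice cannot
    read beyond the object's own bytes.
    """
    if claimed_length > len(buffer):
        raise ValueError("claimed length exceeds actual payload")
    return buffer[:claimed_length]
```

The point is not that such code is unwritable in C, but that the language makes the unsafe version impossible to write by accident.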
Change won’t happen until we demand that it happens. Our software could be as well-constructed and reliable as our buildings. To make that happen, we all need to value technical soundness over novelty. It’s up to us to make online life as safe as it is enjoyable.

It isn’t clear who Prof. Yang’s "we" is, end users or programmers. Suppose it is end users. Placing the onus on end users to demand more secure software built with better tools is futile. There is no way for an end user to know what tools were used to build a software product, no way to compare how secure two software products are, no credible third-party rating agency to appeal to for information. So there is no way for the market to reward good software engineering and punish bad software engineering.
Placing the onus on programmers is only marginally less futile. No-one writes a software product from scratch from the bare metal up. The choice of tools and libraries to use is often forced, and the resulting system will have many vulnerabilities that the programmer has no control over. Even if the choice is free, it is an illusion to believe that better languages are a panacea for vulnerabilities. Java was designed to eliminate many common bugs, and it manages memory. It was effective in reducing bugs, but it could never create a "world where software doesn’t randomly and catastrophically fail".
Notice that the OPM compromise used valid credentials presumably from social engineering, so it would have to be blamed on system administrators not programmers, or rather on management's failure to mandate two-factor authentication. But equally, even good system administration couldn't make up for Cisco's decision to install default SSH keys for "support reasons".
For a more realistic view, read A View From The Front Lines, the 2015 report from Mandiant, a company whose job is to clean up after compromises such as the 2013 one at Stanford. Or Dan Kaminsky's interview with Die Zeit Online in the wake of the compromise at the Bundestag:
No one should be surprised if a cyber attack succeeds somewhere. Everything can be hacked. ... All great technological developments have been unsafe in the beginning, just think of the rail, automobiles and aircrafts. The most important thing in the beginning is that they work, after that they get safer. We have been working on the security of the Internet and the computer systems for the last 15 years.

Yes, automobiles and aircraft are safer, but they are not safe. Cars kill 1.3M and injure 20-50M people/year, being the 9th leading cause of death. And that is before they become part of the Internet of Things and their software starts being exploited. Clearly, some car crash victims are at fault and others aren't. Dan is optimistic about Prof. Yang's approach:
It is a new technology, it is still under development. In the end it will not only be possible to write a secure software, but also to have it happen in a natural way without any special effort, and it shall be cheap.

I agree that the Langsec approach and capability-based systems such as Capsicum can make systems safer. But making secure software possible is a long way from making secure software ubiquitous. Until it is at least possible for organizations to deploy a software and hardware stack that is secure from the BIOS to the user interface, and until there is liability on the organization for not doing so, blaming them for being insecure is beside the point.
The sub-head of Mandiant's report is:
For years, we have argued that there is no such thing as perfect security. The events of 2014 should put any lingering doubts to rest.

It is worth reading the whole thing, but especially their Trend 4, Blurred Lines, which starts on page 20. It describes how the techniques used by criminal and government-sponsored bad guys are becoming indistinguishable, making it difficult not merely to defend against the inevitable compromise, but to determine what the intent of the compromise was.
The technology for making systems secure does not exist. Even if it did it would not be feasible for organizations to deploy only secure systems. Given that the system vendors bear no liability for the security of even systems intended to create security, this situation is unlikely to change in the foreseeable future.
The Access 2015 Organizing Committee is thrilled to announce that our speaker for the Dave Binkley Memorial Lecture is Molly Sauter!
Molly is a Vanier Scholar and PhD student in Communication Studies at McGill University in Montreal, Canada. She holds a masters degree in Comparative Media Studies from MIT, and is an affiliate researcher at the MIT Center for Civic Media at the MIT Media Lab and at the Berkman Center for Internet and Society at Harvard University. Molly has published widely on internet activism, hacker culture, and depictions of technology in the media. Her recent book, The Coming Swarm, examines the use of Distributed Denial of Service (DDoS) actions as a form of political activism.
2015 LITA Forum
November 12-15, 2015
Plan now to join us in Minneapolis, Minnesota, at the Hyatt Regency Minneapolis for the 2015 LITA Forum, a three-day educational event that includes two preconferences, three keynote sessions, more than 55 concurrent sessions, and more than 15 poster presentations.
The 2015 LITA Forum is the 18th annual gathering of technology-minded information professionals and is a highly regarded annual event for those involved in new and leading-edge technologies in the library and information technology field. Registration is limited in order to preserve the important networking advantages of a smaller conference. Attendees take advantage of the informal Friday evening reception, networking dinners and other social opportunities to get to know colleagues and speakers. Comments from past attendees:
- “Best conference I’ve been to in terms of practical, usable ideas that I can implement at my library.”
- “I get so inspired by the presentations and conversations with colleagues who are dealing with the same sorts of issues that I am.”
- “After LITA I return to my institution excited to implement solutions I find here.”
- “This is always the most informative conference! It inspires me to develop new programs and plan initiatives.”
Mx A. Matienzo
Director of Technology for the Digital Public Library of America, he focuses on promoting and establishing digital library interoperability at an international scale. Prior to joining DPLA, Matienzo worked as an archivist and technologist specializing in born-digital materials and metadata management, at institutions including the Yale University Library, The New York Public Library, and the American Institute of Physics.
Carson Block Consulting Inc. has led, managed, and supported library technology efforts for more than 20 years. He has been called “a geek who speaks English” and enjoys acting as a bridge between the worlds of librarians and hard-core technologists.
President of Digital Governance Solutions at ActiveStandards. In a 20-year career, Lisa Welchman has paved the way in the discipline of digital governance, helping organizations stabilize their complex, multi-stakeholder digital operations. Her book Managing Chaos: Digital Governance by Design was published in February of 2015 by Rosenfeld Media.
So You Want to Make a Makerspace: Strategic Leadership to support the Integration of new and disruptive technologies into Libraries: Practical Tips, Tricks, Strategies, and Solutions for bringing making, fabrication and content creation to your library.
Leah Kraus is the Director of Community Engagement and Experience at the Fayetteville Free Library.
Michael Cimino is the Technology Innovation and Integration Specialist at the Fayetteville Free Library.
Beyond Web Page Analytics: Using Google tools to assess searcher behavior across web properties
Robert L. Nunez, Head of Collection Services, Kenosha Public Library, Kenosha, WI
Keven Riggle, Systems Librarian & Webmaster, Marquette University Libraries
for registration and additional information.
Join us in Minneapolis!
This and that for the end of June.
Themed color palettes from movies, cities, nature and more
Ever wonder about the physics behind guitar solos? Well here’s your answer…
An engaging feed of videos and music. A reminder of how many parallel experiences exist in the world.
The future of furniture, or just another folding table?
A design history of the disposable ‘Jazz’ cup
I recently participated in a training session about empathy, led by our wonderful Staff Development Specialist here at the Martin County Library System. The goal of this session was to define empathy and discuss how to show empathy for our patrons and co-workers. It got me thinking about empathy in regards to teaching technology. I frequently work with library patrons who are frustrated with technology. Many of these patrons are older adults who feel handicapped because they were not raised in the digital age.
I, on the other hand, was born in the digital age. I learned how to use a computer in elementary school and technology has been present in my life ever since. It’s easy to forget this advantage and lose patience when you are teaching someone with a different background. In teaching classes and offering one-on-one technology help, I’ve picked up a few tips about how to empathize with your students.
If you find your patience wearing thin, think of a time when you struggled to learn something. For me, it’s learning to drive stick. I’ve tried several times and each attempt was more frustrating than the last. When I think about how nerve-wracking it is to be behind the wheel with my hand on the stick shift, I remember how scary it can be to learn something new. I often help patrons who have purchased a new device (iPad, smartphone, etc.) and they are terrified to do the wrong thing. Returning to my adventures with manual transmissions helps me understand where they’re coming from.
I was teaching a class a few weeks back and one patron was really struggling to keep up with the group. I started to get irritated by her constant questions, until halfway through when I realized that she looked exactly like my aunt. This immediately snapped me back to reality. If my aunt walked into a library I would want her to receive the best customer service possible and be treated with the utmost respect. My patience was instantly renewed, and I’ve used this trick successfully several times since by comparing patrons to my grandparents, parents, etc. Empathy is often defined as putting yourself in the other person’s shoes, but putting a loved one in the other person’s shoes can also do the trick.
I often hear the same complaints from patrons who are frustrated, confused, or overwhelmed by technology. I’ll admit it can be trying to listen to the same thing again and again, but I also recognize that listening to these grievances is very important. Sometimes it’s best to get those frustrations out right off the bat in order to set them aside and focus on learning. Listening is one of our best tools, and acknowledging that someone’s problem is valid can also be extremely helpful.
Do you have any tips for tech empathy?
Library of Congress: The Signal: We Did All That? NDSA Standards and Practices Working Group Project Recaps
The end of the school year often finds me thinking about time gone by. What did I work on and what can I show for it? The NDSA Standards and Practices Working Group members were in the same frame of mind, so we recently did a survey of our projects and accomplishments since the NDSA launched in 2010. It’s an impressive list (if we do say so ourselves), especially once you realize that these topics come from the interests of our diverse membership. As co-chair of the working group, I’d like to share all of the S&P-related blog posts to bring readers up to date with many of our topical and timely initiatives.
Video has been a hot topic in S&P recently. Several round-robin discussions led to a “Video Deep Dive” action team which developed and conducted the Stumbling Blocks to Preserving Video Survey to identify and rank issues that may hinder digital video preservation. The preliminary results led us to dig a little deeper into how we processed and analyzed the data, so look for an update on this soon.
Preserving Digital and Software-Based Artworks
S&P hosted a two-part discussion with experts from four collecting institutions (San Francisco Museum of Modern Art, Museum of Modern Art, The Rose Goldsen Archive of New Media Art, and Smithsonian Institution Time Based Media Art project) to share their experiences in both preserving and providing access to digital art works and other new media. These complex digital materials are increasingly part of collections outside of traditional museum environments, and cultural heritage institutions, including libraries and archives, will see more and more of this type of content in their collections.
S&P members contributed to a report that takes a measured look at the costs and benefits of the widespread use of the PDF/A-3 format, especially as it affects content arriving in collecting institutions. The report provides background on the technical development of the specification, identifies specific scenarios under which the format might be used and suggests policy prescriptions for collecting institutions to consider.
Staffing for Effective Digital Preservation Survey
S&P conducted a survey of 85 institutions with a mandate to preserve digital content about how they staffed and organized their preservation functions. In addition to an award-winning poster (PDF) at iPRES2012, S&P members produced a detailed report and deposited the raw data in ICPSR.
Along with our colleagues in the NDSA Infrastructure Working Group, S&P members helped author the NDSA publication, “Checking Your Digital Content: What is Fixity and When Should I Be Checking It?” (PDF). This resource provides stewards of digital objects with information about implementing fixity concepts and methods in a way that makes sense for their organization based on their needs and resources. Topics covered include definitions of fixity and fixity information, general approaches to fixity check frequency and comparison of common fixity information-generating instruments.
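The fixity workflow the report describes can be illustrated with a short sketch (a minimal illustration of the general technique, not code from the NDSA publication itself): generate fixity information, such as a SHA-256 checksum, when content is ingested, store it, and compare it against a freshly computed value at whatever check frequency makes sense for your organization.

```python
import hashlib

def compute_fixity(path, algorithm="sha256", chunk_size=65536):
    """Compute a checksum for a file, reading in chunks to bound memory use."""
    digest = hashlib.new(algorithm)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def check_fixity(path, stored_checksum, algorithm="sha256"):
    """Return True if the file's current checksum matches the stored value."""
    return compute_fixity(path, algorithm) == stored_checksum
```

The stored checksum would typically live in a manifest or database alongside the object; a mismatch on a scheduled check signals that the file has changed or been corrupted since ingest.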
2015 National Agenda
S&P members also contributed significant input and informed actionable recommendations to the Organization Policies and Practice chapter of the NDSA 2015 National Agenda for Digital Stewardship.
Issues with archiving email proved to be another rallying point for S&P members who participated in initiating an informal Email Interest Group to discuss issues, projects and workflows to preserve email.
Compiling this review list for S&P is a proud reminder of how much we’ve done through our active and engaged membership. And I should mention that this post doesn’t even cover all our projects – just the ones with blog posts! Even with all we’ve done so far, S&P still has many issues and practices to explore.
DuraSpace News: DSpace in Vietnam with Registered Service Provider D & L Technology Integration and Consulting
Winchester, MA: Efforts are increasing at institutions around the world to provide open access to global culture and scholarship including theses, dissertations, journals, digitized materials, special collections, maps, videos, audio recordings and other types of data. D & L Technology Integration and Consulting, a new DuraSpace Registered Service Provider located in Hanoi, Vietnam, is part of that worldwide effort.
DPLA: American Association of School Librarians Names DPLA a 2015 Best App for Teaching & Learning
The Digital Public Library of America is extraordinarily grateful to be recognized as one of 2015’s Best Apps for Teaching & Learning by the American Association of School Librarians (AASL). DPLA was chosen for embodying AASL’s learning standards and supporting the school librarian’s role in implementing career and college readiness standards; this is its second “Best of” award from the prestigious education-oriented division of the American Library Association. DPLA was recognized as a Best Website for Teaching & Learning in 2013.
“This recognition from AASL means so much to us, since school librarians have been such great advocates for DPLA, especially as we strive to make our materials useful to students,” said Dan Cohen, DPLA’s Executive Director. “This second award from AASL highlights that DPLA is available in multiple formats, including apps, a website, and other websites that incorporate our extraordinary content from collections across the United States.”
The Best Apps for Teaching & Learning recognition honors apps of exceptional value to inquiry-based teaching and learning as embodied in the AASL’s Standards for the 21st-Century Learner. The recognized apps foster the qualities of innovation, creativity, active participation, and collaboration and are user-friendly to encourage a community of learners to explore and discover. The apps were announced during the 2015 ALA Annual Conference in San Francisco.
The AASL, a division of the American Library Association, promotes the improvement and extension of library services in elementary and secondary schools as a means of strengthening the total education program. AASL’s mission is to empower leaders to transform teaching and learning.
To find out more about DPLA’s efforts around education, read the DPLA’s three-year strategic plan, published in January 2015, and its Whiting Foundation-funded research paper on using large digital collections in education, published in April 2015.
FOSS4Lib Recent Releases: Koha - Security and maintenance releases - v 3.20.1, 3.18.8, 3.16.12, 3.14.16
Last updated June 27, 2015. Created by David Nind on June 27, 2015.
Security and maintenance releases for Koha.
As these are security releases it is strongly recommended that you upgrade as soon as possible.
Special thanks also goes to Raschin Tavakoli and Dimitris Simos from the Combinatorial Security Testing Team of SBA Research for finding and reporting the security bugs.
See the release announcements for the details.
Libraries are in a revolution fueled by rapid advances in technology, and thus the roles, capabilities, and expectations of libraries are changing rapidly. National public policy for libraries must reflect these changes. Today the American Library Association (ALA) released a National Policy Agenda (pdf) for Libraries to guide a proactive policy shift.
“Too often, decision makers do not yet understand the extent to which libraries can be catalysts for opportunity and progress,” said ALA President Courtney Young in a press release. “As a result, investments in libraries and librarians lag our potential to contribute to the missions of the federal government and other national institutions. We must take concerted action to advance shared policy goals.”
The agenda was developed in concert with major library organizations that serve on a Library Advisory Committee for the Policy Revolution! initiative and with input from a public comment period. Funding for this project is provided by the Bill & Melinda Gates Foundation as part of a three-year grant that also supports efforts to deepen national stakeholder engagement and increase library advocacy capacity.
“Libraries cannot wait to be invited to ‘the table.’ We need proactive, strategic and aligned advocacy to support national policies that advance the public’s interest in the digital age and support libraries as essential community assets,” writes Deborah Jacobs, director of the Global Libraries Program at the Bill & Melinda Gates Foundation, in a foreword (pdf) to the agenda (pdf).
The agenda flows out of library values and the imperative of “opportunity for all,” framed by national political, economic and demographic trends. It seeks to answer the questions “What are the U.S. library interests and priorities for the next five years that should be emphasized to national decision makers?” and “Where might there be windows of opportunity to advance a particular priority at this particular time?”
The agenda articulates two broad themes—building library capacity to advance national priorities and advancing the public interest. Among the areas for capacity building are education and learning, entrepreneurship, and health and wellness. Public interest topics include balanced copyright and licensing, systems for digital content, and privacy and transparency. The agenda also identifies specific populations for which there are significant demographic shifts or bipartisan opportunities to address specialized needs.
“National decision makers often don’t understand the roles or capabilities of modern libraries,” said Alan S. Inouye, director of ALA’s Office for Information Technology Policy and co-principal investigator of the Policy Revolution! initiative. “Thus, an underlying imperative of the agenda is communication about how modern libraries contribute to society. Progress on specific policy goals is significantly impeded if this broader understanding is lacking.”
“Sustainable libraries are essential to sustainable communities,” said Ken Wiggin, president of the Chief Officers of State Library Agencies (COSLA), which is a grant partner. “I believe this agenda will help unify and amplify our voices at the national level and can be customized for state-level action, as well.”
Using the Agenda, the ALA Washington Office will match priorities to windows of opportunity and confluence to begin advancing policy goals—in partnership with other library organizations and allies with whom there is alignment.
While initiated at different times, the Policy Revolution! initiative dovetails with the new proposed strategic framework and plan for the ALA, which focuses on three Strategic Directions: information policy, advocacy and professional and leadership development. “Taken together, along with a growing focus on transforming libraries, we are ‘connecting the dots’ across the profession and strengthening our collective voice,” said Larra Clark, deputy director of ALA’s Office for Information Technology Policy and co-principal investigator of the Policy Revolution! initiative.
Attendees at the ALA Annual Conference in San Francisco can learn more about the agenda and related advocacy at two programs. On Saturday, June 27, from 1-2:30 p.m., Policy Revolution! Senior Policy Counsel and partner at Arent Fox, Alan Fishel, will lead an interactive program on Negotiating to Advocacy Success with Clark. On Sunday, June 28, from 3 to 4 p.m., ALA Incoming President-Elect Julie Todaro will join Inouye and Wiggin to discuss Dollars for Local Libraries. More information on the initiative also is available online at www.ala.org/oitp.
The American Library Association (ALA) this week awarded Kathleen DeLaurenti the 2015 Robert L. Oakley Memorial Scholarship. The Library Copyright Alliance, which includes ALA, established the Robert L. Oakley Memorial Scholarship to support research and advanced study for librarians in their early-to-mid-careers who are interested and active in public policy, copyright, licensing, open access and their impacts on libraries.
DeLaurenti serves as the arts librarian at the College of William and Mary, where she led a user-centered re-design of the Music Library, including adding new equipment, collections, and services. She also is the first librarian at William and Mary to receive a Creative Adaption Grant to begin a pilot project to help faculty incorporate Open Educational Resources into their courses. The Oakley scholarship will support DeLaurenti’s work in copyright education, focusing on students’ understanding of music licensing and copyright basics.
“The support of the Oakley Scholarship would allow me to not only continue the next phase of this project to create music copyright learning modules, but it would provide the resources to involve students in curricular development and module creation,” said DeLaurenti.
The Oakley Scholarship awards a $1,000 scholarship to individuals or a team of individuals who meet eligibility criteria to encourage and expand interest in and knowledge of these aspects of librarianship, as well as bring the next generation of advocates, lobbyists and scholars to the forefront with opportunities they might not otherwise have.
“The Oakley scholarship is intended to support librarians in non-administrative positions who are less likely to have the funds necessary to build on their copyright interests,” said Carrie Russell, program director of the ALA Program for Public Access to Information, in a statement. “DeLaurenti’s project will ultimately be helpful to any librarian who works with library users with music copyright questions. Music copyright is about licensing; it’s complex, and it has always been a topic of great interest to librarians.”
Law librarian and professor Robert Oakley was an expert on copyright law and wrote and lectured on the subject. He served on the Library Copyright Alliance representing the American Association of Law Libraries (AALL) and played a leading role in advocating for U.S. libraries and the public they serve at many international forums including the World Intellectual Property Organization (WIPO) and the United Nations Educational, Scientific and Cultural Organization (UNESCO). He served as the United States delegate to the International Federation of Library Associations (IFLA) Standing Committee on Copyright and Related Rights from 1997-2003.
Oakley testified before Congress on copyright, open access, library appropriations and free access to government documents and was a member of the Library of Congress’ Section 108 Study Group. A valued colleague and mentor for numerous librarians, Oakley was a recognized leader in law librarianship and library management who also maintained a profound commitment to public policy and the rights of library users.
The post Arts librarian receives 2015 Robert Oakley scholarship appeared first on District Dispatch.
This spring, I taught a technology course for pre-service teachers. In addition to my MLS, I have a master’s degree in educational technology, a graduate certificate in online teaching and learning, and an undergraduate degree in education. My own schooling had taught me the importance of making pedagogically sound decisions and never using technology only for the sake of using technology. I quickly learned, though, that making those pedagogically sound decisions when looking into the eyes of students was a bit more challenging than I had originally thought.
[Image made available under a Creative Commons Attribution 3.0 License from http://quality.ecampusalberta.ca/]
As I reflected on my teaching after every class, I asked myself many questions including: How do we learn? How can I incorporate technology in a way that is beneficial for my students? How can I use technology in a seamless manner where the learning is not interrupted by inclusion of technology?
Once the spring semester ended and I was able to breathe, I started to think about how what I learned teaching a technology course could (and should) influence my work as a librarian. Overall, I think librarians do a pretty great job using technology, but I realized for me that many of the technology decisions I make in my day job as an academic librarian are not nearly as grounded in learning theory as I think they should be. When I was teaching a full course it was easier to think about theory and wrestle with these questions, but when I create libguides, build tutorials, make suggestions for the library website, and recommend new technology for the learning commons, how often do I first think about how we learn?
So here is my goal (I’m admitting it online and hoping the LITA community will support me in it): I want to start reading more books on learning theory and use that knowledge to influence all aspects of my work, specifically the technology that I use, since almost everything that I do is somehow connected to technology.
Current reading list:
What do you recommend that I read? Do you have any tips for connecting learning theory to non-teaching library technology responsibilities?
The following is a guest post by Barrie Howard, IT Project Manager at the Library of Congress.
The Digital Preservation Outreach and Education (DPOE) program is pleased to announce a successful outcome for two international Train-the-Trainer workshops. These workshops were recently held in Australia, and are the first of their kind to be held outside of the United States.
The first workshop (May 26-29, 2015) was hosted by the State Library Victoria in Melbourne, sponsored by a collaborative organization of public libraries in Victoria called the Public Libraries Victoria Network (PLVN). The second workshop (June 2-5, 2015) took place in Sydney at the State Library of New South Wales, sponsored by a ten member consortium of national, state and territory libraries of Australia and New Zealand, the National and State Libraries of Australasia (NSLA). In addition to these two international workshops, DPOE has previously delivered four domestic workshops, partnering with organizations across the nation.
The aim of the DPOE workshop is to produce a corps of trainers who are equipped to teach others the basic principles and practices of preserving digital materials. In this way, DPOE’s “teach-a-person-to-fish” model extends the benefits of a workshop well beyond those who can attend. There are many examples of DPOE trainers working together across jurisdictional and organizational boundaries to meet the needs of cultural heritage institutions of all shapes and sizes. DPOE trainers go on to develop training events of their own, and have delivered many webinars and workshops in the Midwest, Pacific Northwest, and Southeast regions of the United States, which will be replicated in regions across Australia in the coming year. Some of these examples have been highlighted in previous blog posts.
The DPOE Down Under workshops were well received due largely to the exceptional knowledge and leadership of three of the program’s anchor instructors: Mary Molinaro (University of Kentucky Libraries), Jacob Nadal (The Research Collections and Preservation Consortium), and Amy Rudersdorf (Digital Public Library of America). This extremely talented team has provided subject matter expertise to the program in the past. Over the last year, DPOE Program Manager George Coulbourne has convened two meetings of the core instructors to give the training curriculum a significant overhaul. The instructors worked with DPOE staff to review and revise training materials in anticipation of the back-to-back DPOE workshops in Australia, ensuring the curriculum is as relevant and up-to-date as ever.
The workshops are just one way that DPOE fosters outreach and education about digital preservation on a global scale. After a workshop, students graduate and enter into a vibrant network of practitioners, and continue to engage with each other–and the broader digital preservation community–online. DPOE supports this network by providing an email distribution list so practitioners can share information about digital preservation best practices, services, and tools, and to surface stories about their experiences in advancing digital preservation.
Additionally, DPOE maintains a training calendar as a public service to help working professionals discover continuing education, professional development, and training opportunities in the practice of digital preservation. The calendar is updated on a monthly basis, and includes training events hosted by DPOE trainers.
Updated 6/29/15 for typos.