Peter Murray: Thursday Threads: Patron Privacy on Library Sites, Communicating with Developers, Kuali Continued
Delivered by FeedBurner
In the DLTJ Thursday Threads this week: an analysis of how external services included on library web pages can impact patron privacy, pointers to a series of helpful posts from OCLC on communication between software users and software developers, and lastly an update on the continuing discussion of the Kuali Foundation Board’s announcement forming a commercial entity.
Before we get started on this week’s threads, I want to point out a free online symposium that LYRASIS is hosting next week on sustainable cultural heritage open source software. Details are on the FOSS4Lib site, you can register on the LYRASIS events site, and then join the open discussion on the discuss.foss4lib.org site before, during and after the symposium.
Feel free to send this to others you think might be interested in the topics. If you find these threads interesting and useful, you might want to add the Thursday Threads RSS Feed to your feed reader or subscribe to e-mail delivery using the form to the right. If you would like a more raw and immediate version of these types of stories, watch my Pinboard bookmarks (or subscribe to its feed in your feed reader). Items posted to my Pinboard bookmarks are also sent out as tweets; you can follow me on Twitter. Comments and tips, as always, are welcome.

Analysis of Privacy Leakage on a Library Catalog Webpage
My post last month about privacy on library websites, and the surrounding discussion on the Code4Lib list, prompted me to do a focused investigation, which I presented at last week’s Code4Lib-NYC meeting.
I looked at a single web page from the NYPL online catalog. I used Chrome developer tools to trace all the requests my browser made in the process of building that page. The catalog page in question is for The Communist Manifesto. It’s here: http://nypl.bibliocommons.com/item/show/18235020052907_communist_manifesto. …
So here are the results.
- Analysis of Privacy Leakage on a Library Catalog Webpage, by Eric Hellman, Go To Hellman, 16-Sep-2014
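Hellman used the browser’s developer tools because they see every request, including ones triggered by scripts. A rougher, static version of the same idea can be sketched in a few lines of Python: parse a page’s markup and list the third-party hosts it references. This is a minimal illustration, not Hellman’s actual method; the sample markup and host names below are hypothetical stand-ins, and a real audit would still need the developer tools to catch script-initiated requests.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class ResourceHostParser(HTMLParser):
    """Collect every host named in a src or href attribute."""
    def __init__(self):
        super().__init__()
        self.hosts = set()

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("src", "href") and value:
                host = urlparse(value).netloc
                if host:
                    self.hosts.add(host.lower())

def third_party_hosts(html, page_host):
    """Hosts referenced by the markup other than the page's own."""
    parser = ResourceHostParser()
    parser.feed(html)
    return sorted(h for h in parser.hosts if h != page_host.lower())

# Hypothetical markup resembling a catalog page that embeds a tracker
sample = (
    '<html><head>'
    '<script src="http://www.google-analytics.com/ga.js"></script>'
    '<link href="http://nypl.bibliocommons.com/styles.css" rel="stylesheet">'
    '</head><body>'
    '<img src="http://cover-images.example.com/18235020052907.jpg">'
    '</body></html>'
)
print(third_party_hosts(sample, "nypl.bibliocommons.com"))
# → ['cover-images.example.com', 'www.google-analytics.com']
```

Each host in that list is a party that learns which catalog record a patron viewed, which is exactly the leakage the post is concerned with.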
This is the first post in a series on software development practices. We’re launching the series with a couple of posts aimed at helping those who might not have a technical background communicate their feature requests to developers.
- Software Development Practices: What's the Problem?, by Shelly Hostetler, OCLC Developer Network, 22-Aug-2014
OCLC has started an excellent set of posts on how to improve communication between software users and software developers. The first three have been posted so far with another one expected today:
- Software Development Practices: What's the Problem?
- Software Development Practices: Telling Your User's Story
- Software Development Practices: Getting Specific with Acceptance Criteria
I’ve bookmarked them and will be referring to them when talking with our own members about software development needs.

Kuali 2.0 Discussion Continues
…I thought of my beehives and how the overall bee community supports that community/hive. The community needs to be protected, prioritized, supported and nourished any way possible. Each entity, the queen, the workers and the drones, all know their jobs, which revolve around protecting, supporting and nourishing the community.
Even if something disrupts the community, everyone knows their role and they get back to work in spite of the disruption. The real problem within the Kuali Community, with the establishment of the Kuali Commercial Entity now, is that various articles, social media outlets, and even the communication from the senior Kuali leadership to the community members have created a situation in which many do not have a good feel for their role in protecting, prioritizing, supporting and nourishing the community.
- The Evolving Kuali Narrative, by Kent Brooks, “I was just thinking”, 14-Sep-2014
The Kuali Foundation Board has set a direction for our second decade and at this time there are many unknowns as we work through priorities and options with each of the Kuali Project Boards. Kuali is a large and complex community of many institutions, firms, and individuals. We are working with projects now and hope to have some initial roadmaps very soon.
- Updates – Moving at the Speed of Light, by Jennifer Foutty, Kuali 2.0 Blog, 17-Sep-2014
As the library community that built a true next-generation library management system, the future of OLE’s development and long-term success is in our hands. We intend to continue to provide free and open access to our community designed and built software. The OLE board is strongly committed to providing a community driven option for library management workflow.
- Open Library Environment (OLE) & Kuali Foundation Announcement, by Bruce M. Taggart (Board Chair, Open Library Environment (OLE)), 9-Sep-2014
Building on previous updates here, the story of the commercialization of the Kuali collaborative continues. I missed the post from Bruce Taggart in last week’s update, and for the main DLTJ Thursday Threads audience this status update from the Open Library Environment project should be the most interesting. Given the lack of information, it is hard not to parse each word of formal statements for underlying meanings. In the case of Dr. Taggart’s post about OLE, I’m leaning heavily on wondering what “community designed and built software” means. The Kuali 2.0 FAQ still says “the current plan is for the Kuali codebase to be forked and relicensed under the Affero General Public License (AGPL).” As Charles Severance points out, the Affero license can be a path to vendor lock-in. So is there to be a “community” version that has a life of its own under the Educational Community License while the KualiCo develops features only available under the Affero license? It is entirely possible that too much can be read into too few words, so I (for one) continue to ponder these questions and watch for the plan to evolve.
“Dear Congressional Leaders –
We write to urge you to bring to the floor S. 607 and H.R. 1852, the bipartisan Leahy-Lee and Yoder-Polis bills updating the Electronic Communications Privacy Act (ECPA). Updating ECPA would respond to the deeply held concerns of Americans about their privacy. S. 607 and H.R. 1852 would make it clear that the warrant standard of the U.S. Constitution applies to private digital information just as it applies to physical property….”
… So said ALA today and more than 70 other civil liberties organizations, major technology companies and trade associations — including the U.S. Chamber of Commerce — in a strong joint letter to the leaders of the House and Senate calling for the soonest possible vote on bills pending in each chamber (S. 607 and H.R. 1852) to update the woefully outdated and inadequate Electronic Communications Privacy Act. To reach every Member of Congress and their staffs, the letter also was published as a full page advertisement in Roll Call, a principal Capitol Hill newspaper widely read inside the Beltway and well beyond.
When last discussed in DD in mid-June, H.R. 1852 (the Email Privacy Act) had been cosponsored by a majority of all Members of the House. Today, 265 members have signed on, but the bill still awaits action in Committee. With literally two work days remaining before the House and Senate recess for the November election, ALA and scores of its coalition partners wanted to remind all Members that these bills deserve a vote immediately after Congress returns in November.
Add your voice to that call too as Election 2014 heats up where you live! Attend a “Town Hall” meeting, call in to a talk radio show featuring a campaigning Congressperson, or simply call their local office and demand that Congress protect your emails, photos, texts, tweets and anything else stored in the “cloud” by voting on and passing S. 607 and H.R. 1852. Politics doesn’t get any more local and personal than the privacy of your electronic communications, which authorities don’t now need a warrant to pore over if they’re more than six months old.
Tell your Congressional Representative and Senators to update ECPA by passing S. 607 and H.R. 1852 as soon as they get back to Washington.
It can be difficult to respond to a question asked by a Member of Congress at a hearing when that person is talking about a different subject than you are and doesn’t know it. One observes a lot of talking past one another and frustration. One wants to stand up and say “wait a minute, you guys are talking about two different things,” but this kind of outburst is not appropriate at a Congressional hearing.
That happened today at the hearing called by U.S. House Judiciary Subcommittee on the Courts, Intellectual Property and the Internet. The topic was Chapter 12 of the copyright law and in particular, an administrative process conducted every three years by the U.S. Copyright Office called the 1201 rulemaking. But some thought the topic was digital rights management, and things got a little tense near the end of the hearing. Watch it for yourself.
There is a connection, and for clarity’s sake, let’s explore. The 1201 rulemaking was included in the Digital Millennium Copyright Act (DMCA) as a “safety valve” to ensure that technological protection measures (also known as digital rights management!) employed by rights holders to protect content would not also prevent non-infringing uses of copyrighted works, like analyzing software for security vulnerabilities, for example. Ask anyone, and they will tell you that the rulemaking is brutal. It’s long, convoluted and borders on the ridiculous. During this process, the U.S. Copyright Office evaluates specific requests for exemption from Section 1201’s otherwise blanket prohibition on “circumvention,” e.g., breaking pass codes, encryption or other digital rights management schemes in order to make a non-infringing use of a particular class of copyrighted works. In order to make such an argument, however, one who wants an exemption to the anti-circumvention provision must already have broken the anti-circumvention provision in order to make a non-infringing use of the work because you cannot speculate that a non-infringing use is possible without demonstrating that it is so.
The process can last eight months and includes writing detailed comments for submission, a reply comment period, two days of roundtables sometimes held in two or three places in the United States, and finally time for the U.S. Copyright Office, in collaboration with the National Telecommunications and Information Administration (NTIA), to write a lengthy report recommending to the Librarian of Congress which classes of works with technological protection measures can be circumvented for the next three years. Whew!
The Library Copyright Alliance (LCA) submitted comments arguing that the process certainly can be improved. Key LCA recommendations included that exemptions be permanent instead of lasting only three years, and that the NTIA (which has a better understanding of technology and innovation) administer the 1201 rulemaking process instead of the U.S. Copyright Office.
The good news. A baby step may have been taken. All of the witnesses agreed that some exemptions should be permanent so people do not have to reargue their case every three years. In addition, the Copyright Office already has made a suggestion to improve the rulemaking process, writing recently in the Federal Register:
Unlike in previous rulemakings, the Office is not requesting the submission of complete legal and factual support for such proposals at the outset of the proceeding. Instead, in this first step of the process, parties seeking an exemption may submit a petition setting forth specified elements of the proposed exemption; the Office will then review and consolidate the petitions and publish the list of proposed exemptions for further consideration.
Stay tuned for more news on the Copyright Office’s so-called “triennial” 1201 rulemaking which gives new meaning to the adage that “god (or the devil, if you prefer) is in the details.”
The Technical Advisory Committee will hold an open call on Tuesday, September 23 at 1:00 PM EDT. The agenda can be found below. To register, follow the link below and complete the short form.
- Tech road map: comments and feedback
- How the committee can help
- Topic: OS projects currently used / maintained by DPLA
- Topic: What SDKs / API libraries are desired but not yet extant
- Ingestion code: current state and next steps
- Current active development (CDL, other?)
- Updating the current license
All written content on this blog is made available under a Creative Commons Attribution 4.0 International License. All images found on this blog are available under the specific license(s) attributed to them, unless otherwise noted.
CHICAGO — The Library and Information Technology Association (LITA), a division of the American Library Association, is pleased to announce that applications are being accepted for three Scholarships:
LITA/Christian Larew Memorial Scholarship (sponsored by Baker & Taylor)
LITA/LSSI Minority Scholarship (sponsored by Library Systems and Services, LLC)
LITA/OCLC Minority Scholarship (sponsored by Online Computer Library Center)
The scholarships are designed to encourage the entry of qualified persons into the library technology field. The committees seek those who plan to follow a career in library and information technology, who demonstrate potential leadership, who hold a strong commitment to the use of automated systems in libraries and, for the minority scholarships, those who are qualified members of a principal minority group (American Indian or Alaskan native, Asian or Pacific Islander, African-American or Hispanic).
Candidates should illustrate their qualifications for the scholarships with a statement indicating the nature of their library experience, letters of reference and a personal statement of the applicant’s view of what he or she can bring to the profession, with particular emphasis on experiences that indicate potential for leadership and commitment to library automation. Economic need is considered when all other criteria are equal. Winners must have been accepted to an ALA recognized MLS Program.
You can apply for LITA scholarships through the single online application hosted by the ALA Scholarship Program. The ALA Scholarship Application Database will open Sept. 15.
References, transcripts and other documents must be postmarked no later than March 1, 2015 for consideration. All materials should be submitted to American Library Association, Scholarship Clearinghouse, c/o Human Resource Development & Recruitment, 50 East Huron Street, Chicago, IL 60611-2795. If you have questions about the LITA scholarships, please email the LITA Office at email@example.com.
The winners will be announced at the LITA President’s Program at the 2015 ALA Annual Conference in San Francisco.
Nominations are being accepted for the 2015 LITA/Library Hi Tech Award, which is given each year to an individual or institution for outstanding achievement in educating the profession about cutting edge technology through communication in continuing education within the field of library and information technology. Sponsored by the Library and Information Technology Association (LITA), a division of the American Library Association (ALA), and Library Hi Tech, the award includes a citation of merit and a $1,000 stipend provided by Emerald Group Publishing Limited, publishers of Library Hi Tech. The deadline for nominations is December 1, 2014.
The award, given to either a living individual or an institution, may recognize a single seminal work or a body of work created during or continuing into the five years immediately preceding the award year. The body of work need not be limited to published texts, but can include course plans or actual courses and/or non-print publications such as visual media. Awards are intended to recognize living persons rather than to honor the deceased; therefore, awards are not made posthumously. More information and a list of previous winners can be found at http://www.ala.org/lita/awards/hitech in the Awards and Scholarships section.
Currently serving officers and elected officials of LITA, members of the LITA/Library Hi Tech Award Committee, and employees and their immediate family of Emerald Group Publishing are ineligible.
Nominations must include the name(s) of the recipient(s), basis for nomination, and references to the body of work. Electronic submissions are preferred, but print submissions may also be sent to the LITA/Library Hi Tech Award Committee chair:
California State University, Los Angeles
5151 State University Dr
Los Angeles, CA 90032-4226.
The award will be presented at the LITA President’s Program during the 2015 Annual Conference of the American Library Association in San Francisco.
Emerald is a global publisher linking research and practice to the benefit of society. The company manages a portfolio of more than 290 journals and over 2,350 books and book series volumes. It also provides an extensive range of value-added products, resources and services to support its customers’ needs. Emerald is a partner of the Committee on Publication Ethics (COPE) and works with Portico and the LOCKSS initiative for digital archive preservation. It also works in close collaboration with a number of organizations and associations worldwide. www.emeraldgrouppublishing.com
Established in 1966, LITA is the leading organization reaching out across types of libraries to provide education and services for a broad membership of almost 3,000 system librarians, library administrators, library schools, vendors and many others interested in leading edge technology and applications for librarians and information providers. For more information, visit www.lita.org, or contact the LITA office at 800-545-2433, ext. 4268; or e-mail: firstname.lastname@example.org.
For further information, contact Mary Taylor at LITA, 312-280-4267.
Recently I mentioned to someone that the library speaker circuit is male-dominated, and she was surprised to hear it. It’s certainly a thing that feels overwhelming from the inside — I’ve been part of a 40% female speaker lineup in front of a 90% female audience — but maybe it’s not as much of a thing as I think?
Well. I counted speaker diversity at LITA Forum once; I can count keynote speakers at big library conferences too.
The takeaway: not as bad as I thought gender-wise but still pretty bad for a field that’s 80% female — except, oddly, library technology does better than the average. On the other hand, if you’re looking for non-white keynoters…it’s pretty bad.
In national-scale US/Canadian library conferences…
- 43% of speakers are female.
- 74% of speakers are white, 14% black, 7% Asian, 4% Hispanic.
In national-scale US/Canadian library technology conferences…
- 57% of speakers are female.
- 71% of speakers are white, 0% black, 21% Asian, 7% Hispanic.

(Ouch. I did not want to type that zero.)
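The arithmetic behind these figures is just a frequency count over the keynoter list, converted to percentage shares. A minimal sketch, using made-up sample data rather than the post’s actual dataset:

```python
from collections import Counter

def percentages(values):
    """Percentage share of each category, rounded to whole numbers."""
    counts = Counter(values)
    total = sum(counts.values())
    return {k: round(100 * v / total) for k, v in counts.items()}

# Hypothetical 14-keynoter sample, not the real speaker data
genders = ["F"] * 8 + ["M"] * 6
print(percentages(genders))  # {'F': 57, 'M': 43}
```

With only 14 keynoters in the libtech pool, a single speaker moves a category by about seven percentage points, which is why small samples like this should be read cautiously.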
I honestly didn’t expect library tech to do better than the average, gender-wise. This is partly a function of tiny little sample size – only 14 keynoters. But it’s also a reminder that a few people can have a lot of leverage. A big part of what you’re seeing here is that code4lib decided to care: code4lib members went out of their way to nominate female keynoters, and keynoters who can speak to feminist issues, and in the open vote that ensued, the two winners were female. LibTechConf organizers went out of their way to solicit diverse speakers, too. And either of them alone tips the scale to majority female keynoters in libtech.
Thanks, code4lib and LibTechConf. You’re awesome.

Details
I was looking specifically at keynote speakers — the ones who get invited, paid, and put on a stage in front of the full audience. The ones we showcase as representatives of our values and interests; the ones we value most, metaphorically and literally. The ones we ask.
Not everyone uses the term “keynote”; I also counted “opening/closing general session”, “plenary”, and (in the case of ALA Midwinter, which lacks all of those things) “auditorium speaker series”.
I looked at the most recent iteration of the following conferences:
AALL, AASL, Access, ACRL, ALA Annual, ALA Midwinter, ALSC national institute, ASIS&T, code4lib, DLF, LibTechConf, LITA Forum, MLA, OLA Super Conference, OLITA Digital Odyssey, PLA, and SLA. (YALSA’s Symposium doesn’t seem to have keynoters.)
That’s pretty much what I thought of off the top of my head, biased toward libtech since that’s where I have the most awareness. Happy to add more and update accordingly!

Reminder: why I do this
This is what I ask: when you walk into a room, count. Count the women. Count the people of color. Count by race. Look for who isn’t there. Look for class signs: the crooked teeth of childhoods without braces, worn-out shoes, someone else who is counting. Look for the queers, the older people, the overweight. Note them, see them, see yourself looking, see yourself reacting.
This is how we begin.
– Quinn Norton, Count
Nominations are invited for the 2015 Frederick G. Kilgour Award for Research in Library and Information Technology, sponsored by OCLC, Inc. and the Library and Information Technology Association (LITA), a division of the American Library Association (ALA). The deadline for nominations is December 31, 2014.
The Kilgour Research Award recognizes research relevant to the development of information technologies, in particular research showing promise of having a positive and substantive impact on any aspect of the publication, storage, retrieval and dissemination of information or how information and data are manipulated and managed. The Kilgour award consists of $2,000 cash, an award citation and an expense paid trip (airfare and two nights lodging) to the ALA Annual Conference.
Nominations will be accepted from any member of the American Library Association. Nominating letters must address how the research is relevant to libraries; is creative in its design or methodology; builds on existing research or enhances potential for future exploration; and/or solves an important current problem in the delivery of information resources. A curriculum vita and a copy of several seminal publications by the nominee must be included. Preference will be given to completed research over work in progress. More information and a list of previous winners can be found at
Currently-serving officers and elected officials of LITA, members of the Kilgour Award Committee and OCLC employees and their immediate family members are ineligible.
Send nominations by December 31, 2014, to the Award jury chair:
Purdue University Libraries
504 W State St
West Lafayette, IN 47907-4221
The Kilgour Research Award will be presented at the LITA President’s Program on June 29th during the 2015 ALA Annual Conference in San Francisco.
Founded in 1967, OCLC is a nonprofit, membership, computer library service and research organization dedicated to the public purposes of furthering access to the world’s information and reducing library costs. More than 72,000 libraries in 170 countries have used OCLC services to locate, acquire, catalog, lend, preserve and manage library materials. Researchers, students, faculty, scholars, professional librarians and other information seekers use OCLC services to obtain bibliographic, abstract and full-text information when and where they need it. For more information, visit www.oclc.org.
LITA is the leading organization reaching out across types of libraries to provide education and services for a broad membership including systems librarians, library administrators, library schools, vendors and many others interested in leading edge technology and applications for librarians and information providers. For more information, visit www.lita.org, or contact the LITA office by phone, 800-545-2433, ext. 4268; or e-mail: email@example.com
For further information, contact Mary Taylor at LITA, 312-280-4267.
The Library and Information Technology Association (LITA), a division of the American Library Association (ALA), is pleased to offer an award for the best unpublished manuscript submitted by a student or students enrolled in an ALA-accredited graduate program. Sponsored by LITA and Ex Libris, the award consists of $1,000, publication in LITA’s refereed journal, Information Technology and Libraries (ITAL), and a certificate. The deadline for submission of the manuscript is February 28, 2015.
The purpose of the award is to recognize superior student writing and to enhance the professional development of students. The manuscript can be written on any aspect of libraries and information technology. Examples include digital libraries, metadata, authorization and authentication, electronic journals and electronic publishing, telecommunications, distributed systems and networks, computer security, intellectual property rights, technical standards, desktop applications, online catalogs and bibliographic systems, universal access to technology, library consortia and others.
At the time the unpublished manuscript is submitted, the applicant must be enrolled in an ALA-accredited program in library and information studies at the master’s or PhD level.
To be eligible, applicants must follow the detailed guidelines and fill out the application form at:
Send the signed, completed forms by February 27, 2015 to the Award Committee Chair,
Kennesaw State University
1200 Chastain Rd NW MD# 0009
Kennesaw, GA 30144-5827.
Submit the manuscript to Sandra electronically at
by February 28, 2015.
The award will be presented at the LITA President’s Program during the 2015 ALA Annual Conference in San Francisco.
About Ex Libris
Ex Libris is a leading provider of automation solutions for academic libraries. Offering the only comprehensive product suite for electronic, digital, and print materials, Ex Libris provides efficient, user-friendly products that serve the needs of libraries today and will facilitate their transition into the future. Ex Libris maintains an impressive customer base consisting of thousands of sites in more than 80 countries on six continents. For more information about Ex Libris Group visit www.exlibrisgroup.com.
Established in 1966, LITA is the leading organization reaching out across types of libraries to provide education and services for a broad membership including systems librarians, library administrators, library schools, vendors and many others interested in leading edge technology and applications for librarians and information providers. For more information, visit www.lita.org, or contact the LITA office by phone, 800-545-2433, ext. 4268; or e-mail: firstname.lastname@example.org
For further information, please contact Mary Taylor at LITA, 312-280-4267.
New vacancy listings are posted weekly on Wednesday at approximately 12 noon Central Time. They appear under New This Week and under the appropriate regional listing. Postings remain on the LITA Job Site for a minimum of four weeks.

New This Week
Visit the LITA Job Site for more available jobs and for information on submitting a job posting.
Looks like Matt’s been spamming the roundup.
A different magazine delivered every month? Sounds cool.
The Idea Factory is the best thing I’ve read about the organization and process required to pump out innovation.
A short video doc on the @internetarchive. Love the IA culture and work.
A typeface for development work. Refined and monospaced.
The library is trusted. Deeply.
The following is a guest post by Patrick Rourke, an Information Technology Specialist and the newest member of the Library’s Viewshare team.
I made my first forays into computing on days when it was too cold, wet or snowy to walk in the woods behind our house, in a room filled with novels, atlases and other books. Usually those first programming projects had something to do with books, or writing, or language – trying to generate sentences from word lists, or altering the glyphs the computer used for text to represent different alphabets.
After a traumatic high school exposure to the COBOL programming language (Edsger Dijkstra once wrote that “its teaching should be regarded as a criminal offense” (pdf)), in college I became fascinated with the study of classical Greek and Roman history and literature. I was particularly drawn to the surviving fragments of lost books from antiquity – works that were not preserved, but of which traces remain in small pieces of papyrus, in palimpsests, and through quotations in other works. I spent a lot of my free time in the computer room, using GML, BASIC and ftp on the university’s time sharing system.
My first job after graduation was on the staff of a classics journal, researching potential contributors, proofreading, checking references. At that time, online academic journals and electronic texts were being distributed via email and the now almost-forgotten medium of Gopher. It was an exciting time, as people experimented with ways to leverage these new tools to work with books, then images, then the whole panoply of cultural content.
This editorial experience led to a job in the technical publications department of a research company, and my interest in computing to a role as the company webmaster, and then as an IT specialist, working with applications, servers and networking. In my spare time, I stayed engaged with the humanities, doing testing, web design and social media engagement for the Suda On Line project, which publishes a collaborative translation and annotation of the 10th-century Byzantine lexicon in which many of those fragments of lost books are found.
My work on corporate intranets and my engagement with SOL motivated me to work harder on extending my programming skills, so before long I was developing web applications to visualize project management data and pursuing a master’s degree in computer science. In the ten years I’ve been working as a developer, I’ve learned a lot about software development in multiple languages, frameworks and platforms, worked with some great teams and been inspired by great mentors.
I join the National Digital Information Infrastructure and Preservation Program as an Information Technology Specialist, uniting my interests in culture and computing. My primary project is Viewshare, a platform the Library makes available to cultural institutions for generating customized visualizations – including timelines, maps, and charts – of digital collections data. We will be rolling out a new version of Viewshare in the near future, and then I will be working with the NDIIPP team and the Viewshare user community on enhancing the platform by developing new features and new ways to view and share digital collections data. I’m looking forward to learning from and working with my new colleagues at the Library of Congress and everyone in the digital preservation community.
In the past I used NetBeans as my preferred environment for developing Islandora code, though I periodically tried Eclipse and other IDEs to see whether they had any new must-have features. At DrupalCon Portland in 2013 I noticed that many of the presenters were using PhpStorm, and developers spoke highly of it, so I thought I should give it a try.
Most of the code for Islandora is PHP, but some of the open source projects we rely on are written in Java or other languages, so instead of trying out PhpStorm I downloaded a trial of IntelliJ IDEA Ultimate Edition, which has the functionality of PhpStorm (via a plugin) plus support for many other languages and frameworks.
My first impressions of IDEA Ultimate Edition were good. It was quick to load (compared to NetBeans), and the user interface was snappy, with no lag for code completion. I also really liked the Darcula theme, which was easy on the eyes. That first impression was enough to make me think it was worthwhile to spend a bit more time with it, and the more I used it, the more I liked it. I have been using IDEA as my main IDE for a year now.
IDEA has many plugins and supports many frameworks for various languages, so initial configuration can take some time, but once you have things configured it works well and runs smoothly. Islandora has strict coding standards, and IDEA is able to help with this: we can point it at the same PHP_CodeSniffer configuration that the Drupal Coder module uses. IDEA then highlights anything that does not conform to the configured coding standards, and it will fix many of the formatting errors if you choose to reformat the code. The PHP plugin also has support for Mess Detector, Composer, and other tools.
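As a rough sketch of the kind of configuration involved (the file name, ruleset name, and paths here are illustrative assumptions, not taken from the Islandora project): a phpcs.xml ruleset at the project root can pull in the Drupal standard that the Coder module registers with PHP_CodeSniffer, and IDEA's inspections can then be pointed at the same file.

```xml
<?xml version="1.0"?>
<!-- phpcs.xml at the project root (hypothetical example).
     Assumes drupal/coder is installed via Composer, which registers
     the "Drupal" standard with PHP_CodeSniffer. -->
<ruleset name="IslandoraCodingStandards">
  <description>Check code against the Drupal coding standards.</description>
  <!-- Drupal code lives in several PHP file extensions besides .php -->
  <arg name="extensions" value="php,module,inc,install,test,theme"/>
  <rule ref="Drupal"/>
</ruleset>
```

Because `phpcs` on the command line reads the same file, the editor, other contributors, and any continuous-integration checks all report against one shared configuration.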
I also like the PHP debugger in IDEA. You can have several different configurations set up for various projects. While the debugger is a useful tool, I have run into some situations where it opens a second copy of a file in the editor, which can cause issues if you don't notice.
You can also open an SSH session within IDEA, which is great for running things like Git commands. The editor has built-in support for Git, Subversion, and other version control systems, but I prefer to use the command line for this, and in IntelliJ I can do so while still in the IDE.
IDEA also has good support for editing XML files and for running and debugging transforms within the IDE.
Overall, IntelliJ IDEA Ultimate is definitely worth trying! It is a commercial product, so be prepared to buy a license after your trial. There is also a free Community Edition, but be sure to check whether it supports PHP. Most of the functionality I discussed here is also available in PhpStorm, which is cheaper but doesn't support languages other than PHP, HTML and the like. If you are part of an open source project you can apply for an open source license (Islandora has one); if you qualify, the license is free.
Well, yes, almost any month could be “FCC month” with the number of proceedings that affect libraries and our communities, but September has been particularly busy. Monday we entered the next round of E-rate activity with comments in response to the Federal Communications Commission’s Further Notice of Proposed Rulemaking (emphasis added), and closed out a record-setting public comment period in relation to promoting and protecting the Open Internet with two public filings.
I’ll leave it to Marijke to give the low-down on E-rate, but here’s a quick update on the network neutrality front:
ALA and the Association of College & Research Libraries (ACRL) filed “reply” comments with a host of library and higher education allies to further detail our initial filing in July. We also joined with the Center for Democracy & Technology (CDT) to re-affirm that the FCC has legal authority to advance the Open Internet through Title II reclassification or a strong public interest standard under Section 706. This work is particularly important as most network neutrality advocates agree the “commercially reasonable” standard originally proposed by the FCC does not adequately preserve the culture and tradition of the internet as an open platform for free speech, learning, research and innovation.
For better or worse, these filings are just the most recent milestones in our efforts to support libraries’ missions to ensure equitable access to online information. Today the FCC is beginning to hold round tables related to network neutrality (which you can catch online at www.fcc.gov/live). ALA and higher education network neutrality counsel John Windhausen has been invited to participate in a roundtable on October 7 to discuss the “Internet-reasonable” standard we have proposed as a stronger alternative to the FCC’s “commercially reasonable” standard.
The Senate will take up the issue in a hearing today, including CDT President and CEO Nuala O’Connor. And a library voice will again be included in a network neutrality forum—this time with Sacramento Public Library Director Rivkah Sass speaking at a forum convened by Congresswoman Doris Matsui on September 24. Vermont State Librarian Martha Reid testified at a Senate field hearing in July, and Multnomah County Library Director Vailey Oehlke discussed network neutrality with Senator Ron Wyden as part of an event in May.
This month ALA also filed comments in support of filings from the Schools, Health and Libraries Broadband (SHLB) Coalition, the State E-rate Coordinators Alliance (SECA) and NTCA—The Rural Broadband Association, calling for eligible telecommunications carriers (ETCs) in the Connect America Fund to connect anchor institutions at higher speeds than those delivered to residents. Going further, ALA proposes that each ETC receiving CAF funding must serve every public library in its service territory at connection speeds of at least 50 Mbps download and 25 Mbps upload. Access and affordability are the top two barriers to increasing library broadband capacity, so both the Connect America Fund and the E-rate program are important components of increasing our ability to meet our public missions. AND we presented at the Telecommunications Policy Research Conference! Whew.
Buckle your seat belts and stay tuned, because “FCC Month” is only half over!
We were pleased to share yesterday that nearly 60,000 items from the Medical Heritage Library have made their way into DPLA, and we’re now doubly pleased to share that more than 148,000 items from the Government Printing Office’s (GPO) Catalog of U.S. Government Publications (CGP) are now also available via DPLA.
To view the Government Printing Office in DPLA, click here.
Notable examples of the types of records now available from the GPO include the Federal Budget, laws such as the Patient Protection and Affordable Care Act, Federal regulations, and Congressional hearings, reports, and documents. GPO continuously adds records to the CGP which will also be available through DPLA, increasing the discoverability of and access to Federal Government information for the American public.
“GPO’s partnership with DPLA will further GPO’s mission of Keeping America Informed by increasing public access to a wealth of information products available from the Federal Government,” said Public Printer Davita Vance-Cooks. “We look forward to continuing this strong partnership as the collection of Government information accessible through DPLA continues to grow.”
GPO is the Federal Government’s official, digital, secure resource for producing, procuring, cataloging, indexing, authenticating, disseminating, and preserving the official information products of the U.S. Government. The GPO is responsible for the production and distribution of information products and services for all three branches of the Federal Government, including U.S. passports for the Department of State as well as the official publications of Congress, the White House, and other Federal agencies in digital and print formats. GPO provides for permanent public access to Federal Government information at no charge through our Federal Digital System (www.fdsys.gov), partnerships with approximately 1,200 libraries nationwide participating in the Federal Depository Library Program, and our secure online bookstore. For more information, please visit www.gpo.gov.
To read the full GPO press release announcing its partnership with DPLA, click here.
All written content on this blog is made available under a Creative Commons Attribution 4.0 International License. All images found on this blog are available under the specific license(s) attributed to them, unless otherwise noted.
I’m writing up what I learned from teaching a jQuery workshop this past month. I’ve already posted on my theoretical basis and pacing. Today, stuff I did to create a positive classroom climate and encourage people to leave the workshop motivated to learn more. (This is actually an area of relative weakness for me, teaching-wise, so I really welcome anyone’s suggestions on how to cultivate related skills!)
Post-it notes
I distributed a bunch of them and had students put them on their laptops when they needed help. This lets them summon TAs without breaking their own work process. I also had them write something that was working and something that wasn’t on post-its at the end of Day 1, so I could make a few course corrections for Day 2 (and make it clear to the students that I care about their feedback and their experience). I shamelessly stole both tactics from Software Carpentry.
Inclusion and emotion
The event was conducted under the DLF Code of Conduct, which I linked to at the start of the course material. I also provided Ada Initiative material as background. I talked specifically, at the outset, about how learning to code can be emotionally tough; it pushes the limits of our frustration tolerance and often (i.e. if we’re not young, white men) our identity – “am I the kind of person who programs? do people who program look like me?” And I said how all that stuff is okay. Were I to do it over again, I’d make sure to specifically name impostor syndrome and stereotype threat, but I’ve gotten mostly good feedback about the emotional and social climate of the course (whose students represented various types of diversity more than I often see in a programming course, if less than I’d like to see), and it felt like most people were generally involved.
Oh, and I subtly referenced various types of diversity in the book titles I used in programming examples, basically as a dog-whistle that I’ve heard of this stuff and it matters to me. (Julia Serano’s Whipping Girl, which I was reading at the time and which interrogated lots of stuff in my head in awesome ways, showed up in a bunch of examples, and a student struck up a conversation with me during a break about how awesome it is. Yay!)
As someone who’s privileged along just about every axis you can be, I’m clueless about a lot of this stuff, but I’m constantly trying to suck less at it, and it was important to me to make that both implicit and explicit in the course.
Tomorrow, how ruthless and granular backward design is super great.