Winchester, MA: The hosted DSpaceDirect repository service just got better!
From Thembani Malapela, Knowledge and Information Management Officer, Food and Agriculture Organization of the United Nations (FAO)
First, download your data from your Google Search History. Yeah, creepy. Then install jq. Wait for the email from Google that your archive is ready, then download and unzip it. Open a terminal window in the Searches directory, and run this:

jq --raw-output '.event.query.query_text' *.json | sort | uniq -c | sort -rn | head -10
Here’s what I see for the 75,687 queries I’ve typed into Google since July 2005:

309 google analytics
130 hacker news
116 this is my jam
 83 site:chroniclingamerica.loc.gov
 68 jquery
 54 bagit
 48 twitter api
 44 google translate
 37 wikistream
 37 opds
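If you don’t have jq handy, the same counting can be done with a short Python sketch. This assumes each archive file carries the same .event.query.query_text layout the jq filter reads; Google has shipped Takeout in a few shapes over the years, so treat the structure handling here as an assumption.

```python
# Count your most frequent queries without jq. Assumes each JSON file in
# the Searches directory has the .event.query.query_text shape that the
# jq filter above expects (the exact Takeout layout has varied).
import glob
import json
from collections import Counter

def top_queries(directory, n=10):
    counts = Counter()
    for path in glob.glob(f"{directory}/*.json"):
        with open(path) as f:
            data = json.load(f)
        event = data.get("event")
        # Handle both a single event object and a list of events.
        events = event if isinstance(event, list) else [event]
        for e in events:
            if e and "query" in e:
                counts[e["query"]["query_text"]] += 1
    return counts.most_common(n)
```

Calling `top_queries("Searches")` should produce the same kind of ranked list as the shell pipeline.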
These are (mostly) things that I hadn’t bothered to bookmark, but visited regularly. I suspect there is something more compelling and interesting that could be done with the data. A personal panopticon perhaps.
Oh, and I’d delete the archive from your Google Drive after you’ve downloaded it. If you ever grant other apps the ability to read from your drive they could read your search history. Actually maybe this whole exercise is fraught with peril. You should just ignore it.
Part 2-an of Amazon crawl.
This item belongs to: data/ol_data.
This item has files of the following types: Data, Metadata, Text
E-rate is nothing if not opportunistic, and opportunity struck while hundreds of librarians combed congressional offices during National Library Legislative Day to talk with representatives about funding the Library Services and Technology Act (LSTA) and to support programs for school libraries as well as Net Neutrality and privacy issues. I was able to sneak two away from the Leg Day activities to meet with Tom Wheeler, Chairman of the Federal Communications Commission. ALA President Courtney Young and E-rate Task Force Chair Kathi Peiffer presented Chairman Wheeler with the Council Resolution adopted at the 2015 Midwinter meeting honoring him for his leadership and vision during the recent E-rate Modernization proceeding.
For those of you who diligently followed the E-rate proceeding—spanning about 18 months—you are well aware of the many negotiations we took part in and the many issues we waded through to get from the initial Notice of Proposed Rulemaking to the final two Modernization orders. You are also aware of the habit I got into of incorporating my family into my blog posts (well, the poor things were subjected to long monologues on the various merits of our different proposals and were half neglected during the rush up to the Commission votes on the two orders). I am still mastering millennial speak, but I am confident the above title for the post accurately summarizes the heart and intent of the Resolution.

Clear Vision, Firm Leadership, Strong Commitment (oh, and quite a bit of tenacity)
Because Chairman Wheeler began with a clearly articulated vision for the E-rate program as the Commission set about revamping the 18-year-old program to fit the 21st-century broadband needs of libraries, schools, and the public they serve, he and the Commission staff gained our respect early in the proceeding. The willingness of the staff to listen to our numerous “pitches,” provide concrete feedback, and ask challenging questions that made us justify our positions helped build confidence that our concerns were being heard and, to the extent feasible, addressed successfully—knowing that ALA and libraries are but one stakeholder and that statutes guide Commission action. It is rare that a stakeholder group can claim to be a partner in a rulemaking proceeding; in this instance it is an accurate claim. Though we wavered at times, Chairman Wheeler’s leadership and commitment to addressing the barriers preventing libraries and schools from attaining the broadband capacity they need made it possible to put faith in the regulatory process.
We are pleased to have the opportunity to recognize the Chairman, the Commissioners, and the staff with a symbolic gesture that we hope captures the magnitude of our appreciation of the effort invested in bringing the Modernization proceeding to a successful close.
“Resolved, that the American Library Association, on behalf of its members:
- extends its deepest appreciation to Chairman Wheeler for his vision for connecting America’s libraries and schools to high-capacity broadband to best serve our communities nationwide.
- recognizes with gratitude Chairman Wheeler and the Commission for their unflagging work throughout the 18-month E-rate Modernization proceeding.”
While the drama of E-rate modernization is behind us, the more sedate but equally critical work of implementing the changes and getting the word out to library applicants is ongoing. The relationships we built and strengthened within the library community as well as outside remain important. As such, we continue to work on preparing for the 2016 funding year so libraries are fully able to take advantage of the opportunities laid out through the Modernization proceeding. We have continued our relationship with the Commission staff and with USAC and prepared guiding principles to serve as a reference document as we discover ways to partner with these agencies on behalf of libraries.
We are working closely with the Chief Officers of State Library Agencies (COSLA) to assess where strong state leadership will have the most positive impact for local libraries. We have connected with the Public Library Association (PLA), the Association for Rural and Small Libraries (ARSL), the American Indian Library Association (AILA), the Association of Tribal Libraries, Archives, and Museums (ATALM), and the Urban Libraries Council (ULC) to learn more about the unique challenges their respective members may have, understand how our work can support other efforts underway for the library community, and see where we can make a unique contribution. And of course the E-rate Task Force is working on specific outreach and resources to support the field. The public face of E-rate is quiet at the moment, but rest assured, E-rate never sleeps.
I will admit to owning just over one bitcoin, largely as an experiment. It’s no more than I’m willing to lose outright, so you can say that I sleep soundly at night. But a while back I came across this piece at O’Reilly Radar and it changed my perspective on bitcoin entirely.
The author argues (persuasively in my opinion) that the real innovation isn’t bitcoin, but the blockchain which makes it possible. He describes five key concepts for understanding the blockchain, how it works, and how it can change how we do transactional activities on the web:
- “Decentralized consensus: …A decentralized scheme, on which the bitcoin protocol is based, transfers authority and trust to a decentralized virtual network and enables its nodes to continuously and sequentially record transactions on a public “block,” creating a unique “chain”: the blockchain…What’s important here is that with this degree of unbundling, the consensus logic is separate from the application itself; therefore, applications can be written to be organically decentralized, and that is the spark for a variety of system-changing innovations in the software architecture of applications, whether they are money or non-money related.
- “The blockchain (and blockchain services): A blockchain is like a place where you store any data semi-publicly in a linear container space (the block). Anyone can verify that you’ve placed that information because the container has your signature on it, but only you (or a program) can unlock what’s inside the container because only you hold the private keys to that data, securely. So, the blockchain behaves almost like a database, except that part of the information stored — its “header” — is public…the blockchain acts as an alternative value transfer system that no central authority or potentially malicious third party can tamper with.
- “Smart contracts (and smart property): Smart contracts are the building blocks for decentralized applications. The basic idea behind smart contracts is that a transaction’s contractual governance between two or more parties can be verified programmatically via the blockchain, instead of via a central arbitrator, rule maker, or gatekeeper…The starting point that you assume when applying smart contracts is that third-party intermediaries are not needed in order to conduct transactions between two (or several) parties.
- “Trusted computing (or trustless transactions): When you combine the concepts behind the blockchain, decentralized consensus, and smart contracts, you start to realize they are enabling the spread of resources and transactions laterally in a flat, peer-to-peer manner, and in doing that, they are enabling computers to trust one another at a deep level…If you fast forward to a not-too-distant future, smart contracts and smart property will be created, dispensed, or executed routinely between consenting parties, without either of them even knowing that blockchain technology was the trusted intermediary.
- “Proof of work (and proof of stake):…The “proof of work” is a “right” to participate in the blockchain system. It is manifested as a “big enough hurdle” that prevents users from changing records on the blockchain without re-doing the proof of work. So, proof of work is a key building block because it cannot be “undone,” and it is secured via the strengths of cryptographic hashes that ensure its authenticity. But proof of work is expensive to maintain…and may run into future scalability and security issues because it depends solely on the miners’ incentives, which will be declining over time. An upgraded solution is “proof-of-stake,” which is cheaper to enforce but more expensive and more difficult to compromise. Proof of stake not only determines who gets to update the consensus, but it also prevents unwanted forking of the underlying blockchain.”
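The chaining and proof-of-work ideas quoted above can be made concrete with a toy sketch. This is an illustration only, nothing like the actual bitcoin implementation: each block commits to the previous block’s hash, and the “proof of work” is a nonce that makes the block’s hash clear a difficulty hurdle, so tampering with any earlier record forces all later proofs to be redone.

```python
# A toy hash chain with proof of work (illustration only, not bitcoin).
import hashlib

def block_hash(prev_hash, data, nonce):
    return hashlib.sha256(f"{prev_hash}|{data}|{nonce}".encode()).hexdigest()

def mine(prev_hash, data, difficulty=3):
    """Search for a nonce whose hash starts with `difficulty` zeros:
    the 'big enough hurdle' the excerpt describes."""
    nonce = 0
    while True:
        h = block_hash(prev_hash, data, nonce)
        if h.startswith("0" * difficulty):
            return {"prev": prev_hash, "data": data, "nonce": nonce, "hash": h}
        nonce += 1

def valid_chain(chain, difficulty=3):
    """A chain is valid only if every block's hash checks out, clears the
    difficulty hurdle, and links to its predecessor's hash."""
    for i, b in enumerate(chain):
        if b["hash"] != block_hash(b["prev"], b["data"], b["nonce"]):
            return False
        if not b["hash"].startswith("0" * difficulty):
            return False
        if i > 0 and b["prev"] != chain[i - 1]["hash"]:
            return False
    return True
```

Change the data in any block and `valid_chain` fails for everything downstream unless the proof of work is redone, which is the property that makes the ledger tamper-evident.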
The author goes on to say “…we must see beyond the bitcoin promise to be the Internet of money, and into the blockchain’s promise to become a new development environment, just as web development was the new paradigm back in 1996.”
He identifies four emerging market segments for blockchain-based applications which you should see the original piece to find out more about. I’ll allow the author, William Mougayar, to have the last word:
“The reality is that the crypto-led computer science revolution is giving us concepts that go way beyond a one-currency type of scenario. Yes, bitcoin is programmable money, but the blockchain is also programmable value, programmable governance, programmable contracts, programmable ownership, programmable trust, programmable assets, etc. And we have barely scratched the surface on these applications.”
Well, interested Access-goers, you seem like a bunch of humans that would really enjoy helping out deserving humans. Well, do we have an opportunity for you!
You can do something awesome for yourself and for another by donating to our IndieGoGo campaign. We’re aiming to raise some money to offset the costs of attending the conference for those who wouldn’t normally be able to attend. We’ve already allocated money for two diversity scholarships, but we’re committed to raising as much money as possible to fund additional scholarships.
Check out the purpose of the scholarships and learn more about the campaign before making your contribution. I’m not going to lie, we have some sweeeeet perks, so get on it! We can’t be held responsible for how fast those puppies go*.
We hope you’ll consider donating so that AccessYYZ can be as inclusive and welcoming as possible!
*By puppies, we mean perks. We would never give away puppies.
In late April, the American Library Association (ALA) Washington Office had the pleasure of hosting 14 graduate students from the College of Information Studies at the University of Maryland. These students—all with interests in information policy—came to learn and discuss the wide range of our policy work and how we approach it—and enjoy a buffet of Chinese food. ALA Washington Office Executive Director Emily Sheketoff, ALA Office for Information Technology Policy (OITP) Deputy Director Larra Clark, and I represented ALA.
The discussions were wide ranging. “What is the nature and scope of information policy” deservedly took some of our time together. We spent time talking about the importance and role of coalitions and their benefits and challenges. Larra Clark led a discussion on net neutrality that served as a case study, featuring an important and challenging topic that has been (and remains) on our agenda. Overall, the discussions revealed the layers of the onion—the nuances of information policy and politics—underneath what’s reported in the popular press.
We were pleased to have the students visit us—and indeed one part of our mission is to cultivate interest and support of information policy in new entrants to the profession. The visit also builds on our myriad relationships and work with the College, and with professors John Bertot, co-director of the University of Maryland’s Information Policy & Access Center (iPAC), and Paul Jaeger, associate professor and iPAC co-director, in particular. For example, the ALA Office for Information Technology Policy is a partner on the in-process Digital Inclusion study funded by the Institute of Museum and Library Services, the successor to the Public Library Funding and Technology Access Study (PLFTAS), a long-time collaboration between ALA and the University of Maryland.
Included in the Maryland delegation were Johnna Percell, this year’s Google Policy Fellow here in OITP, and Lynne Bradley, an adjunct instructor at the College and, of course, our former director of government relations. Many thanks to Lindsay Sarin for orchestrating the visit from the Maryland side and to Ros Reynolds, assistant director of Administration for the ALA Washington Office, for her assistance here in the office.
The post Univ. of Md. information policy students visit the Washington Office appeared first on District Dispatch.
Library of Congress: The Signal: Users, Use Cases and Adapting Web Archiving to Achieve Better Results
The following is a guest post from Michael Neubert, a Supervisory Digital Projects Specialist at the Library of Congress.

In a blog post about six months ago I wrote about how the Library of Congress web archiving program was starting to harvest “general” internet news sites such as Daily Kos, Huffington Post and Townhall.com, as well as newer sites such as news.vice.com and verge.com.
Many of these sites are extremely large. How large? While not an exact count (and in fact, far from it), use of the “site” limiter in Google will provide a count of digital objects found and indexed by Google (which is a far larger number than the number of web pages, but gives some sense of relative scale to other sites). A “site:huffingtonpost.com” search in Google returns “about 3,470,000 results.” That is large.
When harvesting sites like these in a “traditional” way, the harvesting starts at the home page and follows links found on pages that are in scope, capturing each page and the digital bits and pieces (such as JPEG images) so that playback software can recreate the page accurately later. For example, with HuffingtonPost.com that would mean following links within huffingtonpost.com and not links out to some other domain.
Unfortunately such sites are so large that the harvesting process runs out of time. Even though we were harvesting (to stay with this example) HuffingtonPost.com on a weekly basis, capturing whatever we could manage to get each time, there was no assurance that over time we would capture all the content published to the site even once, since with each harvest the process of winding through the site started over again but then followed a different path.
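The scoped, budget-limited crawl described above can be sketched in a few lines. This is a minimal illustration, not the Library’s actual crawler configuration: start at the home page, queue only links on the same host, and stop when a page budget (standing in for the time limit) runs out.

```python
# A minimal sketch of "traditional" scoped harvesting (illustration only):
# breadth-first from the home page, in-scope links only, hard budget.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkParser(HTMLParser):
    """Collect href values from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, fetch, budget=100):
    """fetch(url) -> HTML string; returns the set of in-scope URLs captured
    before the budget runs out."""
    scope = urlparse(start_url).netloc
    queue, seen = [start_url], set()
    while queue and len(seen) < budget:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        parser = LinkParser()
        parser.feed(fetch(url))
        for href in parser.links:
            absolute = urljoin(url, href)
            # Stay in scope: same host only, as with huffingtonpost.com above.
            if urlparse(absolute).netloc == scope and absolute not in seen:
                queue.append(absolute)
    return seen
```

Because the budget can expire before the frontier is exhausted, and each crawl may wind through the site along a different path, repeated runs give no assurance of ever covering everything once, which is exactly the problem described above.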
As we reviewed the early results of harvesting very-large-news-sites, I began to think about the different use cases that have been connected with web archiving by the International Internet Preservation Consortium.
What is the use case served by trying to capture the entire site in one visit (leaving aside completeness)? Presumably it is that the archive ends up with multiple copies of the same news item over time, perhaps showing that some changes were made for one reason or another. This is the accountability use case:
Crawling websites over time allows for modifications to content to be observed and analyzed. This type of access can be useful in ensuring accountability and visibility for web content that no longer exists. On one hand, companies may archive their web content as part of records management practices or as a defense against legal action; on the other, public web archives can show changes in governmental, organizational, or individual policy or practices.
This would be a fine use case for our efforts to support, but if we aren’t able to reliably harvest all the content at least once, it seems less than helpful for users of the archived site to have multiple copies of some news items and none of others (on a completely random basis, from the user perspective).
As it turns out, the IIPC has a different use case for “news in the 21st century”:
Blogs, Tweets, and status updates on social media are just as likely sources for news today as traditional newspapers and broadcasts. Traditional media have also adapted to deliver news online. Libraries and archives have always collected newspapers, these are the core collections of many local historical societies. If the news that is distributed online is not preserved there will be a huge hole in our collective memory.
For this use case, completeness of harvest, getting at least one copy of every news story published on the site, is a more useful outcome over time than having multiple copies of only some of the stories. And there is another use case that will be better served by completeness: text mining. The Library of Congress does not now support any text mining tools for interacting with its web archives, but when it does (someday), completeness of capture of all that was published will be more important than multiple copies of mostly the same thing. But how do we achieve this so-called completeness, if not by attempting regular top-to-bottom harvesting?
Borrowing from an approach used by Icelandic colleagues, we have tried to achieve a more complete harvest over time by making use of RSS feeds provided by the news sites. Over the course of a week, RSS for articles that are produced by “target” news sites (such as HuffingtonPost.com) are aggregated into one-time use “seed lists” and the crawler then can go to the news site and harvest just those seeds, reliably. Although there is a certain extra effort in this approach in building custom seed lists week after week, over time (by which I mean years) it will assure completeness of capture. We should get at least one capture of every new item published moving forward in time. This is good.
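The seed-list approach described above can be sketched as follows. This is an illustration of the idea, not the Library’s production workflow: over the week, pull each target site’s RSS feed and accumulate the article URLs into a one-time seed list for the crawler.

```python
# A sketch of RSS-based seed-list building (illustration only): aggregate
# article URLs from target sites' feeds into a deduplicated seed list.
import xml.etree.ElementTree as ET

def article_urls(rss_xml):
    """Extract the <link> URL of each <item> in an RSS 2.0 document."""
    root = ET.fromstring(rss_xml)
    return [item.findtext("link").strip()
            for item in root.iter("item")
            if item.findtext("link")]

def build_seed_list(feed_urls, fetch):
    """fetch(feed_url) -> RSS XML string; returns an ordered, deduplicated
    list of article URLs to hand to the crawler as one-time seeds."""
    seeds, seen = [], set()
    for feed_url in feed_urls:
        for url in article_urls(fetch(feed_url)):
            if url not in seen:
                seen.add(url)
                seeds.append(url)
    return seeds
```

Run against the week’s accumulated feed snapshots, this yields exactly the set of new article URLs for the crawler to capture reliably, rather than hoping a top-to-bottom crawl reaches them.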
We will also make occasional attempts (maybe twice a year) to harvest a whole site thoroughly, to fill gaps, help with completeness, and provide multiple captures of some content.
What will this mean for future users of these archived news sites? As we begin to make these sites that depend on harvesting-by-RSS-seed-URL available, users may find more broken link problems when browsing (although maybe not – it simply isn’t clear). While our present interface is an archive that reproduces the browse-the-site experience and it can be useful for users to have the “time machine” experience of browsing older versions of a site, this is not the only significant use case. If a user has a particular URL and wants to see it in the archive, browsing is not necessary. We still need to add textual search to support our users, but that would also offer a path around broken links. And over time (again, years) the completeness of coverage on an ongoing basis will build more reliability.
That is what a national library is supposed to be about – building significant collections over time, steadily and reliably. And this is where we want web archiving to be.
American Library Association President Courtney Young brought national attention to the history of library advocacy around privacy, surveillance and the USA PATRIOT Act this week when Beltway powerhouse publication The Hill published Young’s moving op-ed “Long Lines for Freedom.” In the op-ed, Young discusses the library community’s vocal involvement in opposing the PATRIOT Act:
Today, close to 400 librarians and library supporters from every state in the nation will line up to enter every Congressional office building. Each individual advocate in those queues will be effectively standing in for more than 10,000 of the well over 4,000,000 people who use public, academic, and school libraries in America every day.
As they wait patiently to clear security, the American Library Association’s citizen lobbyists will review the positions that they and the millions “behind them” urge Members of Congress to take on a sweeping range of issues: robust federal funding for school libraries and literacy programs, maximum taxpayer access to government information, copyright that correctly balances protection with sparking innovation, and assuring network neutrality.
No issue, however, will be closer to their hearts — nor is of more concern to enormous numbers of Americans — than finally recalibrating the nation’s privacy and surveillance laws. Change is urgently needed to restore to our laws respect for Americans’ civil liberties compromised in 2001, and still badly undermined today, by the USA PATRIOT Act, the outdated Electronic Communications Privacy Act, and several “cybersecurity” proposals under consideration.
The hundreds of librarians and library supporters “hitting the Hill” are part of another long line: one comprised of tens of thousands of their colleagues who, sometimes at great personal risk, have stood for many decades for the rights of library patrons’ reading, borrowing and now internet surfing records to be safe from sweeping and literally “un-warranted” bulk collection by law enforcement authorities and the indefinite retention of that information. Courageous librarians in Connecticut remain among the few to openly and legally challenge a National Security Letter seeking such records under the USA Patriot Act, and its associated total “gag order.” The case was withdrawn before a federal judge could review its merits, leaving the heroes known now as the “Connecticut Four” free to speak out, as they have for years, against the power and perils of this flawed law.
Moving forward, ALA’s Office of Government Relations will continue to work with privacy coalition partners to call for reforms to our nation’s surveillance laws. The Association has recently lent its support to the passage of an unweakened USA FREEDOM Act.
The cornerstone of Agile development is the sharing of information. An Agile team that does not communicate well is destined to fail: the focus on efficiency and short, independent development cycles means development moves at a rapid pace, and there is little slack in the timeline to absorb communication hiccups. Therefore, each member of the team needs to be aware of what everyone else is working on, as well as any impacts and dependencies between her assigned tasks and those of her coworkers. With sprints lasting two or three weeks each, it’s imperative that team members be proactive about sharing the status of their work. While this can (and should) happen as informal conversations between team members, Agile provides a tool to encourage frequent communication: the daily standup.
What it Is
The daily standup is a meeting where team members discuss their own immediate goals and challenges. It is typically held at the beginning of the workday (to maximize the impact of the shared information) and should last no more than ten to fifteen minutes. You don’t actually have to be standing up; that is only a convention that encourages brevity and informality (almost as if you all ran into each other on the way to your desks!). It should take place somewhere close to the actual work area (a hallway or other open space works well); ideally this will be a place where the team has access to its chosen tracking tool (the big Post-It board, a projector screen showing the GreenHopper task board, etc.). Above all, it should be fast-paced and lively. A good standup allows the team to track its progress towards sprint deliverables, increases efficiency, and encourages open communication and commitment to the team.
What it Isn’t
The daily standup is not a status or planning meeting. If you need to discuss changing requirements or priorities, call a separate meeting. If you need to figure out how much team members are accomplishing, wait for the next one-on-one update (or look at your tracking tool!). There should be no need for minutes, and no complex technical discussions; if those are needed they should be held separately (right after the standup would be a great time) and involve only the necessary parties.
Pigs and Chickens
Who should attend daily standups? A common analogy used by Agile devotees is the one about pigs and chickens. If you think of the development process as a hearty breakfast plate, there are two types of contributors: the chickens, who supply the eggs, and the pigs, who supply the bacon/sausage/ham meat product (ignore the toast and OJ for now). The main difference between the two groups is the level of commitment. In software development, you can look at stakeholders as either chickens (product managers, executives, salespeople, anyone who provides input into the process) or pigs (developers, engineers, and anyone else who performs actual development work). Standup attendance is mandatory for the pig contingent, and they will all be expected to contribute; chicken attendance is optional (although the Product Manager should be there every day) and they should only listen, unless they are called on to provide context.
The Three Questions
During a typical standup, core team members (they usually dislike being referred to as pigs, no matter how useful the above analogy may be) will take turns and briefly answer three questions:
- What did I work on yesterday?
- What am I working on today?
- Is there anything or anyone keeping me from completing the work described in #2?
That’s it! Once everyone has had a turn, the meeting is over, and the team gets on with the day’s work. If an issue came up during the meeting (usually during question #3), then the relevant parties can meet to discuss a good solution, but again, that’s another meeting. Remember, this is not about planning or negotiation. The goal of the standup is for every team member to find out what everyone else is up to, increasing mutual awareness and strengthening team relationships and communication paths, and to surface potential issues or conflicts so they can be dealt with before they cause bigger problems.
How efficient do you think your team’s standup meetings are? Do you find it difficult to avoid planning or technical rabbit holes? What’s your favorite technique for keeping standups lively?
As LITA President, one of my initiatives for my presidential year was to improve the member experience. I’ve been doing this by applying user experience concepts that I’m familiar with in my everyday job to effect change and improve the overall experience for current members and those who are on the fence about joining. The LITA member experience encompasses all aspects of a member’s interaction with the association, including its programming, educational opportunities, publications, events, and even other members. The LITA Board, Committees, and ad hoc Task Forces have been instrumental to making a positive difference in the experiences of our members.
Therefore, it was important for me to pick someone for my President’s Program at ALA Annual who is considered an expert on user experience (UX). Accomplished information architect and author Louis Rosenfeld and I will discuss the curious world of user experience at the 2015 LITA President’s Program at the ALA Annual Conference in San Francisco, on Sunday, June 28th, 2015, from 3-4 p.m. I know that he’s excited to tackle your questions on the importance of UX research and what libraries can do to provide better experiences.
As we approach ALA Annual, I put together a fantastic team of people to help me plan not one, but TWO exciting events that lead up to the President’s Program. Here they are:
I hope that you consider participating in these exciting opportunities to share what YOU know about UX with others in the library community, and learn from others. And I look forward to seeing you in San Francisco at my President’s Program with Lou Rosenfeld!
Rachel Vacek, and the LITA President’s Program Planning Team: Amanda Goodman, Whitni Watkins, Isabel Gonzalez-Smith, Haley Moreno, Bohyun Kim, along with additional help from our friends on the RUSA User Experience Design Committee, Lauren McKeen and Rachael Cohen
In case you haven’t noticed, user experience (UX) is all the buzz in libraries lately. If you aren’t already in the thick of a UX project now, you’re likely thinking of ways to start one.
Accomplished information architect and author Lou Rosenfeld will discuss the curious world of user experience at the 2015 LITA President’s Program at the ALA Annual Conference in San Francisco, on Sunday, June 28th, 2015, from 3-4pm. He’ll tackle your questions with moderation from LITA President Rachel Vacek on the importance of UX research and what libraries can do to provide better experiences. This is a conversation you won’t want to miss!
As we approach ALA Annual, the LITA President’s Program Planning Team invites you to share your library’s successful UX projects through the LITA President’s Program Contest: Great Library UX Ideas Under $100. Join us as we celebrate library innovation!
Have you done something great to improve the physical or web user experience in your library? Were you able to complete the entire project for less than $100? Tell us about it!
Using the form below, submit a description of your UX project. The project can be in progress or recently completed and can be related to any aspect of library services: technology, spaces, resources, or programs – so long as it is user-centered, improved the UX, and has been or will be completed with a budget of $100 or less!
- Submissions need to include a brief description of the problem, the project’s goals and desired outcomes, methodology or assessment, any technology utilized, the budget, what changes were implemented, and a description of how those changes improved the library users’ experience.
- Librarians and library staff from all types of libraries are encouraged to participate. Projects must have occurred within a library setting.
- Project descriptions must be 500 words or less.
- You do not need to be a LITA member to participate in the contest.
- Only one submission per project will be considered.
- Projects completed prior to January, 2014 will not be considered.
To enter, complete the Contest Submission Form by 12:00 AM CDT on May 31, 2015.
Selection Process and Criteria
The LITA President’s Program Planning Team will select one winning project. Notable entries may receive mention on the LITA Blog or other ALA platforms. Winners will be notified via email the week of June 15, 2015.
Submissions will be judged on the following criteria:
- Innovation: How creatively did you use limited resources to assess and solve a problem?
- Simplicity: How easily could this project be implemented elsewhere?
- Impact: How did your project positively affect the library users’ experience?
Prizes
- A personal one-year online subscription to Library Technology Reports, a $249 value! You’ll get online access to 8 brand-new issues of Library Technology Reports, all written by leading practitioners. This prize has been generously donated by ALA Tech Source.
- Lunch with President’s Program speaker Lou Rosenfeld and LITA President Rachel Vacek on June 28th, prior to the event
- Recognition at the LITA President’s Program at ALA Annual on Sunday, June 28, 2015
- Recognition on the LITA Blog
- Publication of the winning entry in WEAVE, the Journal of Library User Experience
Deadline: May 31, 2015, 12:00 AM CDT
Participate in LITA’s first ever UX Twitter Chat!
When will this happen?
- Friday, May 15th, 2-3 p.m. EDT with moderators Amanda (@godaisies) and Haley (@hayleym1218)
- Friday, May 29th 2-3 p.m. EDT with moderator Bohyun (@bohyunkim)
- Friday, June 12th 2-3 p.m. EDT with moderator Whitni (@_whitni)
- Friday, June 19th 2-3 p.m. EDT with moderator Michael (@schoeyfield)
What is the UX Twitter Chat?
Share your challenges, successes, techniques, and workflows that you have developed to improve user experience (UX) at your library. Use #litaux to participate. You can also ask questions that may be selected to be answered by acclaimed UX expert Lou Rosenfeld during the LITA President’s Program at Annual; submit your questions ahead of time to the moderator. The moderators will ask questions in the Q1, Q2, and Q3 format, and followers will use #litaux and answer back in the A1, A2, and A3 format, as you may have seen in #libchat or #inaljchat.
When participating, be constructive. Be respectful. No attacking others. And use the hashtag #litaux.
Who’s behind this?
This chat is inspired by the 2015 LITA President’s Program at ALA Annual and put together by LITA President Rachel Vacek (@vacekrae) and her planning team (Amanda Goodman, Whitni Watkins, Isabel Gonzalez-Smith, Haley Moreno, Bohyun Kim, Rachael Cohen, and Lauren McKeen). To make this even more awesome, they are partnering with the LITA UX Interest Group and Weave, the Journal of Library User Experience.
We hope you’ll join us!
While the United States was in the midst of the Civil War, the country was also making one of its greatest breakthroughs in transportation—the Transcontinental Railroad. From the railroad’s war-weary beginnings to the last Golden Spike at Promontory Summit in Utah on May 10, 1869, the railroad’s development forever changed American travel and communication. It also had far-reaching and irrevocable impacts on the lives of Native Americans and Chinese immigrant laborers, who bore the brunt of the treacherous tunneling and track-laying across the country. Our newest exhibition “Building the Transcontinental Railroad” explores the railroad’s construction and its impact on American culture and westward expansion.
This exhibition was created as part of the DPLA’s Digital Curation Program by the following students in Professor Krystyna Matusiak’s course “Digital Libraries” in the Library and Information Science program at the University of Denver: Jenifer Fisher, Benjamin Hall, Nick Iwanicki, Cheyenne Jansdatter, Sarah McDonnell, Timothy Morris and Allan Van Hoye.
View the exhibition
Featured image credit: “Does not such a meeting make amends?” an 1869 print showing an allegorical linking of the Transcontinental Railroad at Promontory Summit, Utah. Courtesy of the Library of Congress.
AWS is very profitable: $265 million in profit on $1.57 billion in sales last quarter alone, for an impressive (for Amazon!) 17% net margin.
Thompson’s post starts by supposing that Amazon spun out AWS via an IPO:
One of the technology industry’s biggest and most important IPOs occurred late last month, with a valuation of $25.6 billion. That’s more than Google, which IPO’d at a valuation of $24.6 billion, and certainly a lot more than Amazon, which finished its first day on the public markets with a valuation of $438 million.
It concludes:
The profitability of AWS is a big deal in-and-of itself, particularly given the sentiment that cloud computing will ultimately be a commodity won by the companies with the deepest pockets. It turns out that all the reasons to believe in AWS were spot on: Amazon is clearly reaping the benefits of scale from being the largest player, and their determination to have both the most complete and cheapest offering echoes their prior strategies in e-commerce.
Thompson’s post is a must-read; I’ve only given a small taste of it. But it clearly demonstrates that AWS overall is very profitable, to say nothing of S3, its storage service, which I’ve been blogging about for more than three years.
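As a quick sanity check on the figures quoted above (the dollar amounts come from the post; the one-liner itself is just illustrative arithmetic):

```shell
# Quarterly AWS figures quoted above, in millions of USD
profit=265
revenue=1570
# Net margin = profit / revenue, expressed as a percentage
awk -v p="$profit" -v r="$revenue" \
  'BEGIN { printf "net margin: %.1f%%\n", 100 * p / r }'
```

This prints a margin of about 16.9%, which rounds to the 17% cited in the post.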
Peter Murray: Thursday Threads: Library RFP Registry, Transformed Libraries talk at IMLSfocus, DIY VPN
Welcome spring in the northern hemisphere! Thoughts turn to fresh new growth — a new tool to help with writing documents for procuring library systems, a fresh way to think about how libraries can transform and be transformed, and spring cleaning for your browsing habits with a do-it-yourself VPN.
Feel free to send this to others you think might be interested in the topics. If you find these threads interesting and useful, you might want to add the Thursday Threads RSS Feed to your feed reader or subscribe to e-mail delivery using the form to the right. If you would like a more raw and immediate version of these types of stories, watch my Pinboard bookmarks (or subscribe to its feed in your feed reader). Items posted to Pinboard are also sent out as tweets; you can follow me on Twitter. Comments and tips, as always, are welcome.
Library Technology Guides Procurement Registry
The Library Technology Guides Procurement Registry provides a free and convenient service to allow libraries to post procurement opportunities for technology products. The registry focuses on library-specific technology products including integrated library systems, library services platforms, discovery products, and RFID technologies.
By aggregating information and documents related to current and historic procurement projects, the repository helps libraries and vendors gain insight into the requirements that libraries expect to be fulfilled by these products.
Documents and data from past procurement projects have been collected from the Web sites of libraries and other publicly available sources. Requests for Proposals available in this Registry should be assumed to be copyrighted by the institutions that issued them or the consultants that developed them, and any content in those documents should be incorporated into other procurement documents only with the explicit permission of the copyright holders or in accordance with any associated Creative Commons licenses. These documents provide useful background and perspective for libraries developing their own requirements for new technology procurement projects and for vendors to inform their development priorities.– Library Technology Guides Procurement Registry
Marshall Breeding is building a useful resource for anyone who needs to purchase a library technology system of some sort through an RFP process. As the description above says, this registry will offer valuable insights from others who have gone through the same process. I would encourage users not to get “tunnel vision,” though, and simply adopt one of these documents as their own. The process of creating an RFP is an important exploration of what the institution needs and how it will score responders. Don’t use the registry as a shortcut but as a source of inspiration for structuring your own document and as a check to make sure you’ve thought of all of the important components.
[Andromeda Yelton’s] #IMLSFocus remarks [on libraries transforming and being transformed]
I was going to talk about why ongoing tech training is hard, the nuts and bolts of pedagogy, and what you can do to help. Maybe I still will in Q&A. But right now, 40 miles north of us, Baltimore is burning. Or it isn’t: it is ten thousand people protesting peacefully against many years of secret violence, violence kept secret by habitual gag orders, with national media drawn like moths to the mere handful of flames. The stories I hear on Twitter are not the same as the stories on CNN. And we, as cultural heritage institutions, are about our communities and their stories, and about which stories are told, which are made canon, and how and why.
So I want to talk about how technology training and digital platforms can either support, or threaten, our communities and their ability to tell their stories, and to have their stories reflected in the canonical story that we build when we build a national platform. I want to make it explicit what we are doing in this room, today, is about deciding whose stories get told, by whom, and how. Whose are widely recognized as valid, and whose are samizdat, whose get to reach our corridors of power only through protest and fire.– My #IMLSFocus remarks, by Andromeda Yelton, 1-May-2015
Last week at the first of three IMLS Focus events, Andromeda Yelton spoke passionately about the need for openness in the systems that make up our libraries: openness of application interfaces is needed to transform the users of a library as much as it is needed to transform the library itself. To listen to or watch her remarks, go to the panel recording and skip ahead about 2 minutes. (I had to look up samizdat in Wikipedia to get the context.)
How to setup your own private, secure, free* VPN on the Amazon AWS Cloud in 10 minutes
So, we all know the benefits of using a VPN like privacy, anonymity, unblocking websites, security, overcoming geographical restrictions and so on. However, it has always been hard to trust a VPN provider who could potentially log and intercept your internet traffic! Launching a private VPN server will give us the best of what a VPN truly offers. This guide will walk you through all the steps to running your own VPN server in about 10 minutes.– How to setup your own private, secure, free* VPN on the Amazon AWS Cloud in 10 minutes, Webdigi, 17-Mar-2015
This is pretty straightforward. Amazon offers a free year of a micro-sized compute instance to new Amazon Web Services customers (hence the asterisk in the title). The blog post includes instructions on how to set it up and how to connect your Android device, Mac, or Asus router to the new VPN. The same sort of instructions would work for iOS, Windows, and Linux systems; the software running on the AWS server supports the standard PPTP and L2TP/IPsec VPN protocols.
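For the command-line inclined, the general shape of the setup can be sketched with the AWS CLI and a stock Linux L2TP/IPsec stack. This is a hypothetical sketch, not the Webdigi recipe: the AMI ID and key name are placeholders, and the package names assume an Ubuntu image.

```shell
# Launch a free-tier micro instance
# (ami-XXXXXXXX and my-vpn-key are placeholders; substitute your own)
aws ec2 run-instances \
  --image-id ami-XXXXXXXX \
  --instance-type t2.micro \
  --key-name my-vpn-key

# Then, on the new instance, install an L2TP/IPsec stack
sudo apt-get update
sudo apt-get install -y strongswan xl2tpd

# What remains is configuring the pre-shared key, VPN user accounts,
# and NAT/forwarding rules (in /etc/ipsec.conf, /etc/ipsec.secrets,
# and /etc/xl2tpd/xl2tpd.conf) before pointing your devices at the
# instance's public IP.
```

The appeal of the Webdigi approach is that it automates these provisioning steps so that no manual server configuration is needed.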