Feed aggregator

District Dispatch: The Goodlatte, the bad and the ugly…

planet code4lib - Thu, 2014-09-18 20:55

My Washington Office colleague Carrie Russell, ALA’s copyright ace in the Office of Information Technology Policy, provides a great rundown here in DD on the substantive ins and outs of the House IP Subcommittee’s hearing yesterday. The Subcommittee met to take testimony on the part of the 1998 Digital Millennium Copyright Act (Section 1201, for those of you keeping score at home) that prohibits anyone from “circumventing” any kind of “digital locks” (aka, “technological protection measures,” or “TPMs”) used by their owners to protect copyrighted works. The hearing was also interesting, however, for the politics of the emerging 1201 debate on clear display.

First, the good news.  Rep. Bob Goodlatte (VA), Chairman of the full House Judiciary Committee, made time in a no doubt very crowded day to attend the hearing specifically for the purpose of making a statement in which he acknowledged that targeted reform of Section 1201 was needed and appropriate.  As one of the original authors of 1201 and the DMCA, and the guy with the big gavel, Mr. Goodlatte’s frank and informed talk was great to hear.

Likewise, Congressman Darrell Issa of California (who’s poised to assume the Chairmanship of the IP Subcommittee in the next Congress and eventually to succeed Mr. Goodlatte at the full Committee’s helm) agreed that Section 1201 might well need modification to prevent it from impeding technological innovation — a cause he’s championed over his years in Congress as a technology patent-holder himself.

Lastly, Rep. Blake Farenthold added his voice to the reform chorus.  While a relatively junior Member of Congress, Rep. Farenthold clearly “gets” the need to assure that 1201 doesn’t preclude fair use or valuable research that requires digital locks to be broken precisely to see if they create vulnerabilities in computer apps and networks that can be exploited by real “bad guys,” like malware- and virus-pushing lawbreakers.

Of course, any number of other members of the Subcommittee were singing loudly in the key of “M” for yet more copyright protection.  Led by the most senior Democrat on the full Judiciary Committee, Rep. John Conyers (MI), multiple members appeared (as Carrie described yesterday) to believe that “strengthening” Section 1201 in unspecified ways would somehow thwart … wait for it … piracy, as if another statute and another penalty would do anything to affect the behavior of industrial-scale copyright infringers in China who don’t think twice now about breaking existing US law.  Sigh….

No legislation is yet pending to change Section 1201 or other parts of the DMCA, but ALA and its many coalition partners in the public and private sectors will be in the vanguard of the fight to reform this outdated and ill-advised part of the law (including the triennial process by which exceptions to Section 1201 are granted, or not) next year.  See you there!

The post The Goodlatte, the bad and the ugly… appeared first on District Dispatch.

SearchHub: Say Hello to Lucidworks Fusion

planet code4lib - Thu, 2014-09-18 20:43

The team at Lucidworks is proud to announce the release of our next-generation platform for building powerful, scalable search applications: Lucidworks Fusion.

Fusion extends any Solr deployment with the enterprise-grade capabilities you need to deliver a world-class search experience:

Full support for any Solr deployment including Lucidworks Search, SolrCloud, and stand-alone mode.

Deeper support for recommendations including Item-to-Query, Query-to-Item, and Item-to-Item with aggregated signals.

Advanced signal processing including any datapoint (click-through, purchases, ratings) – even social signals like Twitter.

Enhanced application development with REST APIs, index-side and query-time pipelines, and sophisticated connector frameworks.

Advanced web and filesystem crawlers with multi-threaded HTML/document connectors, de-duping, and incremental crawling.

Integrated security management for roles and users supporting HTTPS, form-based, Kerberos, LDAP, and native methods.

Search, log, and trend analytics for any log type, covering both real-time and historical data, with SiLK.

Ready to learn more? Join us for our upcoming webinar:

Webinar: Meet Lucidworks Fusion

Join Lucidworks CTO Grant Ingersoll for a ‘first look’ at our latest release, Lucidworks Fusion. You’ll be among the first to see the power of the Fusion platform and how it gives you everything you need to design, build, and deploy amazing search apps.

Webinar: Meet Lucidworks Fusion
Date: Thursday, October 2, 2014
Time: 11:00 am Pacific Daylight Time (San Francisco, GMT-07:00)

Click here to register for this webinar.

Or learn more at http://lucidworks.com/product/fusion/

John Miedema: Wilson iteration plans: Topics on text mining the novel.

planet code4lib - Thu, 2014-09-18 20:27

The Wilson iteration of my cognitive system will involve a deep dive into topics on text mining the novel. My overly ambitious plans are the following, roughly in order:

  • Develop a working code illustration of genre detection.
  • Develop another custom entity recognition model for literature, using an annotated corpus.
  • Visualization of literary concepts using time trends.
  • Collection of open data, open access articles, and open source tools for text analysis of literature.
  • Think about a better teaching tool for building models. Distinguish teaching computers from programming.

We’ll see where it goes.

DPLA: Nearly 100,000 items from the Getty Research Institute now available in DPLA

planet code4lib - Thu, 2014-09-18 20:03

More awesome news from DPLA! Hot on the heels of announcements earlier this week about newly added materials from the Medical Heritage Library and the Government Printing Office, we’re excited to share today that nearly 100,000 items from the Getty Research Institute are now available via DPLA.

To view the Getty in DPLA, click here.

From an announcement posted today on the Getty Research Institute Blog:

As a DPLA content hub, the Getty Research Institute has contributed metadata—information that enables search and retrieval of material—for nearly 100,000 digital images, documentary photograph collections, archives, and books dating from the 1400s to today. We’ve included some of the most frequently requested and significant material from our holdings of more than two million items, including some 5,600 images from the Julius Shulman photography archive, 2,100 images from the Jacobson collection of Orientalist photography, and dozens of art dealers’ stockbooks from the Duveen and Knoedler archives.

The Getty will make additional digital content available through DPLA as their collections continue to be cataloged and digitized.

All written content on this blog is made available under a Creative Commons Attribution 4.0 International License. All images found on this blog are available under the specific license(s) attributed to them, unless otherwise noted.

Alf Eaton, Alf: Archiving and displaying tweets with dat

planet code4lib - Thu, 2014-09-18 19:47

First, make a new directory for the project:

mkdir tweet-stream && cd $_

Install node.js (nodejs in Debian/Ubuntu, node in Homebrew), update npm if needed (npm install -g npm) and install dat:

npm install -g maxogden/dat

dat is essentially git for data, so the data repository needs to be initialised before it can be used:

dat init

Next, start the dat server to listen for incoming connections:

dat listen

Data can be piped into dat as line-delimited JSON (i.e. one object per line - the same idea as CSV but with optional nested data). Happily, this is the format in which Twitter’s streaming API provides information, so it's ideal for piping into dat.
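For illustration only, line-delimited JSON looks something like this (hypothetical, heavily truncated objects rather than Twitter’s full payload):

{"id_str": "1001", "text": "first example tweet", "user": {"screen_name": "alice"}}
{"id_str": "1002", "text": "second example tweet", "user": {"screen_name": "bob"}}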

I used a PHP client to connect to Twitter’s streaming API as I was interested in seeing how it handled the connection (the client needs to watch the connection and reconnect if no data is received in a certain time frame). There may be a command-line client that is even easier than this, but I haven’t found one yet…

Install Phirehose using Composer:

composer init && composer require fennb/phirehose:dev-master && composer install

The streaming API uses OAuth 1.0 for authentication, so you have to register a Twitter application to get an OAuth consumer key and secret, then generate another access token and secret for your account. Add these to this small PHP script that initialises Phirehose, starts listening for filtered tweets and outputs each tweet to STDOUT as it arrives:

Run the script to connect to the streaming API and start importing data:

php stream.php | dat import -json

The dat server that was started earlier with dat listen is listening on port 6461 for clients, and is able to emit each incoming tweet as a Server-Sent Event, which can then be consumed in JavaScript using the EventSource API.

I’m in the process of making a twitter-stream Polymer element, but in the meantime this is how to connect to dat’s SSE endpoint:

var server = new EventSource('http://your-dat-ip-address:6461/api/changes?data=true&style=sse&live=true&limit=1&tail=1');
server.addEventListener('data', function (event) {
  var item = JSON.parse(event.data).value;
  // do something with the tweet
});

Patrick Hochstenbach: Hard Reset

planet code4lib - Thu, 2014-09-18 18:52
Joining Hard Reset, a playground for illustrators to draw cartoons about a post-apocalyptic world. These are doodles I can draw during my 20-minute commute from Brugge to Ghent. Filed under: Comics Tagged: art, cartoon, comic, comics, commute, copic, doodle,

Jonathan Rochkind: Umlaut 4.0 beta

planet code4lib - Thu, 2014-09-18 18:39

Umlaut is an open source specific-item discovery layer, often used on top of SFX, and based on Rails.

Umlaut 4.0.0.beta2 is out! (Yeah, don’t ask about beta1 :) ).

This release is mostly back-end upgrades, including:

  • Support for Rails 4.x (Rails 3.2 is still supported to make migration easier for existing installations, but we recommend starting with Rails 4.1 in new apps)
  • Based on Bootstrap 3 (Umlaut 3.x used Bootstrap 2)
  • Internationalization/localization support
  • A more streamlined installation process with a custom installer

Anyone interested in beta testing? Probably most interesting if you have an SFX to point it at, but you can take it for a spin either way.

To install a new Umlaut app, see: https://github.com/team-umlaut/umlaut/wiki/Installation


Filed under: General

Andromeda Yelton: jQuery workshop teaching techniques, part 3: ruthless backward design

planet code4lib - Thu, 2014-09-18 17:04

I’m writing up what I learned from teaching a jQuery workshop this past month. I’ve already posted on my theoretical basis, pacing, and supporting affective goals. Now for the technique I invested the most time in and got the most mileage out of…

Ruthless backward design

Yes, yes, we all know we are supposed to do backward design, and I always have a general sense of it in my head when I design courses. In practice it’s hard, because you can’t always carve out the time to write an entire course in advance of teaching it, but for a two-day bootcamp I was doing that anyway.

Yeah. Super ruthless. I wrote the last lesson, on functions, first. Along the way I took notes of every concept and every function that I relied on in constructing my examples. Then I wrote the second-to-last lesson, using what I could from that list (while keeping the pacing consistent), and taking notes on anything else I needed to have already introduced – again, right down to the granularity of individual jQuery functions. Et cetera. My goal was that, by the time they got to writing their own functions (with the significant leap in conceptual difficulty that entails), they would have already seen every line of code that they’d need to do the core exercises, so they could work on the syntax and concepts specific to functions in isolation from all the other syntax and concepts of the course. (Similarly, I wanted them to be able to write loops in isolation from the material in lessons 1 and 2, and if/then statements in isolation from the material in lesson 1.)

This made it a lot easier for me to see both where the big conceptual leaps were and what I didn’t need. I ended up axing .css() in favor of .addClass(), .removeClass(), and .hasClass() – more functions, but all conceptually simpler ones, and more in line with how I’ve written real-world code anyway. It meant that I axed booleans – which in writing out notes on course coverage I’d assumed I’d cover (such a basic data type, and so approachable for librarians!) – when I discovered I did not need their conceptual apparatus to make the subsequent code make sense. It made it clear that .indexOf() is a pain, and students would need to be familiar with its weirdness so it didn’t present any hurdles when they had to incorporate it into bigger programs.
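As a rough sketch of that tradeoff (my own illustration, not the workshop’s actual exercise code), the class-based methods read like this, and .indexOf()’s quirk is that it reports “not found” as -1 rather than as anything boolean-looking:

// Hypothetical example: toggle a highlight with classes instead of .css()
if ($('#result').hasClass('highlight')) {
  $('#result').removeClass('highlight');
} else {
  $('#result').addClass('highlight');
}

// .indexOf() returns a position, or -1 when the text is missing
var title = 'jQuery workshop';
if (title.indexOf('jQuery') !== -1) {
  // the string contains 'jQuery'
}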

Funny thing: being this ruthless and this granular meant I actually did get to the point where I could have done real-world-ish exercises with one more session. I ended up providing a few as further practice options for students who chose jQuery practice rather than the other unconference options for Tuesday afternoon. By eliminating absolutely everything unnecessary, right down to individual lines of code, I covered enough ground to get there. Huh!

So yeah. If I had a two-day workshop, I’d set out with that goal. A substantial fraction of the students would feel very shaky by then – it’s still a ton of material to assimilate, and about a third of my survey respondents’ brains were full by the time we got to functions – but including a real-world application would be a huge motivational payoff regardless. And group work plus an army of TAs would let most students get some mileage out of it. Add an option for people to review earlier material in the last half-day, and everyone’s making meaningful progress. Woot!

Also, big thanks to Sumana Harihareswara for giving me detailed feedback on a draft of the lesson, and helping me see the things I didn’t have the perspective to see about sequencing, clarity, etc. You should all be lucky enough to have a proofreader so enthusiastic and detail-oriented.

Later: where I want to go next.

Open Knowledge Foundation: Announcing a Leadership Update at Open Knowledge

planet code4lib - Thu, 2014-09-18 15:05

Today I would like to share some important organisational news. After 3 years with Open Knowledge, Laura James, our CEO, has decided to move on to new challenges. As a result of this change we will be seeking to recruit a new senior executive to lead Open Knowledge as it continues to evolve and grow.

As many of you know, Laura James joined us to support the organisation as we scaled up, and stepped up to the CEO role in 2013. It has always been her intention to return to her roots in engineering at an appropriate juncture, and we have been fortunate to have had Laura with us for so long – she will be sorely missed.

Laura has made an immense contribution and we have been privileged to have her on board – I’d like to extend my deep personal thanks to her for all she has done. Laura has played a central role in our evolution as we’ve grown from a team of half-a-dozen to more than forty. Thanks to her commitment and skill we’ve navigated many of the tough challenges that accompany “growing-up” as an organisation.

There will be no change in my role (as President and founder) and I will be here both to continue to help lead the organisation and to work closely with the new appointee going forward. Laura will remain in post, continuing to manage and lead the organisation, assisting with the recruitment and bringing the new senior executive on board.

For a decade, Open Knowledge has been a leader in its field, working at the forefront of efforts to open up information around the world and see it used to empower citizens and organisations to drive change. Both the community and original non-profit have grown – and continue to grow – very rapidly, and the space in which we work continues to develop at an incredible pace with many exciting new opportunities and activities.

We have a fantastic future ahead of us and I’m very excited as we prepare Open Knowledge to make its next decade even more successful than its first.

We will keep everyone informed in the coming weeks as our plans develop, and there will also be opportunities for the Open Knowledge community to discuss. In the meantime, please don’t hesitate to get in touch with me if you have any questions.

District Dispatch: Free webinar: Helping patrons set financial goals

planet code4lib - Thu, 2014-09-18 14:51

On September 23rd, the Consumer Financial Protection Bureau and the Institute for Museum and Library Services will offer a free webinar on financial literacy. This session has limited space so please register quickly.

Sometimes, if you’re offering programs on money topics, library patrons may come to you with questions about setting money goals. To assist librarians, the Consumer Financial Protection Bureau and the Institute of Museum and Library Services are developing financial education tools and sharing best practices with the public library field.

The two agencies created the partnership to help libraries provide free, unbiased financial information and referrals in their communities, build local partnerships and promote libraries as community resources. As part of the partnership, both agencies gathered information about libraries and financial education. Their surveys focused on attitudes about financial education, and how librarians can facilitate more financial education programs.

Join both groups on Tuesday, September 23, 2014, from 2:30–3:30 p.m. Eastern Time for the free webinar “Setting money goals,” which will explore the basics of money management. The webinar will teach participants how to show patrons how to create effective money goals.

Webinar Details

September 23, 2014
2:30–3:30 p.m. Eastern
Join the webinar (No need to RSVP)

  • Conference number: PW8729932
  • Audience passcode: LIBRARY

If you are participating only by phone, please dial the following number:

  • Phone: 1-888-947-8930
  • Participant passcode: LIBRARY

The post Free webinar: Helping patrons set financial goals appeared first on District Dispatch.

OCLC Dev Network: Reminder: Developer House Nominations Close on Monday

planet code4lib - Thu, 2014-09-18 14:45

If you’ve been thinking about nominating someone – including yourself – for Developer House this December, there’s no time like the present to submit that nomination form.

Open Knowledge Foundation: Launching a new collaboration in Macedonia with Metamorphosis and the UK Foreign & Commonwealth Office

planet code4lib - Thu, 2014-09-18 14:07

As part of the The Open Data Civil Society Network Project, School of Data Fellow, Dona Djambaska, who works with the local independent nonprofit, Metamorphosis, explains the value of the programme and what we hope to achieve over the next 24 months.

“The concept of Open Data is still very fresh among Macedonians. Citizens, CSOs and activists are just beginning to realise the meaning and power hidden in data. They are beginning to sense that there is some potential for them to use open data to support their causes, but in many cases they still don’t understand the value of open data, how to advocate for it, how to find it and most importantly – how to use it!

Metamorphosis was really pleased to get this incredible opportunity to work with the UK Foreign Office and our colleagues at Open Knowledge, to help support the open data movement in Macedonia. We know that an active open data ecosystem in Macedonia, and throughout the Balkan region, will support Metamorphosis’s core objectives of improving democracy and increasing quality of life for our citizens.

It’s great to help all these wonderful minds join together and co-build a community where everyone gets to teach and share. This collaboration with Open Knowledge and the UK Foreign Office is a really amazing stepping-stone for us.

We are starting the programme with meet-ups and then moving to more intense (online and offline) communications and awareness raising events. We hope our tailored workshops will increase the skills of local CSOs, journalists, students, activists or curious citizens to use open data in their work – whether they are trying to expose corruption or find new efficiencies in the delivery of government services.

We can already see the community being built, and the network spreading among Macedonian CSOs and hope that this first project will be part of a more regional strategy to support democratic processes across the Balkan region.”

Read our full report on the project: Improving governance and higher quality delivery of government services in Macedonia through open data

Dona Djambaska, Macedonia.

Dona graduated in the field of Environmental Engineering and has been working with the Metamorphosis foundation in Skopje for the past six years assisting on projects in the field of information society.

There she has focused on organising trainings for computer skills, social media, online promotion, photo and video activism. Dona is also an active contributor and member of the Global Voices Online community. She dedicates her spare time to artistic and activism photography.

Ed Summers: Satellite of Art

planet code4lib - Thu, 2014-09-18 13:26

… still there

FOSS4Lib Recent Releases: BitCurator - 0.9.20

planet code4lib - Thu, 2014-09-18 12:36

Last updated September 18, 2014. Created by Peter Murray on September 18, 2014.

Package: BitCurator
Release Date: Friday, September 5, 2014

Peter Murray: Thursday Threads: Patron Privacy on Library Sites, Communicating with Developers, Kuali Continued

planet code4lib - Thu, 2014-09-18 10:58

In the DLTJ Thursday Threads this week: an analysis of how external services included on library web pages can impact patron privacy, pointers to a series of helpful posts from OCLC on communication between software users and software developers, and lastly an update on the continuing discussion of the Kuali Foundation Board’s announcement forming a commercial entity.

Before we get started on this week’s threads, I want to point out a free online symposium that LYRASIS is hosting next week on sustainable cultural heritage open source software. Details are on the FOSS4Lib site; you can register on the LYRASIS events site and then join the open discussion on the discuss.foss4lib.org site before, during and after the symposium.

Feel free to send this to others you think might be interested in the topics. If you find these threads interesting and useful, you might want to add the Thursday Threads RSS Feed to your feed reader or subscribe to e-mail delivery using the form to the right. If you would like a more raw and immediate version of these types of stories, watch my Pinboard bookmarks (or subscribe to its feed in your feed reader). Items posted to my Pinboard bookmarks are also sent out as tweets; you can follow me on Twitter. Comments and tips, as always, are welcome.

Analysis of Privacy Leakage on a Library Catalog Webpage

My post last month about privacy on library websites, and the surrounding discussion on the Code4Lib list prompted me to do a focused investigation, which I presented at last week’s Code4Lib-NYC meeting.
I looked at a single web page from the NYPL online catalog. I used Chrome developer tools to trace all the requests my browser made in the process of building that page. The catalog page in question is for The Communist Manifesto. It’s here: http://nypl.bibliocommons.com/item/show/18235020052907_communist_manifesto. …

So here are the results.

- Analysis of Privacy Leakage on a Library Catalog Webpage, by Eric Hellman, Go To Hellman, 16-Sep-2014

Eric goes on to note that he isn’t criticizing the New York Public Library; rather, he chose it because it is a prominent system run by people who are careful about privacy concerns — and also because NYPL was the host of the Code4Lib-NYC meeting. His analysis of what goes on behind the scenes of a web page is illuminating, though, and shows how all the careful work to protect patrons’ privacy while they browse the library’s catalog can be brought down by the inclusion of one simple JavaScript widget.
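As a generic sketch of the mechanism (not NYPL’s actual markup or any specific vendor’s widget), a single third-party script included on a catalog page is enough to reveal which title a patron is viewing:

// Hypothetical widget.js served from a third party: once the catalog page
// loads it, the script can read the full page URL (which identifies the
// title being viewed) and report it back to the vendor's server.
(function () {
  var beacon = new Image();
  beacon.src = 'https://widget-vendor.example/collect?page=' +
    encodeURIComponent(window.location.href);
})();

Even without code like this, the browser’s request for the script itself typically carries the catalog page’s URL in the Referer header, so the third party learns what the patron was looking at either way.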

Series of Posts on Software Development Practices from OCLC

This is the first post in a series on software development practices. We’re launching the series with a couple of posts aimed at helping those who might not have a technical background communicate their feature requests to developers.

- Software Development Practices: What’s the Problem?, by Shelly Hostetler, OCLC Developer Network, 22-Aug-2014

OCLC has started an excellent set of posts on how to improve communication between software users and software developers. The first three have been posted so far with another one expected today:

  1. Software Development Practices: What’s the Problem?
  2. Software Development Practices: Telling Your User’s Story
  3. Software Development Practices: Getting Specific with Acceptance Criteria

I’ve bookmarked them and will be referring to them when talking with our own members about software development needs.

Kuali 2.0 Discussion Continues

…I thought of my beehives and how the overall bee community supports that community/ hive. The community needs to be protected, prioritized, supported and nourished any way possible. Each entity, the queen, the workers and the drones all know their jobs, which revolve around protecting supporting and nourishing the community.

Even if something disrupts the community, everyone knows their role and they get back to work in spite of the disruption. The real problem within the Kuali Community, with the establishment of the Kuali Commercial Entity now is that various articles, social media outlets, and even the communication from the senior Kuali leadership to the community members, have created a situation in which many do not have a good feel for their role in protecting, prioritizing, supporting and nourishing the community.

- The Evolving Kuali Narrative, by Kent Brooks, “I was just thinking”, 14-Sep-2014

The Kuali Foundation Board has set a direction for our second decade and at this time there are many unknowns as we work through priorities and options with each of the Kuali Project Boards. Kuali is a large and complex community of many institutions, firms, and individuals. We are working with projects now and hope to have some initial roadmaps very soon.

- Updates – Moving at the Speed of Light, by Jennifer Foutty, Kuali 2.0 Blog, 17-Sep-2014

As the library community that built a true next-generation library management system, the future of OLE’s development and long-term success is in our hands. We intend to continue to provide free and open access to our community designed and built software. The OLE board is strongly committed to providing a community driven option for library management workflow.

- Open Library Environment (OLE) & Kuali Foundation Announcement, by Bruce M. Taggart (Board Chair, Open Library Environment (OLE)), 9-Sep-2014

Building on previous updates here, the story of the commercialization of the Kuali collaborative continues. I missed the post from Bruce Taggart in last week’s update, and for the main DLTJ Thursday Threads audience this status update from the Open Library Environment project should be most interesting. Given the lack of information, it is hard not to parse each word of formal statements for underlying meanings. In the case of Dr. Taggart’s post about OLE, I’m leaning heavily on wondering what “community designed and built software” means. The Kuali 2.0 FAQ still says “the current plan is for the Kuali codebase to be forked and relicensed under the Affero General Public License (AGPL).” As Charles Severance points out, the Affero license can be a path to vendor lock-in. So is there to be a “community” version that has a life of its own under the Educational Community License while the KualiCo develops features only available under the Affero license? It is entirely possible that too much can be read into too few words, so I (for one) continue to ponder these questions and watch for the plan to evolve.




District Dispatch: Celebrating Constitution Day the advocacy way

planet code4lib - Wed, 2014-09-17 23:08

 “Dear Congressional Leaders –

We write to urge you to bring to the floor S. 607 and H.R. 1852, the bipartisan Leahy-Lee and Yoder-Polis bills updating the Electronic Communications Privacy Act (ECPA). Updating ECPA would respond to the deeply held concerns of Americans about their privacy. S. 607 and H.R. 1852 would make it clear that the warrant standard of the U.S. Constitution applies to private digital information just as it applies to physical property….”

… So said ALA today and more than 70 other civil liberties organizations, major technology companies and trade associations — including the U.S. Chamber of Commerce — in a strong joint letter to the leaders of the House and Senate calling for the soonest possible vote on bills pending in each chamber (S. 607 and H.R. 1852) to update the woefully outdated and inadequate Electronic Communications Privacy Act.  To reach every Member of Congress and their staffs, the letter also was published as a full page advertisement in Roll Call, a principal Capitol Hill newspaper widely read inside the Beltway and well beyond.

When last discussed in DD in mid-June, H.R. 1852 (the Email Privacy Act) had been cosponsored by a majority of all Members of the House.  Today, 265 members have signed on but the bill still awaits action in Committee.  With literally two work days remaining before the House and Senate recess for the November election, ALA and scores of its coalition partners wanted to remind all Members that these bills deserve a vote immediately after Congress returns in November.

Add your voice to that call too as Election 2014 heats up where you live!  Attend a “Town Hall” meeting, call in to a talk radio show featuring a campaigning Congressperson, or simply call their local office and demand that Congress protect your emails, photos, texts, tweets and anything else stored in the “cloud” by voting on and passing S. 607 and H.R. 1852.  Politics doesn’t get any more local and personal than the privacy of your electronic communications, which authorities don’t now need a warrant to pore over if they’re more than six months old.

Tell your Congressional Representative and Senators to update ECPA by passing S. 607 and H.R. 1852 as soon as they get back to Washington.

The post Celebrating Constitution Day the advocacy way appeared first on District Dispatch.

District Dispatch: Getting on the same page

planet code4lib - Wed, 2014-09-17 21:59

It can be difficult to respond to a question asked by a Member of Congress at a hearing when that person is talking about a different subject than you are and doesn’t know it. One observes a lot of talking past one another and frustration. One wants to stand up and say “wait a minute, you guys are talking about two different things,” but this kind of outburst is not appropriate at a Congressional hearing.

That happened today at the hearing called by U.S. House Judiciary Subcommittee on the Courts, Intellectual Property and the Internet. The topic was Chapter 12 of the copyright law and in particular, an administrative process conducted every three years by the U.S. Copyright Office called the 1201 rulemaking. But some thought the topic was digital rights management, and things got a little tense near the end of the hearing. Watch it for yourself.

There is a connection, and for clarity’s sake, let’s explore. The 1201 rulemaking was included in the Digital Millennium Copyright Act (DMCA) as a “safety valve” to ensure that technological protection measures (also known as digital rights management!) employed by rights holders to protect content would not also prevent non-infringing uses of copyrighted works, like analyzing software for security vulnerabilities, for example. Ask anyone, and they will tell you that the rulemaking is brutal. It’s long, convoluted and borders on the ridiculous. During this process, the U.S. Copyright Office evaluates specific requests for exemption from Section 1201’s otherwise blanket prohibition on “circumvention,” e.g., breaking pass codes, encryption or other digital rights management schemes in order to make a non-infringing use of a particular class of copyrighted works. In order to make such an argument, however, one who wants an exemption to the anti-circumvention provision must already have broken the anti-circumvention provision in order to make a non-infringing use of the work because you cannot speculate that a non-infringing use is possible without demonstrating that it is so.


The process can last eight months and includes writing detailed comments for submission, a reply comment period, two days of roundtables sometimes held in two or three places in the United States, and finally time for the U.S. Copyright Office, in collaboration with the National Telecommunications and Information Administration (NTIA), to write a lengthy report recommending to the Librarian of Congress which classes of works with technological protection measures can be circumvented for the next three years. Whew!

The Library Copyright Alliance (LCA) submitted comments arguing that the process certainly can be improved. Key LCA recommendations included that exemptions be permanent instead of lasting only three years, and that the NTIA (which has a better understanding of technology and innovation) administer the 1201 rulemaking process instead of the U.S. Copyright Office.

The good news. A baby step may have been taken. All of the witnesses agreed that some exemptions should be permanent so people do not have to reargue their case every three years. In addition, the Copyright Office already has made a suggestion to improve the rulemaking process, writing recently in the Federal Register:

Unlike in previous rulemakings, the Office is not requesting the submission of complete legal and factual support for such proposals at the outset of the proceeding. Instead, in this first step of the process, parties seeking an exemption may submit a petition setting forth specified elements of the proposed exemption and review and consolidate the petitions naming the list of proposed exemptions for further consideration.

Stay tuned for more news on the Copyright Office’s so-called “triennial” 1201 rulemaking which gives new meaning to the adage that “god (or the devil, if you prefer) is in the details.”

The post Getting on the same page appeared first on District Dispatch.
