Friday, April 12, 2013

Computers in Libraries 2013 - Day 3 - Data : Digging Deeper & Displaying

This final presentation I attended at Computers in Libraries 2013 was one of the best.  Jeff Wisniewski of the University of Pittsburgh presented this talk on how to effectively use Google Analytics to get relevant results.  It tied in beautifully with the metrics session I attended on day one, and I wonder why they weren't in the same track.  Both emphasized the need to have statistics that mean something, and to get statistics that mean something you need to establish clear criteria of success.  Establishing clear criteria of success means you need to model the kinds of people you are trying to target, whether with your statistics, with your website, or with both.

In the case of Google Analytics, many people (myself included) grab some numbers that look good and chart them out for reporting purposes.  Most of the time there is some bureaucratic agency asking for those numbers, so you make sure you have them to keep it happy.  However, the numbers skimmed off of Google Analytics can be flawed to the point of uselessness.  Unique visitors can be skewed by individuals who use different devices or clear their cookies.  Average time on page can be skewed by one person who spends an age there because he got up to go to lunch or something.  Bounce rate may be important or it may mean nothing.  If you have a page redirecting people away from a website (like to a catalog or databases) you might really want a high bounce rate, since you're running the website to provide access to those services.

Instead of relying on these canned numbers, which are easy to obtain but likely meaningless, a better approach is to carefully determine what you want to look for and then figure out how you're going to get it.  Wisniewski used the acronym S.M.A.R.T. to enumerate the criteria your goals should meet: Specific, Measurable, Attainable, Relevant, and Timely.  To get these goals we need to talk to users, find out what broad types our users fall into (high school students going to the web page for research, moms going to the web page to find out storytime hours, active citizens who want to know how their money is being spent, etc.) and then figure out where we want those people to be going.

Google Analytics can be used to map out the pages you expect these categories of user to want to reach and the paths they are likely to follow on your website to get there.  These pages and paths are called "goals" and "funnels" respectively.  You can then ask Google Analytics to report how many people successfully navigated a path and attained a goal, and how much time they spent on a goal page (if the goal page is a destination that is actually worth spending time on).  Following these steps leads to a more comprehensive and useful analysis of your website that you can use to make it better, and you can figure out what kinds of people are using your site and in what volume.
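For example, if the main job of your site is to hand people off to the catalog, one concrete way to make that hand-off measurable (instead of just inflating the bounce rate) is to record the outbound click as an event and then define a goal on that event in the Google Analytics admin screens.  Here is a minimal sketch in TypeScript, assuming the analytics.js (Universal Analytics) snippet is on the page; the catalog hostname and the event names are made up for illustration.

    // Record clicks on outbound catalog links as Google Analytics events so a
    // goal can be defined on the event instead of counting the exit as a bounce.
    // Assumes the analytics.js snippet has already defined the global ga() queue.
    declare function ga(...args: unknown[]): void;

    document.addEventListener("click", (e) => {
      const target = e.target as HTMLElement | null;
      if (!target) return;

      const link = target.closest("a");
      if (!link) return;

      // Hypothetical rule: any link pointing at the catalog host is an outbound hand-off.
      if (link.hostname === "catalog.example.org") {
        ga("send", "event", "outbound", "catalog-click", link.href);
      }
    });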

Then you have useful numbers that mean something, and the state can have the number of hits you had in 2013 if it really wants to know.

Computers in Libraries 2013 - Day 3 - Staff Training : Experiments & Experiences

This three-part session covered quite a bit of interesting information about keeping staff educated about technology.

The first presentation was given by Leah White, of Northbrook Public Library, and Gwenyth Stupar, formerly of Northbrook Public Library but now of Barrington Area Library.  As Northbrook is where I live (although not where I work) I was quite interested in seeing what they were doing there in the matter of staff education.

Leah and Gwyn described the multiple steps in their staff education program, designed to make staff more comfortable dealing with technology and helping patrons with it.  They emphasized that the best way to help patrons who are struggling with technology is to first train the staff so they can then help the patrons.  It's necessary to get staff away from an attitude of "Sorry, I have no idea how that works" toward an attitude of "No, I don't have one of those, but let's figure out what we need to do to get it to work."

They found that it is important to invest in devices for their staff to use so staff could become comfortable with the devices in low-pressure situations on library time.  They also attached staff training to an existing routine of staff education, such as staff meetings.  They made it into an event that their staff could get excited about and created a detailed workbook that staff could use to go step-by-step through the process of using a device, even if they were intimidated and completely flummoxed by the devices.  They recommended creating a different workbook for each device and making the training of staff with the devices mandatory.

After piloting the program, they then rolled it out at Northbrook, eventually setting up a table at the train station at regular intervals so they could demonstrate downloadable ebooks to residents who might not come into the library.

Pamela Carson of Concordia University Libraries followed Leah and Gwyn with a presentation on lifelong learning and technology.  This was a more general talk about lifelong learning, what benefits it has, how to foster it, and what kinds of people are more likely to gravitate to informal learning.  As someone who has been learning many things informally for a long time I didn't find a great deal of new information in this presentation, but it did provide a nice overview of the subject and helped me consider that many people need to be encouraged to pursue this kind of informal education.

The final presentation of the session was by Michael Sauers of the Nebraska Library Commission.  Michael briefly described the history the state of Nebraska has had with the 23 things project that was extremely popular a couple of years ago.  In 23 things, library staff at the many different libraries that implemented the project explored 23 different kinds of tasks, all related to Web 2.0 concepts.  For instance, staff had to use Facebook and Twitter, experiment with YouTube, and complete many different kinds of social networking tasks, writing a blog post about each experience.  The goal was to keep staff up-to-date with technologies that patrons were using so they would be better equipped to assist them.

After their initial 23 things project, the Nebraska Library Commission decided to keep going.  For some time now they've been adding a new thing every month and encouraging librarians in the state of Nebraska to complete each newly introduced thing during the month in which it was introduced.  If the requisite blog post is submitted within the month, staff get a credit towards keeping their librarian certification current.  The Nebraska Library Commission has also started adding books to read to the list of possible things, awarding credits for every 100 pages in the book (although the book has to be completed for any credits to be gained).

Keeping staff current and pushing them forward was the whole goal of 23 things in the first place, so it is wonderful that Nebraska keeps doing this.  Many of the things in 23 things have become less important than they were at the time, while other sites, such as Pinterest, didn't exist when the 23 things project was launched.

Their Things and BookThings can be found on the Nebraska Learns site (featuring a charming photo of Carhenge).

Computers in Libraries 2013 - Day 3 - Inbound Marketing : Leading Edge Tools

After lunch I went to this rather dense, although well-presented, talk about the concept of inbound marketing, given by John Heinrichs of Wayne State University.  Before the days of the Internet I think there was no such thing as "inbound marketing," or if it did exist it was difficult to accomplish.  Instead, the primary means of marketing were what would now be called "outbound marketing."  Outbound marketing is where some kind of interruption attempts to draw your attention to a product.  Television advertising, posters, and telemarketing are all outbound marketing; you are doing something (watching a program, walking, reading at home) and then someone stops you from doing that thing and presents you with something in the hopes of catching your interest.

Inbound marketing focuses instead on fitting into the flow of someone already headed in your general direction.  Rather than relying on interruption, it focuses on getting permission and earning the interest of the person being marketed to.  Social media, white papers, and search engine optimization are all inbound marketing techniques.  You have something that is of interest to a target group, and competitors with similar information are vying for that group's attention.  Your goal is to get these people's permission to send them more information about what you can do for them, but you don't know who they are.  So you do things to make it more likely that they will find you, discover that you may have something of interest to them, and then give you their contact information so you can send them more.  The goal is to build a trust relationship between yourself and the prospective client.

Doing all of this requires some interesting tools and techniques, and that is something I found of particular interest in this presentation.  Heinrichs demonstrated a relationship graph, built with an Excel add-on called NodeXL and analytics data from the school website, indicating where the students at his school were going for information.  He apparently uses a service called HubSpot for the lion's share of the interesting analytics data he works with, with a lesser amount coming from Google Analytics.

I found it particularly interesting how he was able to determine which links and layouts were more effective by randomly presenting different versions to people as they entered the site and then using analytics data to track which versions were better at capturing the attention of the people he was pursuing.
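He didn't walk through the mechanics in detail, but the underlying idea of randomly assigning visitors a variant and tagging the analytics data so the variants can be compared is simple to sketch.  This is only an illustration in TypeScript using a Google Analytics-style custom dimension; the variant names, the dimension slot, and the CSS-class convention are all hypothetical, and HubSpot provides its own tooling for this sort of A/B testing.

    // Assign each visitor a random layout variant (persisted per browser) and
    // attach it to the analytics data so click-through rates can be compared later.
    declare function ga(...args: unknown[]): void;

    function getVariant(): "A" | "B" {
      const saved = localStorage.getItem("layoutVariant");
      if (saved === "A" || saved === "B") return saved;
      const variant = Math.random() < 0.5 ? "A" : "B";
      localStorage.setItem("layoutVariant", variant);
      return variant;
    }

    const variant = getVariant();
    document.body.classList.add(`layout-${variant.toLowerCase()}`); // CSS decides what each variant shows
    ga("set", "dimension1", variant); // hypothetical custom dimension slot
    ga("send", "pageview");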

It was a fascinating presentation and maybe a little over my head (especially after lunch) but I'm quite glad I went.

Computers in Libraries 2013 - Day 3 - SharePoint & WordPress

This second session of the last day was a good one and covered some topics I'd been hoping to hear about.

I will be the first to admit that the internal website we have for staff leaves a lot to be desired, but I have not had time either to find or to implement something to adequately take its place.  This session had two presentations suggesting alternatives, and it has pushed me a little more in the direction of using SharePoint, which we can probably get in a hosted form at relatively low cost to boot.

Patrick Nunez Rauber from Broward College gave the first of the two presentations, which was about SharePoint.  Rather than providing a case study or the story of what happened at his library, he provided a nice overview of features from the standpoint of a user to make a case for using SharePoint, which is what I was looking for in this session anyway.

Patrick began his presentation with this quote:

“The 19th century culture was defined by the novel.
The 20th century culture was defined by the cinema.
The culture of the 21st century will be defined by the interface.”

He proceeded by saying that SharePoint solves an interface problem, or more specifically the problem of organization and clarity.  SharePoint is a collaborative, web-based solution for any spreadsheet, any Word document, any web site – essentially anything MS Office.  It makes heavy use of metadata in files to make it easier to find things, and it makes it easier for people in different departments or locations to collaborate on files and projects.

The nature of using SharePoint in an organization means that important files wind up being organized and put into a system where they are accessible to others.  It makes migrating data from one person to another, for example when someone leaves the organization, a much smoother process.  By keeping documents like agendas in SharePoint you avoid the many duplicates that get emailed around, and if a document changes there is no need to send the file out again to the intended recipients.

Patrick suggested the use of materials by Dux Sy, particularly his SharePoint for Project Management, as helpful resources in using the software.  In the case of public libraries, Patrick felt that the tracking of virtual reference and the creation of knowledge-bases (such as a wiki) are ideal applications of SharePoint.  Annual review and outreach documents also work very well on SharePoint.

Michelle Mizejewski of the University of California, San Francisco, provided an overview of a project to remake their staff network into something more heavily used and to address weaknesses in the existing network.  Since they already had WordPress running on site, they built the new system on WordPress, saving time and resources.  They customized the WordPress environment using a theme called P2 that turns WordPress into a kind of Twitter-like forum, except without the 140-character limitation.

Although elements of the system have worked well, it has been hard to overcome elements in the work culture and get all employees to actively participate using the system.

Thursday, April 11, 2013

Computers in Libraries 2013 - Day 3 - Rethinking Digital Literacy for All Ages

This first session of day three featured two talks on the subject of digital literacy.

The first talk had a youth services perspective and was presented by Michele Farrell of the Institute of Museum & Library Services and Enid Costly of the Library of Virginia.

The talk started with a brief overview of the history of the role of public libraries in literacy in general and children's literacy in particular.  As computers and Internet technologies have become a more important part of functioning in society, it has become more important to make sure children have access to such equipment and learn to use it properly, and in some cases use it to improve more conventional literacy skills.  The focus here is largely low-income children, as a statistic mentioned in the talk indicated that 83% of low-income children who have completed third grade cannot read at the third-grade level.

The speakers mentioned a variety of resources that have been found useful with digital literacy and children:

StoryBlocks – a collection of 30-60 second videos to model songs, rhymes, and finger plays for young children

DaybyDay – Different activities each day, put together by different states (daybydaysc.org, daybydayva.org, daybyday.id.org).  Virginia will be translating DayByDay into Spanish.

Colorin' Colorado – A bilingual site from Colorado.

In addition to sites like this, libraries are doing a lot of work with iPads.  Casa Grande Library in Arizona is doing interactive storytimes with a tablet, and in many places iPads are replacing the AWE Early Literacy Station computers.

They also mentioned a few literacy sources for different audiences.  Everyone On (www.everyoneon.org) and www.digitalliteracy.gov are general audience sites.  There's also Project Enable (projectenable.syr.edu), a project to train school librarians to create effective library services for students with disabilities.

Following this talk was a talk I found more interesting and specific to my interests and the needs that I perceive at the library at which I work.  Matt Mongomery and Jeremy Snell from Mechanics' Institute Library gave this presentation.

Mechanics' Institute Library is a membership library in San Francisco with 4,500 members.  They have many people who have historically come to the reference desk for assistance with e-readers, gadgets, and computers.  In May 2012, in order to better serve people with these needs, they made themselves available for an hour on a regular basis and patrons could book a 15-minute appointment.  They rapidly discovered that for most issues, 15 minutes was not enough time.  They modified the program, and in July 2012 they started a monthly six-hour run with 30-minute appointments.

They created a schedule in a Google Docs spreadsheet and entered the topic on which each patron needed assistance.  They have so far assisted 69 people for a total of 38 hours.  Their members like it, and they are getting unsolicited "thank you" emails.

They have found the following benefits from their program:

  • More one-on-one time with your service population
  • Learn about the unique digital literacy needs of your community
  • Grow your formal classes to address some of these needs

We already do some things like this, but I think we could learn from this program and offer something broader than what we currently do, which is either limited to discussion of library resources or unadvertised and scheduled on request by patrons who know we will help them with all sorts of things if they ask.

Computers in Libraries 2013 - Day 3 - Keynote : Uncertainty & Imagination


The final keynote for Computers in Libraries 2013 was given by Daniel Rasmus.  I liked this presentation, and even though Rasmus was introduced with the term "futurist," he quickly disabused the attendees of any notion that he was going to have any kind of certain vision of the future.  This keynote made a nice counterpoint with the keynote of the day before, which had an overwhelmingly positive vision for the future.

Rasmus identified himself as an "anti-futurist" indicating that he is rather critical of making predictions as we have no data from the future, only data from the past.

The first and main section of his keynote outlined eleven specific uncertainties that we have going forward:

Uncertainty 1 : How Will We Access Information?

Uncertainty 2 : How Will We Represent Books? (PDF, Scroll, tablet, e-ink, HTML 5, cuneiform)

Uncertainty 3 : How Low, or High, Can We Go? (media size and data storage density)  You can now get 128 GB on a microSD card when a few years ago 128 MB was state of the art.  54% of people claim to never use cloud computing, yet 95% of them actually do.

Uncertainty 4 : How Will We Find Stuff? Will we be using statistics, metadata, semantics, or will stuff somehow look for us?

Uncertainty 5 : What Do We Hire a Library to Do? (Learning Experience, Leisure/Pleasure/Shopping Experience, Outlet to Piracy, Memory, Internet Service Provider, Cultural Experience, Community Meeting Place, Source of data, Digital help desk?)  Only around 4,000 books are in a lendable e-book form right now, but 1.7 million books are in a lendable form in paper.

Uncertainty 6 : How Will We Represent Knowledge? (Does your ontology have an epistemology? Too Big to Know by Weinberger)

Uncertainty 7 : What will we need to know?  What jobs will exist that we don't even have right now?

Uncertainty 8 : What will be the role of place?

Uncertainty 9 : The Measure of Success (Productivity vs. Serendipity).  Does the efficiency of forcing customers to use self-check stations really outweigh the inefficiency of maintaining a checkout staff who engage people in conversation and can make suggestions?  Sometimes we make investments that we can't quantify very well because we are looking at them through the lens of industrial-age thinking.

Uncertainty 10 : Who Will Document the Trust, Who Will Censor?

Uncertainty 11 : What Rights Management Model will Predominate? (Digital Rights Management vs. Digital Restriction Management)

Following the outlining of these uncertainties, Rasmus described what he saw as four possible futures toward which we may be headed.

Corporate lifeline – where we live to work and work to live.  Corporations are the dominant social entities.

Trial Separation – Globalization has fractured; countries and regions are turning inward to shore-up their own societies and infrastructures.  Nationalism and state control predominates, the world slowly dis-integrates.

Falling Skies – The old rules have stopped working completely.  Local networks manage where national policy fails.  People feel numb and there are loud calls for a constitutional convention in the US.

Freelance Planet – Large corporations have largely become holding companies.  Value-webs have taken the place of supply-chains.  Individuals create their own, contextual work environments.  Technological innovation is rampant and chaotic.

Rasmus ended with these recommendations for dealing with the uncertainty of the future:

  • Don't think about the future in a linear way
  • Document the uncertainties you face – put the uncertainties on the door so you constantly see them and think about them.  The correct answer to "what is going to happen with computers" is "I don't know, but I have a robust way of thinking about it."
  • Consider the “ultimate” use or utility of a consumer technology an uncertainty until it becomes obsolete. You can't know that something is obsolete until it's gone.
  • Actively engage with the uncertainties when making strategic decisions
  • Use scenario planning to help think about possible ways the future may turn out, plan for contingencies and mitigate risks

Computers in Libraries 2013 - Day 2 - Innovative & Awesome Tech

This session was not, and historically has never been, an average session.  This is the Tuesday night session, which is a mixture of raucous, irreverent fun and serious discussion.  This one was probably one of the better ones I've ever attended.

There were a total of six short presentations made by a total of eight people in the evening.  The first was different from all of the rest and was made by Lee Rainie, a veteran keynote speaker from prior years and Director of the Pew Research Center's Internet & American Life Project.  He has a new book coming out titled Networked, and he whipped through a bunch of details that Pew has learned about library use, just as he might in a keynote, but in a tiny fraction of the time.

The findings he reported were:

1. Libraries are beloved

  • 91% say libraries are important to their communities
  • 76% say libraries are important to them and their families

1a. Libraries stack up well vs. other institutions (the military, firefighters), at least on one statistical measure

2. People like librarians

  • 98% of people who have ever visited a library say interactions are “very positive”
  • 81% of library visitors say librarians are “very helpful”

3. Libraries have rebranded themselves as tech hubs

  • 80% say borrowing materials is important
  • 77% say free access to computers and the Internet is “very important” service
  • 76% go to libraries for quiet spaces

4. E-book reading is growing; borrowing is just getting started

  • Growing awareness that this is a library feature (31% realize it's a feature)

5. People are open to even more tech at libraries
Interest Level in Technology Services at Libraries
Service | Very likely | Somewhat | Not too
Online research service | 37% | 36% | 26%
Cell app | 35% | 28% | 35%
Program to try out devices | 35% | 34% | 29%
GPS app to find stuff in the stacks | 34% | 28% | 36%
Personalized accounts to recommend new materials based on past reading | 29% | 35% | 34%

6. African Americans and Latinos are especially enthusiastic

7. The public invites you to be more engaged in knotty problems

  • 85% want us to coordinate more with local schools
  • 82% want free literacy programs
  • 61% want separate spaces for different services

8. Libraries have a PR problem / opportunity

  • 22% say that they know all or most of the services their libraries offer

9. There is churn in library use

  • 26% say they have increased use (many of those have kids)
  • 22% say decreased use (the Internet is a primary reason)

10. There is a truly detached population out there that matters to libraries

Rainie finished his presentation with a request for libraries to sign up to participate in Pew studies at http://libraries.pewinternet.org/participate/

Following Lee's presentation were the typical Tuesday evening presentations, which were a mix of humor and information.

Sara Kelley-Mudie from The Forman School presented on her frustration with the TTWWADI (That's The Way We've Always Done It) attitude and things she's done to get past this attitude.


Michael Peter Edson from the Smithsonian Institution recited from memory a poem he wrote about museums which was at the same time touching and hilarious (can be found at http://tinyurl.com/JackTheMuseum).


Caroline Jobe & Linsey Henley gave a more serious presentation on the history of the Village Learning Place in Baltimore as well as their involvement in the Little Free Library project (birdhouses for books).



Jason Griffey of the University of Tennessee at Chattanooga gave a fun and highly interactive talk about the LibraryBox Project, an open source hardware/software project to create low-cost (~$35), battery-powered, off-the-grid wireless devices that can be used to distribute documents.  Although not explicitly stated, it was implied that this would be used for legal or legitimate freedom-of-speech purposes -- they are in use in places in China, for instance -- rather than for the illegal distribution of copyrighted media.

Finally Sheli McHugh and Kristen Yarmey of the University of Scranton gave a charming and funny presentation on the future promise (and to this point largely unfulfilled expectations) of NFC (Near Field Communications) titled NFC : From the Peak of Inflated Expectations to the Trough of Disillusionment.

It was a great evening session and provided lots to think about.

Wednesday, April 10, 2013

Computers in Libraries 2013 - Day 2 - In the Cloud : Personalized Virtual Desktop

Anastasia Diamond-Ortiz, C. J. Lynz & Olivia Hoge of Cleveland Public Library presented this really interesting talk discussing their efforts at providing virtualized desktop environments for their patrons.  This wasn't something I was really considering doing before the presentation and after the presentation I'm probably considering it even less, but it was truly interesting nonetheless.

Cleveland has had a problem that probably all public libraries have had, in that patrons come in with special computer needs, like unusual software requirements or the desire for some kind of persistent desktop that can be made available to them.  In most libraries I'm familiar with, this problem is addressed by either "I'm sorry..." or a kind of half-measure (e.g. having a computer that is not connected to the Internet on which they can install software, but which forgets everything and returns to its prior state after a reboot).  In rare cases I've been able to add software for patrons who needed it when it was free and something I knew I could trust, but most of the time it's just something we can't do because our machines need to be used by a wide variety of people.

Cleveland, being in an urban area with a lot of patrons who don't have computers at home but may need to do special things with computers to get certifications and the like, has had this problem on a pretty large scale.  They decided to address it by setting up a system called MyCloud in which users could set up an account with the library and check out a thin-client laptop for three hours that was usable only within the library.  When using that thin-client laptop, they were connected via a Citrix server to their own Windows 7 virtual machine with 5 GB of storage space.  They have administrative rights on the virtual machine and can install whatever they like and save whatever they like as long as they don't use up the 5 GB of storage space.  They take complete responsibility for the machine, and if they accidentally delete all of their files or get the virtual machine horribly infected with viruses they have to live with the consequences, but because of the nature of the network and the virtual machines, no other users are affected.

It's an interesting system and it introduces a host of possibilities and a host of questions.  The back-end that Cleveland had to implement to make this work was enormously expensive and they are already out of hard drive space with only 75 people using the system.  They can (and I'm sure will) upgrade the storage space and the rest of the server will probably last them for a long time.  Cleveland is also considering some architectural changes to hopefully make it possible to store more virtual machines in less space (currently each takes 30 GB -- 25 for Windows 7 and 5 for patron use).

When patrons sign up for the service they have to take an hour-long class and agree to a waiver of rights.  The hour-long class has probably been something that has made patrons less eager to sign up for the service.  The waiver is something unusual for libraries.  Libraries (at least where I'm from) pride themselves on storing no data on anyone beyond what is absolutely necessary to run the library.  However in this case the library is, by the nature of the service, storing all sorts of files that the patron creates or downloads, some of which might be incriminating to a patron doing something illegal.  Because they are storing it they have to be able to comply with state and federal law if the data was subpoenaed, and because of that they have to make the patrons aware that they will turn over this data if asked to under court order.

It's a really interesting program because it pushes the boundaries of what libraries can do for their users at the same time that it opens up a can of privacy issues for libraries and patrons.  It will be interesting to see whether this idea catches on with other libraries and what the long-term consequences might be.

Computers in Libraries 2013 - Day 2 - Mobilizing the User Experience

This session, although not really horrible, was my least favorite of the day, and really the entire conference.

There were two presentations.  In the first presentation, the two presenters switched back and forth and highlighted features of the book Don't Make Me Think by Steve Krug and talked about how the principles of Krug's book should be applied to mobile websites.  The points were all valid, but the presentation felt like something anyone who read Don't Make Me Think and then browsed the web using a cell phone would come up with.

This was followed by a presentation about making mobile websites and covered the basics of what was necessary to make mobile-friendly websites work. They suggested using the theme titled "Responsive" on WordPress and a theme called "Omega" on Drupal for good responsive design websites.

After the second presentation someone from the audience suggested that jQuery Mobile is a great tool for making mobile websites.  I'm generally familiar with jQuery (although I haven't had a need to use it much...yet) but hadn't heard anything about jQuery Mobile, so that might be a good thing to check out.

Computers in Libraries 2013 - Day 2 - UX & Accessibility Pecha Kucha

First, for the uninformed, "UX" is an abbreviation for "User Experience" and a Pecha Kucha is a kind of rapid presentation where the speakers are supposed to have 20 slides and can spend no more than 20 seconds on each slide.

With those definitions out of the way, this was a not entirely strictly enforced series of four pecha kuchas all on the topic of user experience and accessibility.

The first presentation was made by Randy Oldham of the University of Guelph.  He had a nice, short overview of four tools that can be handy for evaluating a website's accessibility:

WAVE – Web Accessibility Evaluation Tool. Free. Overlays tags on a view of a webpage where there are accessibility problems. 

W3C Markup Validation Service – Free.  This is not as graphical as WAVE, but pretty easy to understand.  Rather than checking specifically for accessibility, this checks for valid HTML, but valid HTML is the first step toward an accessible website.

FANGS – Free Firefox Plugin. Shows you what text would be read by a screen reader on any site you test.  Helps you appreciate the situation that a blind user might encounter.

Web Developer extension – Free plugin for Firefox & Chrome.  Has a lot of features, including a display of the tab index (the order in which elements are selected as you press the tab key on a webpage) and the ability to run a detailed report on Section 508 compliance for a webpage.

Randy's presentation was followed by a presentation by Frank Cervone of Purdue University Calumet providing some tips and thoughts about accessibility.

Frank started out by emphasizing that more people have disabilities than web developers might think.  Of those aged 55 to 65, 36% have a disability of some kind.  Of those aged 22-44, 15% have a disability of some kind.

As a result, it is quite important to keep the disabled in mind when doing web design, and universal design is key.  It's also important not to let ill-informed dogmatism get in the way of good overall design.  In other words, just because graphics can cause accessibility problems doesn't mean you should always avoid using them.  You can and should use graphics; you just need to remember to make them accessible.

There are many places where accommodations are often necessary that web designers may not consider, such as text supplemented by audio, audio supplemented by text, or video supplemented by transcription.

A designer needs to make sure that the user has control over magnification, scrolling text and suppressing pop-up windows.  Providing keyboard equivalents to mouse commands, such as access keys, or providing supplemental graphics or allowing the freezing of animated graphics can also help make a site more accessible.

Frank also recommended some additional tools, including Fujitsu Accessibility Tools, a reading effectiveness tool (it is best to write to a 6th-grade level for sites targeted to a general audience), and Project Cannect, which has a guide for creating accessible content.

Joanna Karpinski of the National Library of Medicine presented their findings from usability testing of the NIHSeniorHealth.gov website as part of a redesign project.

The site she was working on is designed for people aged 60 and older, and when they noticed decreased usage of the site they did some usability testing to try to discover why.  They found that older adults really didn't like images.  They also found that older adults didn't like navigation that had more than one level.

Scrolling, which they had intentionally avoided in a prior design, was found not to be too much of an issue, although having pages that were more than two screens was something to be avoided.

They also found that their users were particular about how the site's search results were displayed.  Users were happiest when they were presented with no refinement options, the snippets in the search results had sufficient content (2-3 sentences), the search terms were highlighted in bold in the snippets, and the colors were familiar.

After making the changes their users were happier with the site.

The fourth presentation was made by Shimin Chen of St. Joseph's University and was about the use of fonts in responsive web design.

I liked what I understood in his presentation, although unfortunately he had a very heavy accent that was difficult to comprehend. In his presentation he demonstrated the process of putting custom fonts and vector graphics on a website and setting up code in cascading style sheets that would properly scale the graphics on the fly based on the screen resolution.  He indicated a lot of sources for graphics and fonts including FontSquirrel, IcoMoon, Entypo, FreeVector, CSS Tricks and Fontello.  I think I really would have liked the presentation had I fully understood it and it has made me want to look more into this subject.

Computers in Libraries 2013 - Day 2 - Library Budget Trends & Spending Priorities for 2013


This presentation was given over the lunch break and provided some preliminary data for the next year of Information Today's Library Resource Guide survey of library budgets. The survey has been compiled annually with support from ProQuest. There was a lot of data presented in this presentation and I didn't get close to all of it, but it was quite interesting to see what kinds of things they are finding out about library expenditures and how they've shifted over the past few years.

Before going over library numbers, the presenter mentioned that publishers were asked “How important are librarians?”  35% of the publishers answered "not as important as they used to be" while 65% said that librarians were "important."

The presenter then asked for a show of hands of those attending about their role in budgets and deemed that librarians were important based on the hands.

The figures presented for 2013 represented the responses of 511 participants: 38% Academic libraries, 34% Public libraries, 10% Special libraries, 5% Government libraries, and 12% Other.

Some example numbers provided were:

Library Budgets Overall
Change | 2011 | 2012 | 2013 | 2014
Increase | 31% | 31% | 35% | 33%
Same | 25% | 24% | 27% | 30%
Decrease | 41% | 41% | 32% | 22%

IT Budgets
Change | 2011 | 2012 | 2013 | 2014
Increase | 29% | 35% | 33% | 30%
Same | 38% | 36% | 35% | 35%
Decrease | 10% | 9% | 5% | 6%

Online subscriptions
Change | 2011 | 2012 | 2013 | 2014
Increase | 41% | 42% | 47% | 39%
Same | 31% | 36% | 29% | 32%
Decrease | 13% | 12% | 10% | 8%

Cloud Computing
Use? | 2011 | 2012 | 2013
Currently | 7% | 12% | 16%
Plan to use | 13% | 22% | 22%
No plans | 42% | 37% | 34%
Don't know | 37% | 30% | 29%

Greatest challenges
Challenge | 2011 | 2012 | 2013
Maintaining services on tight budgets | 83% | 82% | 79%
Keeping up with IT | 64% | 68% | 69%
Identifying new funding | 43% | 44% | 38%
Keeping facilities open / maintaining operational levels | 37% | 32% | 26%
Staff recruitment / retention | 33% | 33% | 34%
Migrating print to digital | 31% | 35% | 35%
Competing with the Internet | 26% | 29% | 27%

Cloud & Social Media
Service | 2011 | 2012 | 2013
Social | 52% | 49% | 52%
Sharing web pages | 26% | 30% | 38%
Wiki & blog | 38% | 30% | 28%
Patron reviews | 24% | 24% | 24%
Document sharing / web apps | 15% | 18% | 22%

You can download the 2012 report from the Library Resource Guide site.

Computers in Libraries 2013 - Day 2 - Out the Lamp

This was the second presentation at this year's CIL that featured John Blyberg of Darien Library, and this time he was the sole speaker.  His topic this time was trends away from elements of the standard LAMP (Linux, Apache, MySQL, PHP) stack that is used to drive a huge number of websites and the most popular content management systems (Drupal and Wordpress being notable examples).

First Blyberg went over the history of LAMP and its components, which range in age from 18 to 22 years, making them all old enough to vote if they lived in the U.S. and were actual people (Linux, the 22-year-old, could drink too).

Then, after describing what made the components great and popular in the first place, he described several upstarts that threaten the position of the components (at least the "AMP" part of LAMP; Linux doesn't have much threatening it at the moment).  I had at least heard of most of these components, and in some cases I have heard people involved with the projects themselves interviewed on FLOSS Weekly, but it was good to have a concise run-down of them going over what makes them attractive and what the consequences of their success will be.

The big shift that Blyberg pointed out was from a process-driven model to an event-driven model.  In a process-driven model, each time something happens on a server a new instance of the needed program is started up.  He likened it in his presentation to a restaurant that has a different server and a different chef for every single table where someone might be served.  There are some inefficiencies in this model (as might be obvious from the restaurant description) that have been acceptable so far, but as Internet traffic has grown, servers have had to become more and more robust to handle that traffic using process-driven systems.

In the event-driven model the situation is more like the restaurants we actually have (because that's the way they can make money).  In a small restaurant there might be one chef who can cook all of the food for 15-20 people at a time and one server who can handle taking orders and getting the food to the customers.  That is essentially analogous to an event-driven model.

Nginx and Node.js were the primary event-driven software packages he pointed to as changing the web landscape.


Nginx – An event-driven web server.  If Apache is a program that has a million options but the average user only needs six of them, Nginx is a program that doesn't do much more than those six things but does them really well.  The project is 8 years old, low-resource, well-suited to virtual environments, easy to configure, and fast.  Of the software Blyberg mentioned, this is the only one that I have so far actually used.

Node.js – A server-side JavaScript environment that can be used in a billion ways, but was born to be a highly flexible, powerful, event-driven web server.  It is 4 years old, built on Google's V8 JavaScript engine, is self-contained, has its own built-in web server, is non-blocking, has asynchronous I/O, and has a package manager to easily add features.

Node.js has the potential of replacing both Apache and PHP in many situations as it can both serve the web page (what Apache does) and provide the server side code to provide different, conditional versions of a web page (what PHP does).
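To make the contrast concrete, here is a minimal event-driven server in Node.js (written in TypeScript).  It is a toy sketch rather than anything Blyberg showed: one process serves the page and runs the server-side logic, invoking the callback as request events arrive instead of spawning a new process per request.

    // Minimal event-driven web server: a single process handles every connection.
    import * as http from "http";

    const server = http.createServer((req, res) => {
      if (req.url === "/hours") {
        // Server-side logic, the sort of thing PHP would handle under Apache.
        res.writeHead(200, { "Content-Type": "application/json" });
        res.end(JSON.stringify({ open: "9am", close: "9pm" }));
      } else {
        res.writeHead(200, { "Content-Type": "text/html" });
        res.end("<h1>Library home page</h1>");
      }
    });

    server.listen(8080, () => console.log("Listening on http://localhost:8080"));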

Often (although not always) working closely with event-driven software, and in many cases replacing the M in LAMP (MySQL), is a family of so-called NoSQL projects which avoid some of the speed bottlenecks that MySQL can have.  Blyberg mentioned two of these as well.

MongoDB – A 4-year-old, extremely popular NoSQL (or document) database which was built for scalability, was designed to woo SQL developers away from MySQL (and maybe another SQL project, Postgres), and has gained broad support.
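As a rough illustration of what "document database" means in practice, here is a short sketch using MongoDB's official Node.js driver (in TypeScript); the connection string, database, collection, and document fields are all made up.

    // Store and query whole JSON-like documents rather than rows in fixed-schema tables.
    import { MongoClient } from "mongodb";

    async function main() {
      const client = new MongoClient("mongodb://localhost:27017"); // hypothetical local server
      await client.connect();
      const events = client.db("library").collection("events");

      await events.insertOne({
        title: "Teen Tech Club",
        date: new Date("2013-05-01"),
        tags: ["teens", "technology"],
      });

      // Query by a field inside the document; no joins or schema migrations required.
      const upcoming = await events.find({ tags: "teens" }).toArray();
      console.log(upcoming);

      await client.close();
    }

    main().catch(console.error);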

CouchDB – An 8-year-old NoSQL database which has a fanatical user base, was built for distributed and offline environments, is great for mobile devices (it is extremely common on Android phones), is an official Apache project, has a built-in administrative interface, and speaks exclusively in JSON (JavaScript Object Notation), making it play quite nicely with Node.js.

Blyberg briefly mentioned the popular search tools Apache Solr and Sphinx, noting that they work well with both MongoDB and CouchDB.

Redis – This is a 3-year-old NoSQL database that stores data in memory, has an option to persist to disk, is perfect for data caching, is used in high-I/O applications, and is ridiculously fast.  It was the main project Blyberg mentioned that I'd never heard of, although given its nature I don't see myself really needing to use it either.
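The caching use Blyberg highlighted usually follows a cache-aside pattern, which is easy to sketch with the node-redis client (again in TypeScript); the key name, the expiry, and the slow "fetch" function are hypothetical.

    // Cache-aside: check Redis first, fall back to the slow source, then store the
    // result with a short expiry so repeated requests are answered from memory.
    import { createClient } from "redis";

    async function getNewBookList(fetchFromCatalog: () => Promise<string>): Promise<string> {
      const redis = createClient(); // defaults to localhost:6379
      await redis.connect();

      const cached = await redis.get("new-books");
      if (cached !== null) {
        await redis.quit();
        return cached;
      }

      const fresh = await fetchFromCatalog(); // the slow, expensive call being cached
      await redis.set("new-books", fresh, { EX: 300 }); // expire after five minutes
      await redis.quit();
      return fresh;
    }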

Blyberg ended by saying that in the next few years these tools will give us a new family of content management systems that are much faster and more powerful than the CMSs we use today.

Computers in Libraries 2013 - Day 2 - Hacking 101 : Protecting Your Site & Visitors

Blake Carver from LISHost presented this first session on Tuesday, which was a nice overview of good security practice.  There wasn't a whole lot that was new to me in this session, but it's always good to have a bit of a refresher and to see what others are doing and recommending.


Blake introduced the subject by emphasizing that everyone is a target, bad things are going to happen, and the best you can do in computer security is to try to get better, even though the security problems seem to be getting worse faster than we get better.

He then looked at several different categories of points of vulnerability and examined what simple steps can be followed to make you a less attractive target.

For desktop computer security he emphasized that staying safe takes more than just a firewall & antivirus.  Some actions that you can take for security are:

  • Quickly patch / update everything
  • If you carry it, put a password on it
  • Don't Trust Nothin' and Nobody
  • Backup your stuff
  • Use a second form of authentication (Google Authenticator, Yubikey)
  • Do Good Passphrases - the bad grammar is intentional here.  If you use a passphrase like lIbrariansIsTehBestestPeeple$EyeN0 it is much harder to crack than LibrariansAreTheBestPeopleIKnow.  A passphrase (similar to the examples here) is better than a password because it's easier to make a long one you can remember.  “The time it takes to crack a password is the only true measure of its worth.”

Also on passwords, Blake suggested trying to crack them (as a strength test, not for evil purposes) using oclHashcat.  He mentioned that a better policy than "Your password must be at least 8 characters long, contain one number, one special character, and both upper and lower case" is simply "All passwords must be at least 20 characters."  You should assume that someone, somewhere will successfully steal a password you are using, so reusing a password is a bad idea.
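To put rough numbers behind the "length beats complexity" advice, here is a back-of-the-envelope comparison.  The character-set sizes are simplifying assumptions, and real cracking tools also exploit dictionaries and predictable patterns, so treat this only as an illustration of why the 20-character policy is the stronger one.

    // Brute-force search space: an 8-character "complex" password versus a
    // 20-character phrase built only from lowercase letters and spaces.
    function bitsOfEntropy(charsetSize: number, length: number): number {
      return length * Math.log2(charsetSize); // log2(charsetSize ** length)
    }

    console.log(bitsOfEntropy(95, 8).toFixed(1));  // ~52.6 bits: 8 chars drawn from printable ASCII
    console.log(bitsOfEntropy(27, 20).toFixed(1)); // ~95.1 bits: 20 chars drawn from a-z plus space
    // Each added bit doubles the worst-case guessing work, so the long, simple
    // phrase is astronomically harder to brute-force than the short complex password.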

With web browsers, what matters to attackers is not which is most secure but which is most widely used.  Browsers are also less attractive targets than Java, Flash & Reader, so if you can use alternatives to those products you will be safer.  Studies have shown that users are most likely to run across malicious code on the Internet via content delivery networks and ads, which unfortunately means any site that uses content delivery networks and ads, which is most of them.

Some actions to take:

  • Use Two (or more) Updated Browsers
  • Know Your Settings
  • Plugins / Add-Ons / Whatevs:
    • Something to limit JavaScript
    • Ditch Java, Flash & Acrobat
    • Something to Force HTTPS
    • Something to Block Ads

Web servers have the same uses for hackers that desktop computers have, but they are more valuable since they tend to have a lot of resources and are always on.

If you're a sysadmin:

  • Always encrypt passwords
  • Don't limit passwords
  • Keep everything updated
  • Educate the entire library staff
  • Watch file/directory permissions
  • Lock down as many programs as possible
  • Lock down PHP as much as possible


If you need to secure WordPress (and the like):

  • Keep it updated
  • Use good passwords
  • Make the admin login something other than "Admin"
  • file permissions
  • use security and backup plugins/modules
  • ModSecurity
  • Watch your logs
  • External monitoring


It's important to make your library defensible (invulnerable just isn't possible).  Some things to do to help library security are:

  • Automate patching, backups, scans
  • Use a Password Manager
  • Train
  • Secure PACs: Remove Java & Acrobat Reader
  • Enhanced Mitigation Experience Toolkit


Your security measures are seatbelts, not force fields.

Blake suggested Kali Linux and/or the PWN Pad for testing your network.

His final thoughts on the topic are:

  • Do good pass phrases
  • Be Paranoid
  • Keep Everything Updated
  • Do something to make the bad guys' job harder

Tuesday, April 9, 2013

Computers in Libraries 2013 - Day 2 - Keynote with Storm Cunningham

Tuesday's keynote was presented by Storm Cunningham, author of The Restoration Economy and Rewealth!, two books about revitalization and the economy.  Cunningham sketched out a model of three ways that libraries might play a role in revitalizing their communities.  The first two of these strategies Cunningham went over very quickly.  They were the physical strategy, in which revitalization occurs because a lot of money is spent changing how a library is situated and/or functions so that it can play a different role, and the supportive strategy, in which libraries do what they've always done in providing meeting space, giving people access to materials and resources that help people develop themselves, and so on.

The third strategy Cunningham called the catalytic strategy.  In this strategy, libraries with their technological resources can play a kind of coordinating and organization-building role in getting people to participate actively in revitalizing a community.

To explain this further he described three trends that play a role in the revitalization of communities: restorative development, citizen leadership, and crowd technologies.

Cunningham described development as happening in three modes/phases.  First there is new development which is where someone decides to build a new subdivision/town/city/country.  This kind of development destroys whatever was there before and has been the most common in our history.  It also results in a lot of economic growth but there is a finite limit as to how far you can pursue this.  Second is maintenance or conservative development.  This is where existing structures that need repairs are fixed or areas that have not been depleted are saved and turned into national parks and the like.  This creates some economic growth, but not nearly as much as new development.

Finally Cunningham described restorative development where a previously developed area has fallen badly into decay and work is done to completely restore or redevelop the area to bring life back into it.  Cunningham described this as the future of economic growth as you can never do too much of it and you never really run out of things you can do.

Citizen leadership is where citizens take on projects that are normally reserved for local governments, usually because the local governments don't take them on.  In some cases citizen action of this sort forces local governments to act.  Cunningham provided examples such as Connect Tampa Bay, a citizen-organized public transit initiative, and the High Line project in Manhattan.

Crowd technologies are crowd-funding, crowd-mapping, and crowd-sourcing.  These are new adaptations that take advantage of the connectivity provided by the Internet, and this was ultimately where Cunningham felt libraries could play an important role.  He provided several examples where these crowd technologies were making an impact much more quickly than conventional methods have been able to.  For example, in contrast to the High Line project, which took years to spur New York politicians into action, the Lowline project (converting an underground trolley track section into an underground, naturally lit garden) got over $100,000 in funding through a Kickstarter project and seems to be progressing at an incredibly fast pace.

Cunningham felt opportunities for libraries rested in:
  • More projects started by library users
  • More ability for citizens to gain support
  • More ability to get funded quickly
  • More ability to position the library at the heart
  • More ability for libraries to accelerate crowd-powered citizen-led revitalization
I am excited by crowd technologies and have participated in some Kickstarter campaigns (although nothing of the sort that he described).  I'm not sure that his optimistic vision of a bright future through such technologies will come true, but I certainly hope that it might.

Computers in Libraries 2013 - Day 1 - The Next Big Thing


The last session on day one was kind of a feel good sharing session where libraries were encouraged to share what they were doing that was innovative.  It was completely free form with the microphone going to anyone who had an idea they wanted to share, including frustrations.

It was interesting to hear people excited about various plans they were executing, frustrated about some of the insular qualities of libraries (quite a few people were non-librarians expressing their frustration with working with stuck-in-the-mud librarians), or looking forward to executing new ideas.  One person expressed the feeling that libraries have spent so much time fighting against technology that they are now facing a revolution, and that libraries need to embrace the new role they can play in supplementing people's lives before it is too late.

Some things that I thought were interesting and are representative of the statements (if far from a comprehensive list) were:
  • One library wants to run Seti@Home on their public computers to use otherwise wasted CPU cycles
  • One is looking at helping patrons sign up to take EdX classes (www.edx.org)
  • A couple libraries are making an effort to use iPads in an active support role.
  • One library indicated that it has been working on large-scale (20 foot wide screens) visualization spaces, but isn't quite sure what to do with them. Another library with visualization spaces indicated that it shows a lot of locally created content all of the time.
  • One library is connecting a local teen technology club to seniors who get devices they've never used out of boxes to start getting used to using them.
  • The State Library of South Carolina is starting a media contest for teens. They are taking equipment and expertise to libraries in different communities.
  • One library is looking at ways to innovate in and revitalize its reference services after half of its reference area was turned into a maker space.
  • One library indicated it is looking at replacing its catalog computers with inexpensive, lightweight Raspberry Pi computers.

One comment that I found particularly useful from near the end of the session was this: Libraries are only as relevant as they are today. If you wait too long to adopt a format, you aren't going to be relevant. You also need to be able to tell a story behind how a patron can use a resource.  If you cannot tell a story about how your patrons would be using a technology, maybe you shouldn't be spending a lot of resources on it.

A library in Trinidad established a small online class on planting a home kitchen garden after bringing in experts (food costs in Trinidad are high because so much is imported).  It has been an extremely successful program for them.

Computers in Libraries 2013 - Day 1 - Metrics, Value & Funding

This was probably the best session I went to on day one, a day that had many sessions that looked really good in conflicting time blocks.

Rebecca Jones of Dysart & Jones Associates (who had been the interviewer in the morning's keynote) and Moe Hosseini-Ara of Markham (Ontario) Public Library led this interactive session about the process of convincing stakeholders that objectives are being met and that money is being well spent.

Moe started out with a story about how his library had a program with several local authors that had very low attendance. The board saw it as a failure because the attendance was low. After talking to the board the library went to the authors and talked to them and the authors thought it was a huge success because they got to connect with about 10 readers in a meaningful way. The question of whether this was a success or failure depended on who the real audience was and what they expected.

That story formed the foundation for the rest of the session.

Jones defined stakeholders as those people who "can put a stake of success underneath the organization or a stake through the heart of it," underlining the importance of getting them on your side.  The process of getting them on your side involves 1) understanding what they want (what their political goals are) and then 2) aligning your success to their success so that they can be happy about what's going on.

It's necessary for all of the programs in a library to establish one or two metrics that you want to collect that will help determine the success of those programs.  That metric may not be just the number of people that attended.  There is a cycle here that can be used to manage the whole process: Understand the context → Align strategies & objectives → Identify services & programs → define measures → manage measurement data → translate data into outcomes & impacts → Communicate results → Understand the context ....

Jones and Hosseini-Ara then led the attendees through a hypothetical program, in this case a job skills program that a public library would put on, to demonstrate the use of the logic model in planning and evaluating such a program.  You identify the inputs going into the program (what staff, technology, finance, marketing, etc. are needed), see what the outputs of those inputs are (the people who sign up, the sessions held, the program itself), examine the outcome of those outputs (what the attendees learn, what the library and staff learn through the process), and finally examine the impact for the metric.  In this case the best way to measure impact is not to say "8 people attended the program" but to do a follow-up survey to find out whether they got jobs (as that is really the point of a job skills class, presumably).

If a job skills class has 100 people show up and 1 person gets a job as a result, is it really more effective for the community and the library than a program that had only 7 people sign up but every single one of them got a job? That kind of metric is easily understood and can be tailored to the needs of your stakeholders. However, measuring that kind of success requires the people planning the event to think through the process and define what success is. That way, when a stakeholder asks, "Why did we spend $$$$ on a program that only had 7 attendees?" you can give a meaningful response that helps them realize that the 7 attendees benefited in a positive way, and that the benefit to the community will mean many times the $$$$ spent going back into the local economy (if, for instance, that is the concern of the stakeholder).

The entire staff needs to be engaged in this process. You need to tell staff what you are trying to achieve and why. Jones suggested (quite strongly) using Yammer, a kind of corporate Facebook, for internal communication.

This session was time well spent.

Computers in Libraries 2013 - Day 1 - New Web Tech : Upping the Online Game


This session consisted of three different presentations.  There were many good ideas and I got a few good points out of each of the presentations.

Cynthia Orozco and Jamie Hazlitt of Loyola Marymount University presented a talk titled "Instalibrary : Make it Visual."  They described how their typical users (college students) prefer minimal text and lots of graphics.  Consequently, using graphics effectively can be one of the best ways to communicate.  They then provided many examples of using graphics (sometimes even things like pictures of text) as effective ways to reach out to their users.  They also suggested searching sources like Flickr and YouTube for photos and videos that users have made of your institution and using them in a variety of contexts.  That seems like something that could run into legal difficulties, but it certainly has potential.

Brian Smith from Reaching Across Illinois Library System (RAILS) described the process and benefits of having images rescaled/resized on the fly (FlexSlider and Mobile Detect are some ways to deal with the problem). There are many Drupal themes that can help, such as "AdaptiveTheme," and the sites ThemeBrain and SooperThemes can be good sources for responsive designs.  Buying a good theme can be a smarter approach than getting a free theme and then spending a lot of time fixing it.  There are some special techniques required for retina displays to get the best image quality.  If you adopt a rescaling solution like this, it is to your advantage to upload the largest image size you can imagine using and then have all of the necessary scaling done on the fly for your site.
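
As a rough illustration of that last point, here is a minimal sketch of my own (not anything shown in the talk) of picking an appropriately sized copy of an image based on the viewport width and the device pixel ratio. It uses plain browser APIs rather than FlexSlider or Mobile Detect, and the /images/scaled endpoint is hypothetical, standing in for whatever on-the-fly resizing service (a Drupal image style, a CDN, etc.) a site actually has. The sketch is in TypeScript.

    // Candidate widths we assume the hypothetical resizing service can produce.
    const SIZES = [480, 768, 1024, 1600];

    // Pick a URL for a scaled copy of the image, large enough for this screen.
    function pickImageUrl(originalPath: string): string {
      // Multiply the CSS viewport width by the pixel ratio so retina
      // displays get a sharper (larger) copy.
      const needed = window.innerWidth * (window.devicePixelRatio || 1);
      const width = SIZES.find((w) => w >= needed) ?? SIZES[SIZES.length - 1];
      return `/images/scaled?src=${encodeURIComponent(originalPath)}&w=${width}`;
    }

    // Swap in appropriately sized copies for every image marked with data-src.
    function loadResponsiveImages(): void {
      document.querySelectorAll<HTMLImageElement>("img[data-src]").forEach((img) => {
        img.src = pickImageUrl(img.dataset.src!);
      });
    }

    window.addEventListener("DOMContentLoaded", loadResponsiveImages);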

John Blyberg from Darien Library in Connecticut talked about Twitter Bootstrap, an open-source collection of CSS, HTML, and JavaScript that makes rapid development of pages much easier.  I have a set of problems where this would work extremely well, and I look forward to taking advantage of it.
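
To give a flavor of what working with Bootstrap looks like, here is a minimal sketch of my own (not Blyberg's code) that builds a two-column layout with Bootstrap's documented grid and button classes (container, row, col-md-*, btn, btn-primary, as in Bootstrap 3's grid). The markup is built with DOM calls in TypeScript to keep the example in one language, and the /catalog link is just a placeholder.

    // Build a simple two-column layout using Bootstrap's grid classes.
    function buildTwoColumnLayout(): HTMLElement {
      const container = document.createElement("div");
      container.className = "container";

      const row = document.createElement("div");
      row.className = "row";

      const main = document.createElement("div");
      main.className = "col-md-8"; // main content takes 8 of the 12 grid columns
      main.textContent = "Main content";

      const sidebar = document.createElement("div");
      sidebar.className = "col-md-4"; // sidebar takes the remaining 4 columns
      // A Bootstrap-styled button; the /catalog URL is a placeholder.
      sidebar.innerHTML = '<a class="btn btn-primary" href="/catalog">Search the catalog</a>';

      row.append(main, sidebar);
      container.append(row);
      return container;
    }

    document.body.append(buildTwoColumnLayout());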

Computers in Libraries 2013 - Day 1 - 7 Deadly Sins of Library Websites


Casey Schacher & Paige Mano of the University of Wisconsin and Tony Aponte of UCLA presented the results of a small-scale study they did of library websites and which usability standards they violated. All 20 of the websites they examined violated at least one of the guidelines they tested from the 2006 version of usability.gov's guidelines.  They then compiled a list of the seven most frequently violated guidelines from their subset and enumerated them in their presentation.  They were:

1. Does not comply with Section 508 of the Rehabilitation Act. -- They recommended using the WAVE accessibility tool to highlight accessibility problems.

2. Avoid unexplained library jargon that typical users will not understand.

3. Ensure that the format of common items is consistent from one page to another.  One example is the inconsistent presentation of phone numbers.

4. Elements such as colors, fonts, and content location should be consistent across pages.  Menus shouldn't change (the College of Southern Idaho was used as a negative example of this).

5. Organize information at each level of the website so that it shows a clear and logical structure to your typical user.  Avoid the junk drawer (the page where you stuff everything that doesn't fit somewhere else) at all costs.

6. The page layout should help users find the most important information.  Use analytics to discover what the most important information is.

7. On an uncluttered display, all important search targets are clearly visible.  “Pretty graphics don't negate overabundance of text.”

I particularly liked the way they approached their project of determining the "7 deadly sins" and found the information provided quite good.

Monday, April 8, 2013

Computers in Libraries 2013 - Day 1 - 15 Web Trends for 2013


David Lee King from Topeka & Shawnee County Public Library presented this fun and informative overview of trends in web design as he sees them this year.  He first listed the trends, describing them and providing some examples and insight.  The trends he sees are:

1. Content First - “Good web design starts with content.” The content that users want can be lost in the layout. Design in the absence of content is not design; it's decoration. White space is a key ingredient in promoting content.

2. Design Simplicity – Minimalist landing pages, white space (Salt Lake City Public Library is a good library example)

3. UX Centered Design – Think. Draw. Build. Repeat. “Don't make me think.”

4. App Style Interfaces – The imitation of the mobile space on the desktop. Might not be the best place to focus.

5. Responsive Design – Websites that just work on all sizes of devices.

6. No Skeuomorphism – Use a flat design rather than a cute imitation of physical objects (no book spines as navigation)

7. HTML5 / CSS / JavaScript – David argued that things like Flash and Perl are dated.  I don't disagree at all on Flash, but I do disagree on Perl, which, although an old language, is an incredibly versatile server-side language whose use is completely transparent to the browser and easily adaptable to modern web standards.  I made this point later in the Q&A section and he really didn't seem to want to argue, so either he saw my point or he was just being polite.

8. Fixed Header Bars (e.g. Facebook – the blue bar that's always at the top no matter where you are on the page). Can be handy / can be annoying

9. Large photo in the background – can work on the right site.

10. CSS Transparency (a background that can be seen through a column superimposed on it) – In the right setting it can be attractive.

11. Social Media badges – make it easy for your users to share on their social media platforms of choice.

12. Infinite scrolling – David expressed no love for this trend, or even apathy when describing it.  I can't say I can really disagree either.  This is the behavior (most commonly seen by me when visiting Twitter) where you can never get to the bottom of the page.  You get to the bottom of the page and then it automagically loads more page for you to see.

13. Homepage feature tours (pointing out the navigation on the website – this might be a useful thing for libraries – do it if there's something you're “selling”)

14. Sliding Panels – A pretty feature but perhaps not one that is worth the effort.  His suggestion was to hire people with the skills and direct them elsewhere.

15. Parallax design – seems gimmicky and complicated and I didn't really get it.  The example he provided was a website that used images of jiggly bottles that somehow interacted with the mouse cursor.

In summary, he found trends 1, 2, 3, 5, 7, and 11 all to be important.  He was pretty apathetic about trends 4, 8, and 12.  He had conditional respect for trend 6.  He thought trends 9 and 13 worth considering if you had a good use for them.  He thought trends 10, 14, and 15 were virtuosic exercises wherein leet coders could prove their skills, but not particularly practical.

I liked this session and it was a good way to start the day.

Computers in Libraries 2013 - Day 1 - Keynote with Brent Leary

Computers in Libraries 2013 officially kicked off this morning with a keynote from Brent Leary, who does customer relations consulting with S&P and Microsoft, among others.  It was quite an interesting, although short, keynote, followed by a session where he answered questions put to him by Rebecca Jones of Dysart & Jones (whom I saw again much later in the day at a different session).

It was an unusual keynote for Computers in Libraries: he doesn't seem to have much of a library background, hasn't really studied libraries, and didn't mention them much in the main part of his keynote, but he has considerable expertise in an area to which libraries really need to pay attention.  His focus was on creating a relationship with your customers, what makes customers value you, and what makes them keep coming back.  Leary is apparently working on a book titled The Amazon Effect, which investigates what has made Amazon such a popular company and what it has done to make customers so happy and loyal that they keep coming back.

I got some of the greatest value out of this presentation from the Q&A section at the end, where he answered some questions and made some interesting points.  He felt that libraries are potentially very well positioned to fill a role that Borders once filled, one that is now somewhat vacant and cannot be entirely filled by the Internet: facilitating face-to-face sharing between readers.

Speaking of the clients he has worked with, he stated that the most successful ones have been those "willing to embrace cultural change," which is something that many libraries have difficulty with and will certainly be a challenge.

One of the things that I really liked hearing from him was that it's important to consider exactly how you are going to use a technology before adopting it.  He likened it to giving a car to a teenager who is unprepared to drive it, and noted that agencies that have started using new technologies (like Facebook or Twitter) without properly thinking them through have sometimes done tremendous damage to themselves, far outweighing any benefits gained by adopting the technology quickly.

Throughout the presentation Leary mentioned how important it is to use the information that agencies have about their customers to build a relationship with them, while being careful not to "go to the dark side."  Jones asked for clarification on how an agency might know whether it's going to the dark side, a concern that I certainly would have.  Leary's response was that when you are thinking of doing something with other people's data, you should first consider what you would want other people to do with your data and use that as the guideline.  It seems like a pretty good guideline, although perhaps challenging to adhere to properly.  For better or worse, libraries tend (by law or by policy) not to keep much information about the checkout habits of their users, which is an advantage that companies like Amazon have in their ability to make recommendations.  There's a lot of data we can mine, but it's a little trickier to use it to build relationships with our users.

Computers in Libraries 2013 - Day 0 - Gadgets and Gaming Session

I started out this year's Computers in Libraries conference by attending the Gadgets and Gaming Session which was a nice relaxed demonstration of a variety of technologies and then an opportunity to experiment with them.

Probably the most fun of the items was a kind of laser tag set that uses a centralized wireless device connected to a computer to run the game, so it can count how many times someone has been shot and which people have been the most successful.  Apparently you can run several different kinds of games, where people can only be shot so many times before they are "killed" or where they have to locate some kind of objective (it seemed to be a kind of RFID-tagged brick that could be tapped with a gun).

The best (and, at least to my initial thinking, only) real library-related program that you could run with these (and that, it was mentioned, has been run) is a kind of after-hours Hunger Games program, which has some obvious tie-ins to the trilogy and film.  It looked fun (and the high school kids who came by to play seemed to be having a blast), although I'm not sure that a program of this sort would quite justify the expense unless it could be done as more than a one-off type of thing.

I preferred some of the smaller, more creative items they had there.

[Photo: a box of LittleBits with a diagram]
LittleBits is a cool kind of simplified electronics experimentation kit that allows kids to build simple circuits that do things, without needing to solder or worry about frying the electronics.  A small, simple kit is somewhat expensive (compared to just buying electronic components), but it seems to be quite robust and much more conducive to playing.

I also liked the Lego Education computer programming/robotics set that allows kids to learn basic programming and robotics in a familiar Lego setting, using a simplified programming language.  As Winnetka-Northfield has an existing Lego program, I could see something like this being a logical spin-off of it.

[Photo: someone sitting in the SoundEgg, watching a video on a flat-panel screen]
Finally, I couldn't help but like the concept of the SoundEgg.  It's a single-person, heavily insulated movie-viewing pod that keeps most of the sound out, providing the person inside with a lovely movie-watching experience.  I'm not sure what they cost and I doubt we'd ever get one, but it seems like the perfect thing for patrons who want to watch a movie.