A while ago I reported on the Pew Internet Project's November 2005 report on increased use of search engines. Here's what I had to say at the time:
On an average day, about 94 million American adults use the internet; 77% will use email, 63% will use a search engine.
Among all the online activities tracked, including chatting and IMing, reading blogs or news, banking, and buying, not one is searching a library OPAC.
Walt Crawford properly took me to task, noting:
The report that’s downloadable does show that people aren’t being asked an open-ended “what did you do on the Internet today?” question. They’re being asked to respond to a list. If “searching a library OPAC” isn’t on the list, it is absolutely guaranteed not to be in the results.
It's taken me some time, but I'm finally following up on that point. The question seems to revolve around how the list of activities was generated, and to answer it I contacted project director Lee Rainie. Lee explained that the intent of the project and its surveys is to help us understand how people use the internet, not to track other activities. As for the list of ten online activities in this survey, he noted that it was one he chose as "an illustrative list, rather than comprehensive list."
Lee was careful to emphasize how much he values libraries, and he wanted to be clear that though the Project has tracked 90 online activities in its many surveys, it hasn't yet asked internet users about their use of online library services. I don't know if it was just because I was asking the questions, or if he's been thinking about this for some time, but he did suggest that the project might include library-related questions in a future study.
I was putting Lee in a tough spot, as the real question we want him to answer is something along the lines of “did the survey not include questions about online library usage because it’s statistically insignificant or was it an oversight?” Lee is a smart guy, smart enough not to answer that — smart enough to avoid stepping into our internal debates — so the following is based on my continued research into the question, not my conversation with him.
As it turns out, while much of the most interesting data in the November 20, 2005 report comes from the project's phone survey, the report also draws on comScore data to support those phone survey results. Walt is right about the phone survey, but the comScore data doesn't appear to be subject to those limitations:
The comScore data cited in this report come from comScore Media Metrix, an internet audience measurement service that uses a massive cross-section of more than 1.5 million U.S. consumers who have given comScore explicit permission to confidentially capture their browsing and transaction behavior, including online and offline purchasing.
In a comment on my previous post, KateZ expressed some concern that the comScore data was only tracking top search engines. comScore offers many reports based on its usage tracking; the qSearch report is a keyword optimization tool and doesn't reflect the full breadth of data the company harvests. That doesn't answer the question on its own, but can we not assume that a company that makes its business by tracking every online activity of its research subjects would investigate library-related activity if such activity were significant enough to reveal trends in consumer interest or behavior?
Elsewhere, in the PIP's August 11, 2004 report on The Internet and Daily Life, we find some detailed insight into how those phone survey questions are selected:
To assemble a good list of activities, we followed insights gained from previous research and divided online activities into four categories: information seeking; communications; transactions; and entertainment. We chose several examples for each category. These examples are not meant to cover all kinds of activities, but rather to represent everyday tasks and typical recreations that Americans enjoy. We chose activities that would broadly represent what the Internet has to offer, that would resonate with a broad audience, and that would tap into our understanding of the Internet use gained from our past research. Recognizing, of course, our choice of particular activities might influence the findings, we tried to observe the specific but then draw generalizations from our observations.
And in the November 2, 2005 report on Teen Content Creators and Consumers, we learn that the project uses focus groups and small surveys with open-ended questions to help shape its research and larger surveys. In that case:
Four focus groups were also conducted with a total of 38 high school and middle school students.
…teens took an online survey of multiple choice, open-ended and short-answer-style questions…
Full details on page 25 of the PDF.
So, I can't really offer the answers we all want, but my gut feeling is that if library usage were a statistically significant activity for American internet users, the Pew Internet folks would have picked up on it and asked more detailed questions.
Sadly, I've been so slow to follow up on all this that it may not matter anymore. OCLC released its Perceptions of Libraries and Information Resources report in early December. The report revealed that patrons are generally happier using search engines than their libraries when asked to rate both in terms of volume, quality, speed, and overall experience.
This is scary to some, but good news for the libraries that are willing to take advantage of it. It means the tools, the access, and the information literacy are all coming together for our patrons. Now it's just up to us to participate.
I'll be talking about this in my ALA Midwinter presentation; see you in San Antonio.