Marcus P. Zillman, M.S., A.M.H.A. Author/Speaker/Consultant
Internet Happenings, Events and Sources


Friday, June 25, 2004  

THE DEEP WEB
http://snipurl.com/78rr

Because search engines skim only the top layers of Web pages, they miss most of what's available on what is called the "deep Web," and there may be as many as 500 billion Web pages hidden from the view of most search engines. Paul Duguid, co-author of "The Social Life of Information," says: "Google searches an index at the first layers of any Web site it goes to, and as you delve beneath the surface, it starts to miss stuff. When you go deeper, the number of pages just becomes absolutely mind-boggling." Librarians are now working with Google and other search engines to solve that problem. Daniel Greenstein of the California Digital Library, the digital branch of the University of California, notes: "If you could use Google to just look across digital libraries, into any digital library collection, now that would be cool. It would help libraries achieve something that we haven't yet been able to achieve by ourselves, which is to place all of our publicly accessible digital library collections in a common pool." (New York Times, 21 June 2004) [NewsScan Daily, 21 June 2004] This has been added to Deep Web Research Subject Tracer™ Information Blog.

posted by Marcus Zillman | 4:05 AM