Core system for national online science library

October 18, 2001

ITHACA, N.Y. -- Cornell University will build the central engine for the National Science Digital Library (NSDL), an online resource that will make high-quality source materials in science available to students from kindergarten through college.

The National Science Foundation (NSF) announced a grant of $1.56 million to Cornell's Digital Library Research Group to develop the core infrastructure for the project. This is software that will collect information from hundreds of sources in a wide range of formats and make it accessible to end users via the World Wide Web.

William Y. Arms, Cornell professor of computer science, is principal investigator for the project. Co-principal investigators are Carl J. Lagoze, Cornell digital library scientist; Dean Krafft, director of computing facilities in Cornell's Department of Computer Science; John Saylor, director of the Cornell Engineering Library; and Sarah E. Thomas, Cornell's C.A. Kroch University Librarian.

Cornell was one of several institutions submitting proposals for the core software. The Cornell group, working with Compaq Computer (now a division of Hewlett-Packard), will design a system of "portals" to provide access to a wide range of materials stored in many different formats, along with a catalog system and mechanisms others can use to create collections for specific purposes. Through the NSDL system, for example, educators in a particular state could find materials that enhance the state's public school curriculum and organize them for students and teachers.

"The goal is to make resources developed for research available to education," Arms explained. Many of the resources will be in what has been called the "deep web" -- information that previously has been available online only through the use of special software, or requiring specialized knowledge to navigate. The NSDL core will draw on a standard known as the Open Archive Initiative, spearheaded by Lagoze and Herbert Van De Sompel, former Cornell visiting assistant professor of computer science now at the British Library at St. Pancras in London. Using this format, the holders of digital collections can publish what librarians call "metadata," or data about data. In addition to showing what information is in a collection, the metadata will describe the format it which it is stored, and the NSDL core system will be able to use that information to access the data and pass it along to end users.

Collaborating on the legal and organizational aspects of the core project are Columbia University; the University of California, Santa Barbara; the Center for Intelligent Information Retrieval at the University of Massachusetts; and the San Diego Supercomputer Center. The University Corporation for Atmospheric Research will coordinate the overall effort.

The NSF also is funding a variety of projects nationwide to create collections of scientific materials for the online library. The prototype system already links to collections ranging from data on the deep structure of the earth and NASA planetary exploration photos to the biology of whales and a tour of the National Zoo. Also included are links to many resources for teachers.

When completed, the portal and cataloging system will run on computers at Cornell, but the library as a whole will be in a variety of locations around the country. The NSDL is scheduled to open to the public in the fall of 2002, with expansion planned over a four-year period. Arms believes the project will still be a work in progress for some 20 years.
-end-
Related World Wide Web sites: The following sites provide additional information on this news release. Some might not be part of the Cornell University community, and Cornell has no control over their content or availability.

o Cornell Digital Library Research Group: http://www.cs.cornell.edu/cdlrg/

o The NSF award site:

