Science Tools Corporation
Copyright © 1997 - 2023 Science Tools Corporation All rights reserved

Glossary of Terms

A B C D E F G H I J K L M N O P Q R S T U V W X Y Z

(Contact us if there is something you would like to see posted)

A

ACID properties
An important concept for databases. The acronym stands for Atomicity, Consistency, Isolation, and Durability. The ACID properties apply to both ODBMSs and RDBMSs.

The ACID properties of a DBMS allow safe sharing of data. Without these ACID properties, everyday occurrences such as using computer systems to buy products would be difficult and the potential for inaccuracy would be huge. Imagine more than one person trying to buy the same size and color of a sweater at the same time -- a regular occurrence. The ACID properties make it possible for the merchant to keep these sweater-purchasing transactions from overlapping each other -- saving the merchant from erroneous inventory and account balances.
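
As a minimal illustration of the idea - a hypothetical sweater-inventory example sketched with Python's built-in sqlite3 module, not any particular product - a transaction either applies all of its changes or none of them, so two overlapping purchases cannot both claim the last sweater:

    import sqlite3

    # Hypothetical inventory: sell one sweater by decrementing stock and
    # recording the order as a single atomic transaction.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE inventory (sku TEXT PRIMARY KEY, on_hand INTEGER)")
    conn.execute("CREATE TABLE orders (sku TEXT, qty INTEGER)")
    conn.execute("INSERT INTO inventory VALUES ('sweater-red-M', 1)")

    def buy(sku):
        try:
            with conn:  # opens a transaction; commits on success, rolls back on error
                cur = conn.execute(
                    "UPDATE inventory SET on_hand = on_hand - 1 "
                    "WHERE sku = ? AND on_hand > 0", (sku,))
                if cur.rowcount == 0:
                    raise RuntimeError("out of stock")  # triggers the rollback
                conn.execute("INSERT INTO orders VALUES (?, 1)", (sku,))
            return True
        except RuntimeError:
            return False  # neither table was changed

    print(buy("sweater-red-M"), buy("sweater-red-M"))  # True False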

Acronyms
What do those darn acronyms mean? Subsequent to the development of this glossary, which started in 1997, web search engines have done a pretty good job of documenting acronyms. We suggest that for any that are missing here, you try this site.

Adaptive-Orientation
An adaptive orientation is a perspective that supports both harnessing existing resources and learning the ways of the user and adapting to them.

AIST
Advanced Information Systems Technology

ANSI
American National Standards Institute - The ANSI process serves all standardization efforts in the United States by providing and promoting a process that withstands scrutiny, while protecting the rights and interests of every participant. In essence, ANSI standards quicken the market acceptance of products while making clear how to improve the safety of those products for the protection of consumers.

ARC
Applications Research Center

Automated Re-Processing
Repeatability is a crucial aspect of performing valid science and of validating science. When steps can be repeated exactly, they can be confirmed exactly. But sometimes things go wrong - a mistaken input, a hardware failure, etc. - and the ability to automate the precise re-execution of previous processing steps is a large benefit. Keeping the results easily distinguishable is a second vital aspect of automated re-processing.
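
A brief sketch of both points - exact re-execution and keeping the results distinguishable - using entirely hypothetical names and file layout (this is not how any Science Tools product records its steps):

    import json
    import subprocess
    from datetime import datetime, timezone
    from pathlib import Path

    def rerun(step_record_path):
        """Re-execute a previously recorded processing step exactly as before,
        writing its results to a new, run-specific directory."""
        record = json.loads(Path(step_record_path).read_text())
        # e.g. {"cmd": ["calibrate"], "inputs": ["scene-001.raw"], "output_root": "/data/runs"}
        run_id = datetime.now(timezone.utc).strftime("rerun-%Y%m%dT%H%M%SZ")
        out_dir = Path(record["output_root"]) / run_id
        out_dir.mkdir(parents=True)
        # Same command and same inputs; only the output location differs,
        # so the re-processed results remain easily distinguishable.
        subprocess.run(record["cmd"] + record["inputs"] + ["--out", str(out_dir)], check=True)
        return out_dir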

B

BigSur
The BigSur System is Science Tools' core product: a database-centric science system. Much of this web site discusses BigSur in detail.

Note also that BigSur was the name of the University of California, Berkeley, project in which The BigSur System was first developed. The term UniversityBigSur, therefore, refers to the non-commercial UCB implementation.

C

CA
Cooperative Agreement

COTS
Commercial Off-The-Shelf [Software] or Connection-Oriented Transport Service

D

DBMS
Data Base Management System - A Database Management System (DBMS) is a software system that is used both to create databases and to manage the information stored within them. The architecture of the DBMS will frequently determine or limit the possible uses of the databases it creates. Some DBMSs work best for creating single-user databases, while others can build databases that accommodate multiple users in larger corporate environments (provided by Information Technology Toolbox, Inc.).

DPS
Science Tools' Distributed Processing System consists of a database-centric architecture and programs called Processing Daemons. These daemons, either the DemandEngine or the EagerEngine, run on the various client nodes participating in the environment, look to the database for work, and dispatch processing when appropriate. This strategy inverts the perspective of other distributed processing strategies: the clients look for work, rather than a central node trying to dispatch work to them.
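
A minimal sketch of this pull model - hypothetical table and column names, with Python's built-in sqlite3 standing in for the real database; a production system would also need cross-node locking when claiming a job:

    import sqlite3, subprocess, time

    def processing_daemon(db_path, node_name):
        """Poll the database for pending work, claim one job, run it, report back."""
        conn = sqlite3.connect(db_path)
        while True:
            with conn:  # claim a pending job in one transaction
                row = conn.execute(
                    "SELECT id, command FROM jobs WHERE state = 'pending' LIMIT 1").fetchone()
                if row is not None:
                    job_id, command = row
                    conn.execute("UPDATE jobs SET state = 'running', node = ? WHERE id = ?",
                                 (node_name, job_id))
            if row is None:
                time.sleep(5)      # nothing to do; ask the database again later
                continue
            result = subprocess.run(command, shell=True)
            with conn:             # report the outcome back to the database
                conn.execute("UPDATE jobs SET state = ? WHERE id = ?",
                             ("done" if result.returncode == 0 else "failed", job_id))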

Database-Centrism
Database-centrism is an architectural paradigm in which the management of data - in a database system - is the central element. Database-centrism recognizes the core role that data - and hence information - plays in an enterprise.

E

ECS
NASA's EOSDIS Core System.

EDPM
Event Driven Processing Model - Traditionally used in hardware control settings, the Event Driven Processing Model posits that external actions, called events, are the most pertinent aspect driving computation.

End-To-End
End-To-End is a term first coined in the early 1990s by NASA during the implementation of the initial Mission To Planet Earth. The term has been used by different people for different purposes over the years, mostly incorrectly. We at Science Tools believe it should only be used to refer to the original concept: the handling of scientific data from when it is first created to when it is used by "end-users" in visualizations or for similar purposes. The term implies a work-flow model of processing of data, as well as aspects of care such as archival. Our BigSur System is specifically designed to handle the end-to-end challenge.

EOSDIS
NASA's Earth Observing System Data and Information System.

Epilogue
An Epilogue is that part of a processing function which cleans up after the main part of work is done. The Epilogue is responsible for updating meta-data repositories to report on the success of the process, what new objects were created, etc.
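
A sketch of where the epilogue fits relative to the prologue (defined later in this glossary); the process-wrapper and the catalog interface shown here are hypothetical, not the BigSur API:

    import tempfile
    from pathlib import Path

    def run_wrapped(science_step, catalog):
        """Hypothetical process-wrapper: prologue, the scientific work, then epilogue."""
        # Prologue: prepare the environment in which the process will run.
        workdir = Path(tempfile.mkdtemp(prefix="step-"))
        try:
            new_objects = science_step(workdir)   # the main part of the work
            # Epilogue: update the meta-data repository - report success and
            # register the new objects that were created.
            catalog.record(status="success", new_objects=list(new_objects))
        except Exception as exc:
            catalog.record(status="failed", error=str(exc))
            raise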

eScienceHubs
The fully distributed nature of Science Tools' BigSur System makes it possible to create interacting installations, each of which participates with one or more cooperating partner installations. We refer to these as eScienceHubs because they connect researchers along the lines of the metaphor of a wheel with a hub and some number of spokes. These hubs enable researchers to create places where collaboration can take place, where all members of the scientific enterprise can interact in real-time.

F

FGDC Meta-Data Standard
The Federal Geographic Data Committee Meta-Data Standard is...

"The Content Standard for Digital Geospatial Metadata was developed to identify and define the metadata elements used to document digital geospatial data sets for many purposes. These include metadata to: 1) preserve the meaning and value of a data set; 2) contribute to a catalog or clearinghouse and; 3) aid in data transfer."

Further information can be obtained here.

G

Grid
"Grid" computing provides a High-Performance-Computing system for work-flow, utility, EDPM, peer-to-peer, Clustering and Super-Computing purposes. Check out Science Tools' presentation on "Grid Computing - An Introduction" (pps - 81K) given to the ESIP Federation during its 2005 Winter Conference in Washington D.C., January 4-6 2005 by our Chief Scientist, Richard Troy.

H

High-Performance, Distributed, Scalable Architecture
A distributed processing system which permits scientific functions to be performed on any system in the network, as desired, with both process-level and system-level controls for operations staff to manage workload.

L

Learning
Learning is the process by which data is collected for the purpose of applying appropriate actions at a subsequent time. We refer to our software as having the attribute of learning; by this we mean that it can be taught, by providing it with information, such that it can later perform the appropriate actions as a means to implement a functioning system.

LIMS
Library Information Management System or Laboratory Information Management System

Lineage
Lineage is the collective history of the creation of a data object. Every data object has a lineage - it always has parents! At Science Tools we take the lineage of objects seriously because it is the foundation upon which scientific defensibility rests in a computing environment. Without a known lineage, scientists and researchers cannot be sure of their data. With it, however, all manner of possibilities may be explored to determine the correctness of algorithms and other aspects of the science.
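
A small illustration of the concept - hypothetical structures, not the BigSur schema - in which every derived object records the process that created it and its parent inputs, so its complete history can be walked back to the original data:

    from dataclasses import dataclass, field

    @dataclass
    class DataObject:
        name: str
        process: str = "ingest"                  # the process that created this object
        parents: list = field(default_factory=list)

        def print_lineage(self, depth=0):
            """Print this object's creation history back to its original inputs."""
            print("  " * depth + f"{self.name}  (created by {self.process})")
            for parent in self.parents:
                parent.print_lineage(depth + 1)

    raw = DataObject("scene-001-raw")
    calibrated = DataObject("scene-001-cal", process="calibrate", parents=[raw])
    mosaic = DataObject("region-mosaic", process="mosaic", parents=[calibrated])
    mosaic.print_lineage()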

M

Meta-Data Management
Meta-data is the data about the data. Meta-data is a critical but often overlooked aspect of performing science. In the case of Science Tools' products, we usually refer to meta-data management in the context of the processing details that constitute a known "lineage" for each individual scientific object: all the types and collections to which it belongs, the processes that created it, and the parental input objects to those processes.

N

Named Database Connection - NDC
A Named Database Connection is a set of data used to inform a system how to connect to a given database. The user merely refers to the connection by name; the connection data itself is stored in such a way that the user cannot examine it, but may utilize it if so authorized. Named Database Connections, therefore, provide both convenience and improved security.
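
A minimal sketch of the idea - the store, names, and authorization check here are hypothetical, not the actual Science Tools implementation - showing how a caller supplies only a connection name while the connection details stay out of the caller's sight:

    import sqlite3

    # Hypothetical protected store of connection details, keyed by name.
    # In a real system this lives where ordinary users cannot read it.
    _NAMED_CONNECTIONS = {
        "ops_catalog": {"database": "/srv/catalogs/ops.db"},
    }

    def connect_by_name(name, authorized):
        """Open a database connection by name without exposing its details."""
        if not authorized:
            raise PermissionError(f"not authorized to use connection '{name}'")
        details = _NAMED_CONNECTIONS[name]   # never returned to the caller
        return sqlite3.connect(details["database"])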

NewDISS or New-DISS
In the next ten to fifteen years, NASA will deploy a new system for the dissemination of Earth Science data. This system, currently called NewDISS, will replace the existing system, EOSDIS, with a network that will increase the involvement of the participating Earth scientists and exploit existing and developing network capabilities to a further extent than currently possible.
Citations: "Exploit existing and developing network capabilities to a further extent than currently possible," Dr. Mark Mainer; "Technology Roadmap for NewDISS," dated 1 April 2000, prepared for The Earth Science Technology Office, NASA Goddard Space Flight Center, by The Aerospace Corporation, Dr. Mark Mainer.

NFS
Network File System

P

Progressive-Utilization
Progressive Utilization is an attribute of a system that permits users to pick and choose features as desired, and does not force the use of aspects of the system that are not otherwise useful to the user.

Prologue
The prologue is that part of a processing step which prepares the environment for a process to run. Most commonly, a prologue is implemented as a function within the context of a process-wrapper that encapsulates a scientific process.

R

Relational Database Management System (RDBMS)
The Relational Data Model, developed by E. F. (Ted) Codd in 1969, allows multiple tables to be related to one another within a database. For example, one customer's information could be recorded in separate tables such as "Personal Information", "Marketing Efforts", and "Service Requests". The information stored in these tables then relates back to the customer's main record. A relational database management system (RDBMS) also offers flexibility in terms of how the customer's data can be viewed. To access the information stored in relational databases, users can either build queries using the Structured Query Language (SQL), or they can utilize a user interface that translates their requests into SQL and displays the results. While the American National Standards Institute (ANSI) approved an early version of SQL as a standard, many RDBMSs also use customized, proprietary forms of the language (provided by Information Technology Toolbox, Inc.).

RDBMS development continued in the early 1980s, led in large part by Science Tools' own Professor Michael Stonebraker at the University of California at Berkeley. The best known effort was the Ingres project, which was the foundation of no less than three commercial implementations - the offerings of Ingres Corporation, Oracle Corporation, and Britton Lee. RDBMSes relate tables of information: each table is like a file, each row is like a line, and each attribute is like a field. The proper terminology in the relational database world is table, row, and attribute, whereas the older file-system based information systems have files, records or lines, and fields. Database systems are said to have the ACID properties - that is, the transactions they perform must be Atomic, Consistent, Isolated, and Durable.
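
Continuing the customer example above, a small sketch using Python's built-in sqlite3 module shows two related tables and a query that joins a service request back to the customer's main record (the table and column names are invented for illustration):

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT);
        CREATE TABLE service_request (id INTEGER PRIMARY KEY,
                                      customer_id INTEGER REFERENCES customer(id),
                                      summary TEXT);
        INSERT INTO customer VALUES (1, 'A. Researcher');
        INSERT INTO service_request VALUES (10, 1, 'password reset');
    """)

    # Each row of service_request relates back to its customer's main record.
    rows = conn.execute("""
        SELECT customer.name, service_request.summary
        FROM customer JOIN service_request ON service_request.customer_id = customer.id
    """).fetchall()
    print(rows)   # [('A. Researcher', 'password reset')]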

S

SAIF
Canadian version of the U.S. FGDC Meta-Data Standard

SAN
Storage Area Network or System Area Network

SDPS
Scientific Data Processing Segment (SDPS) - The Science Data Processing Segment is a NASA construct which has the responsibility to meet the objectives of the EOS mission in data product search and ordering, processing, archiving, and distribution. The SDPS, or portions of it, is an integral part of each of the EOSDIS Distributed Active Archive Centers (DAACs). The SDPS produces Standard Products, including browse products such as subsetted, subsampled, and summarized data sets, as well as the associated metadata. These products are created during routine production processing and on demand in response to user requests. The SDPS supports data quality to meet all authorized scientists' needs, supports the integration and testing of algorithms and associated software, stores all data sets and supporting software for the duration of the mission, and provides user access to data and related information. The term "algorithm" refers to software delivered to the SDPS by a science investigator (PI, TL, or II) to be used as the primary tool in the generation of science products. This incorporates executable code, source code, and job control scripts, as well as documentation. The SDPS also provides software toolkits to EOSDIS users. These toolkits offer an integrated package of routines, supplied by the ECS elements, that cover a wide range of common user services including data visualization tools, data transformation tools, data access services, and algorithm development resources.

SMP
Symmetric MultiProcessing

SQL
SQL is the Structured Query Language. First adopted as a standard in the 1980s, SQL has emerged as the singular vocabulary for manipulating data in the context of relational database management systems (RDBMSes). Our products require conformance to the SQL92 standard.

SRPP
Scientific Research Platform Provider - a NASA construct associated with the EOS mission. Please see SDPS above, as SRPP is a related term.

SRSP
Scientific Research Service Provider - a NASA construct associated with the EOS mission. Please see SDPS above, as SRSP is a related term.

 

 

 