need to change your existing database, operating system, hardware,
file systems, RDBMSes, and so on.
daemons look for work to be done and automatically initiate processing
automated processing on a single computer, within a site, or across
processes and restores connections as configured by the administrator.
processes that may have failed, been interrupted, or aren't really ready
between processes may be explicitly declared.
a work-flow through parent-child relationships and their processes.
may be assigned to groups and daemons.
load balancing and directs particular work to particular systems.
processes from the parent.
processing to continue when a daemon is shut down, or the originating
object isn't found.
to common meta-data repositories to find work.
the environment to be physically distributed and logically joined.
resources are available, the client looks for work rather than having
a coordinator look for a free machine.
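The pull model above can be sketched in a few lines of Java. This is an illustrative sketch only, not the product's actual code: the class and method names are invented, and a shared `BlockingQueue` stands in for the meta-data repository that real clients poll for work.

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.TimeUnit;

// Illustrative sketch of the pull model: an idle client polls a shared
// work source instead of a coordinator pushing jobs to machines.
// The queue stands in for the shared meta-data repository.
public class PullWorker implements Runnable {
    private final BlockingQueue<Runnable> workQueue;
    private volatile boolean running = true;

    public PullWorker(BlockingQueue<Runnable> workQueue) {
        this.workQueue = workQueue;
    }

    public void shutdown() { running = false; }

    @Override
    public void run() {
        while (running) {
            try {
                // When this node has spare resources, it asks for work.
                Runnable job = workQueue.poll(100, TimeUnit.MILLISECONDS);
                if (job != null) job.run();
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                return;
            }
        }
    }
}
```

Because each client decides when to poll, a slow or busy node simply asks for work less often; no central scheduler needs to model every machine's load.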
the generation history, or lineage, of your scientific objects.
again struggle to recall just how a particular result was obtained.
details of how each object was created are preserved.
easy repeatability of processing steps with single changes.
and confirms findings.
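Lineage tracking of this kind can be modeled as a graph in which each derived object records the process that produced it and the objects it was derived from. The sketch below is a hypothetical illustration (class and file names invented), showing how the full derivation history can then be walked back to the sources.

```java
import java.util.ArrayList;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

// Illustrative sketch of lineage recording: each derived object keeps a
// reference to the processing step that created it and to its parent
// objects, so the derivation history can be reconstructed on demand.
public class LineageNode {
    final String name;
    final String producedBy;          // the processing step that created it (null for sources)
    final List<LineageNode> parents;  // the inputs it was derived from

    public LineageNode(String name, String producedBy, List<LineageNode> parents) {
        this.name = name;
        this.producedBy = producedBy;
        this.parents = parents;
    }

    // Walk ancestors first, then this node, so steps come out in
    // derivation order from the original sources to this object.
    public List<String> history() {
        Set<String> steps = new LinkedHashSet<>();
        collect(steps);
        return new ArrayList<>(steps);
    }

    private void collect(Set<String> steps) {
        for (LineageNode p : parents) p.collect(steps);
        if (producedBy != null) steps.add(name + " <- " + producedBy);
    }
}
```

With every intermediate object recorded this way, re-running a pipeline after changing one step is a matter of replaying the recorded history from that step forward.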
hardware, software, people, data and knowledge into one system.
unify what they already have.
connectivity between instances of data and meta-data.
collaborate to preserve data and meta-data permitting an uninterrupted
flow from data source, through all intermediate processing steps
to final use.
you to share and track results or works-in-progress automatically
with anyone (collaborators or the public), and provides controls
and tracking so you know what is going on.
demand, on a schedule, or as things are created.
secure connection services between client applications and services
resources by providing flexible connectivity capabilities on the
authorized piercing of firewalls and gateways, and provides further
redirection of a connection to any destination within a private
also be used in other applications which require secure transport
between components that communicate using the TCP/IP port mechanism.
Such applications are widespread and include uses such as video
encrypted, point-to-point, destination-end control and on-the-fly
target node does not have to be the final destination of the connection,
though encryption will end at that point in the network connection.
be used to augment existing network security strategies such as
VPN by filling a need where VPNs may be inappropriate or impossible
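The core of such port-level redirection is a relay that accepts a client connection and copies bytes onward to a target. The sketch below is a bare, unencrypted illustration (class and method names invented, not the product's code); the real service wraps this kind of loop in encryption and authorization.

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.ServerSocket;
import java.net.Socket;

// Minimal sketch of point-to-point TCP redirection: accept one client,
// connect onward to a target, and copy bytes in both directions.
// A production tunnel would add encryption and access control here.
public class PortRelay {
    private final ServerSocket server;
    private final String targetHost;
    private final int targetPort;

    // Pass listenPort 0 to let the OS pick a free port.
    public PortRelay(int listenPort, String targetHost, int targetPort) throws IOException {
        this.server = new ServerSocket(listenPort);
        this.targetHost = targetHost;
        this.targetPort = targetPort;
    }

    public int localPort() { return server.getLocalPort(); }

    // Relay a single connection, then close the listener.
    public void relayOnce() throws IOException {
        try (ServerSocket s = server;
             Socket client = s.accept();
             Socket target = new Socket(targetHost, targetPort)) {
            Thread up = pump(client.getInputStream(), target.getOutputStream());
            Thread down = pump(target.getInputStream(), client.getOutputStream());
            try { up.join(); down.join(); } catch (InterruptedException ignored) { }
        }
    }

    // Copy bytes from in to out until end-of-stream.
    private static Thread pump(InputStream in, OutputStream out) {
        Thread t = new Thread(() -> {
            byte[] buf = new byte[8192];
            try {
                for (int n; (n = in.read(buf)) != -1; ) {
                    out.write(buf, 0, n);
                    out.flush();
                }
            } catch (IOException ignored) { }
        });
        t.start();
        return t;
    }
}
```

Because the relay only sees a byte stream on a port, anything that speaks TCP/IP can be redirected this way, which is why the technique generalizes beyond any one application.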
users the ability to write applications once and have them operate
against any modern RDBMS engine, even though every database
has its own unique dialect.
migration of applications from one database vendor to another, or
can enable write-once-run-anywhere applications.
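One small example of the dialect differences being smoothed over is row limiting, which each engine spells differently. The class below is a hypothetical illustration (not the product's API) of generating one logical query per vendor; the vendor names follow the certifications listed below.

```java
// Hypothetical illustration of hiding SQL dialect differences: the same
// logical "first n rows" query, rendered in each vendor's syntax.
public class DialectSql {
    public static String withRowLimit(String baseSelect, String vendor, int n) {
        switch (vendor.toLowerCase()) {
            case "postgres":
                // PostgreSQL uses a trailing LIMIT clause.
                return baseSelect + " LIMIT " + n;
            case "db2":
                // DB2 uses the SQL-standard FETCH FIRST clause.
                return baseSelect + " FETCH FIRST " + n + " ROWS ONLY";
            case "oracle":
                // Classic Oracle syntax wraps the query and filters on ROWNUM.
                return "SELECT * FROM (" + baseSelect + ") WHERE ROWNUM <= " + n;
            default:
                throw new IllegalArgumentException("unsupported vendor: " + vendor);
        }
    }
}
```

An application written against the abstraction issues the logical query once; the translation layer picks the right rendering for whichever certified engine is underneath.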
journaling services to all RDBMSes including logging and reloading.
of this writing, certifications exist for Oracle, Postgres, Informix,
Sybase and DB2. Others are pending, so contact us for current information.
connections are described in the file system and are protected by
file system access privileges.
simplified database access and isolation of sensitive configuration
data from unprivileged users.
users' TRUE identities based on privileges and permissions.
55,000 lines of Java including 1,100 public methods
of Java classes and methods written to make the job of implementing
the very latest concepts in threat detection and avoidance, including
using multiple, simultaneous methods of user identification and
system identification to reduce risk.
commands, processes, or any form of an application that can speak
to every implementation of the STDB database schema.
Processing Features enabling remote management of processes and
daemons with digitally signed commands for security.
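Digitally signed commands can be sketched with the standard `java.security.Signature` API: an administrator signs the command text with a private key, and a daemon verifies the signature with the matching public key before acting. This is an illustrative sketch with invented names, not the product's actual protocol.

```java
import java.nio.charset.StandardCharsets;
import java.security.GeneralSecurityException;
import java.security.PrivateKey;
import java.security.PublicKey;
import java.security.Signature;

// Sketch of signed management commands: a daemon only executes a command
// whose signature verifies against the administrator's public key.
public class SignedCommand {
    // Sign the command text with the sender's private key.
    public static byte[] sign(String command, PrivateKey key)
            throws GeneralSecurityException {
        Signature sig = Signature.getInstance("SHA256withRSA");
        sig.initSign(key);
        sig.update(command.getBytes(StandardCharsets.UTF_8));
        return sig.sign();
    }

    // Verify the command text against the signature and public key.
    // Any tampering with the command makes verification fail.
    public static boolean verify(String command, byte[] signature, PublicKey key)
            throws GeneralSecurityException {
        Signature sig = Signature.getInstance("SHA256withRSA");
        sig.initVerify(key);
        sig.update(command.getBytes(StandardCharsets.UTF_8));
        return sig.verify(signature);
    }
}
```

Because only the command text and signature travel over the network, the scheme works even across untrusted links: a forged or altered command simply fails verification and is dropped.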
support, enabling your code to run unchanged against nearly any
managers can set up required logging which cannot be turned off by
duplicate copies as distinct from named objects (object oriented)
found scattered throughout an environment.
duplications allows administrators to choose how many copies of
objects they want, where, and when.
intended to replace platform specific disk backup and recovery tools,
but rather to augment them with a capability of managing individual
objects through time.
distributed, multi-site architecture: Source files may come from
any network and may be stored on any network, with the meta-data
about them being available on a third system.
the environment prior to calling C and Java applications
clever users from spoofing applications prior to execution, including:
desktop tools (Excel), Java-based GUI interfaces, interactive command-line
access, batch processing, and IDEs.
known vulnerabilities in environment variables such as PATH, CLASSPATH,
LD_LIBRARY_PATH, and so on, and explicitly sets the variables before
calling application level programs.
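Scrubbing the environment before launching a child process can be sketched with the standard `ProcessBuilder` API. This is an assumed illustration, not the product's actual code: the class name and the trusted values shown for PATH and CLASSPATH are invented.

```java
import java.util.Map;

// Sketch of launching a child process with a scrubbed environment: the
// inherited environment is cleared and only explicitly chosen, trusted
// values are set, so a caller cannot smuggle in a malicious search path.
public class CleanLauncher {
    public static ProcessBuilder sanitized(String... command) {
        ProcessBuilder pb = new ProcessBuilder(command);
        Map<String, String> env = pb.environment();
        env.clear();                                   // drop everything inherited
        env.put("PATH", "/usr/bin:/bin");              // explicit, trusted value (example)
        env.put("CLASSPATH", "/opt/bigsur/lib/app.jar"); // hypothetical path
        return pb;
    }
}
```

Clearing first and then setting known-good values is safer than removing a blacklist of variables, since an attacker-controlled variable that was never anticipated is dropped along with everything else.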
real and effective user-IDs are tracked
application's process ID (not available in Java)
comprehensive environment for science, research, grid computing, high-performance
computing, development, and enterprise management.
Processing System for "Grid" computing provides seamless
application access via ANY supported database. Any component (e.g.,
processes, objects, data, meta-data) may exist on any
to learn about the holdings of other systems to manage their data
and/or meta-data to automate collaborative access. Runs your processes
so you won't be embarrassed when you lose your key staff (intern,
grad student) and "forget" how.
and teaches processing functions useful to your discipline so you
can build a library of capabilities, accessible from anywhere and
runnable by anybody to whom you grant permissions.
to conventions and paradigms of the users (e.g., data-types, associative
relationships, processes)
use as few capabilities as desired without being forced to use
unnecessary components or incorporate inter/intra/multi-disciplinary
environments/projects (virtual organizations).
data lineage from source to use and vice versa!
system-management functional tool for The BigSur System in a
Graphical User Interface form.
for those who aren't comfortable with a command-line interface.
Infrastructure (organizations, suites, people, computers), Processes
(process definitions, input parameters, output results), Objects
(individual or in sets, object data and/or meta-data), Relationships
(object to object, object to process and process to process), Databases
(DB access, logs, journaling), Security (individuals, objects, network,
direct control of the Distributed Processing System (DPS).
processing to the most appropriate host node(s).
network-reachable system can participate in performing work, even
over untrusted networks.
super-computing systems may be created using new computers, or any
existing systems, including lower-valued, fully depreciated (scrapped)
together a heterogeneous collection of systems.
all the DPS daemons on a single system
the administrator a means to manage processing on all systems, no
matter which node they are logged in on.