Open Security - usnistgov/xslt-blender GitHub Wiki

These three things go together. Ultimately, none of them is possible without the others.

Possible point of interest: Tim Berners-Lee on the "Rule of Least Power" (Wikipedia); any other citations regarding the security aspects of OSS.

Standards

A standardized declarative syntax for encoding, at any layer (data description or processing), provides for the semantic integrity of the data over the long term and over the horizon. In an XML-based system, at least five layers of standardization apply (let us count them):

  1. Text / character encoding - Unicode and its implementations, especially UTF-8 (used here)
  2. Markup / XML - a syntax for descriptive text encoding
  3. Markup application layer - standardized vocabularies such as NISO STS or NIST OSCAL
  4. XSLT processing - defined by the W3C, testable in commodity software
  5. Web context - HTML, JavaScript, CSS, and the DOM APIs - a capable environment for execution (further layering is possible)
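To make the layering concrete, here is a minimal sketch: a one-template XSLT 1.0 stylesheet that renders a small XML record as HTML. The element names (`record`, `title`) are invented for illustration, not taken from any real vocabulary; any conformant processor, including the XSLT engine built into web browsers, should accept it.

```xml
<!-- Layers 1-2: the input is a UTF-8 encoded, well-formed XML document, e.g.
     <record><title>Open Security</title></record>                            -->

<!-- Layers 3-5: a standardized XSLT 1.0 stylesheet producing HTML for the web context -->
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="/record">
    <html>
      <body>
        <h1><xsl:value-of select="title"/></h1>
      </body>
    </html>
  </xsl:template>
</xsl:stylesheet>
```

Each layer here is independently standardized, so each can be validated, replaced, or reimplemented without disturbing the others.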

Evanescence of implementations

The half-life of a web site is not known (is it? research needed). Government web sites are not necessarily longer-lived than private, academic, or corporate sites. Undoubtedly there is a huge range.

The maintenance burden of a web site is the largest contributor to its 'morbidity and mortality'. In this context it must be kept in mind that the maintenance burden of a well-designed system will often be significantly lower, and that this margin matters more at greater scales. So the application context is extremely variable in this respect as well.

A site that requires no back end may stay viable for years, since there is nothing to break. As long as the technology to serve pages over the Internet (even if only a local network) is available, the dependency on keeping current software applications up to date can be shifted to the client or receiver. At the moment, the particular stack needed to run these projects is widely available. It may also be possible to 'cache' it in some form in the public domain -- work to be done.

One risk of 'things working well' is the softening of skills, since there is less work to do and fewer chances to practice. However, this should be taken not as a recommendation that things not work well, but as a recognition of the costs and risks that come with success. We avoid the softening of skills by providing a platform where they can be sharpened through further exploration and testing. By definition, information silos and self-contained application stacks hinder this, if they do not make it impossible altogether. The cost of slowing down change is a loss of adaptability.

Accordingly, we see standards as an essential response to the systemic risk of unsustainable siloing and dependencies. Simply stated, standards provide a positive externality with immediate n-squared (network) effects: any two or more parties to an exchange no longer have to negotiate a format.

Open source software (OSS)

By its nature, OSS mitigates some risks of its own, as well as some of the same risks that standards address. A robust open-source implementation may ultimately be viable on its own, if it has a community to support it. Even where communities do not take responsibility for innovation ahead of their vendors, proprietary vendors themselves benefit from the availability of open source.

Security

CIA: confidentiality, integrity, accessibility.

A proprietary binary or a bespoke database format is not accessible in this sense: the application on which it depends becomes part of the meaningful (duplicable) information set.

Among these, accessibility actually comes first, since if data becomes inaccessible, integrity and confidentiality are no longer at issue.

Exposing the data in a legible format that can withstand changes in underlying technologies (in detail) is a sine qua non of data security. The only way to ensure that the trouble and expense of maintaining confidentiality and integrity are not wasted is to ensure the data set's continued accessibility and legibility.
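To make the point concrete: a plain-text XML record stays legible with nothing more than a standard-library parser, with no bespoke application required. A minimal Python sketch (the record content and element names are invented for illustration):

```python
import xml.etree.ElementTree as ET

# A UTF-8, plain-text record: readable by eye and by any conformant XML parser.
# The element names here are illustrative, not drawn from any real data set.
record = (
    b'<?xml version="1.0" encoding="utf-8"?>'
    b'<record><title>Open Security</title></record>'
)

# The data remains accessible without the application that produced it.
root = ET.fromstring(record)
title = root.findtext("title")
print(title)
```

Contrast this with a proprietary binary: recovering the same title would require preserving (and running) the originating application alongside the data.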