Drilling down: Open Standards

What standards are and what it means to develop them openly


[This post was co-written by Kaliya Young and Juan Caballero]

In our last post in the series, we drilled down into a granular definition of “open-source” development and the thinking that goes into a choice of license. In this post, we drill down into what “standards” are, and the characteristics of an “open process” for developing standards. Supporting these open standards is where the bulk of DIF’s efforts and resources are focused. In the next post in the series, we will turn to how open source and open standards work together to create new business models and strategies with real-world consequences for cooperation, “coöpetition,” and healthy markets.

[Photo by Jim Quenzer: a tidy tinker's workshop]

It is worth noting up front that the term standard has two slightly different usages. One is related to quality assurance or business process compliance — think of marketing that references “the highest standards of _____ ”. This refers to specifications and metrics used to grade outputs in a regulated industry or sector, like “Grade A Beef”. These are set and enforced by some combination of regulators, private-sector auditors, and industry associations. Outside of software, this is usually what people mean by “standardization,” and a specialist in any industry can wax eloquent on the politics and consequences of the decisions by which standards bodies fix those specifications and metrics.

In software and other IP-driven industries like medicine or engineering, standards have more to do with control and portability of data, enforcing measurable compatibility with the products of others. A common metaphor for this kind of standardization is the width or “gauge” of railway tracks — how far apart the rails are is a somewhat arbitrary decision, but if it differs between two countries or regions, they will have completely distinct rail systems. Software standards work much the same way, and for this reason standardization is often a prerequisite of procurements from government or substantial investments in the private sector. No one wants to invest in locomotives if all the places they want to take them… use different rails.

In the software world, as in the world of trains, standards define a given market for products and services. Compliance tests make objective (and far less controversial) the question of whether or not a given product meets a given set of requirements. Explicitly-defined, testable protocols make products provably swappable and/or interoperable. Open standards processes try to define those tests and protocols in the open, with input from initial and future contenders in that market, speeding up the timeline to legitimacy by incorporating major players and widely-sourced input.
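
To make this concrete, here is a minimal sketch (in Python) of what such a compliance test can look like. The parse_record function and the test vectors are invented for illustration, but the pattern is the general one: the spec publishes shared vectors, and any implementation that reproduces the expected outputs can claim conformance, regardless of how it is built internally.

```python
# Hypothetical test vectors published alongside a spec:
# (input defined by the spec, output the spec requires)
TEST_VECTORS = [
    ("WIDTH=1435", {"width_mm": 1435}),  # "standard gauge"
    ("WIDTH=1000", {"width_mm": 1000}),  # "metre gauge"
]

def check_compliance(parse_record):
    """Run a candidate implementation against the spec's test vectors."""
    failures = []
    for raw, expected in TEST_VECTORS:
        actual = parse_record(raw)
        if actual != expected:
            failures.append((raw, expected, actual))
    return failures  # an empty list means the implementation passes

# One of potentially many independent implementations.
def parse_record(raw):
    _key, value = raw.split("=")
    return {"width_mm": int(value)}

assert check_compliance(parse_record) == []
```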

Standards processes, as inherited from tangible industries

One way to explain standards processes is to begin with some examples in the physical world, which we learned about as matters of fact in our education. These evolved to support the creation of precise manufacturing methods and to support more seamless commerce, giving stability and safety to commodity markets. Weights and measures are classic standards that support both: after all, there is nothing natural about our units of measure or currencies, as anyone who’s used both metric and non-metric measures knows all too well. How long is something? How heavy is something? How much liquid is in a gallon or liter?

Standardizing these kinds of measures was a quantum leap for commerce and mercantilism: it gave everyone a common reference point and enabled accounting systems (and “ledgers”) of vastly wider scope and simplicity. The fact that these standards are decided at the international level means this can happen on a global scale. The metric system, for example, is defined by the International Bureau of Weights and Measures, an international standards development organization (SDO). There is little debate in 2020 about what a gram is (it’s the mass of one milliliter of water), but things like tolerance and accuracy in weighing and marking systems are still ongoing matters of debate and specification carried out there.

Another physical-world standard is the shipping container. A whole global infrastructure has been built around this standard-sized container that allows it to be put on truck beds, shipped on train cars, and loaded onto ships that go around the world. It also means that if you can fit your goods inside that box, they can get almost anywhere in the world, because there is a standards-based infrastructure that can handle them. Massive economies of scale (which have terraformed geopolitics by enabling high-throughput, high-efficiency global trade networks) are unlocked by this kind of standard, which lets the movement of containers (in most cases with no direct knowledge of what is inside them) become a kind of commodity whose price stabilizes and steadies far-flung trade. The analogy to the “packets” or “data points” of modern information technology has been a mainstay of thinking about software business models for decades, and W3C Verifiable Credentials are no exception.
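
As a rough illustration of that analogy, here is what such a “standard container” for data can look like: a sketch of a W3C Verifiable Credential, loosely following the published data model. All values below are placeholders, not a real, signed credential.

```python
# A standardized envelope around an arbitrary payload (credentialSubject),
# so that any conforming verifier can process it without caring what
# "cargo" is inside.
example_credential = {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    "type": ["VerifiableCredential"],
    "issuer": "did:example:issuer123",
    "issuanceDate": "2020-01-01T00:00:00Z",
    "credentialSubject": {
        "id": "did:example:subject456",
        # the "cargo": whatever claims the issuer wants to make
        "alumniOf": "Example University",
    },
    "proof": {"type": "Ed25519Signature2018"},  # signature details omitted
}
```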

Similar standards also govern the electricity coming out of the walls in our houses, which has become so reliable and ubiquitous over the last century that few people outside of the relevant industries think much about it, or how dangerous it would be if standards were loosened. The “amount” of electricity delivered (its voltage and frequency), as well as the physical form factor of plugs, wiring, and circuit boards, are all standardized at the national or regional level. This creates regional economies of scale in both the delivery of energy as a resource and in the manufacturing of electricity-powered products. Indeed, much of software engineering as an academic discipline and a labor market, as well as many standards around data and their governing standards bodies, evolved out of the electrical and communications infrastructure that preceded the advent of modern software.

Standards processes, for bits and bytes

Digital technology also needs explicit and testable standards, deliberated by specialists and engineers in a transparent process for the common good and for the stability of huge systems of capital and human effort. As the internet has evolved, the bulk of this effort has focused on the definition of common protocols that allow information to be exchanged by different computer systems, potentially written in very different languages and operating across very different topographies, with very different inputs and automations and governance structures.

The protocols that make up the modern internet were originally created by the group that began building the ARPANET. In 1986 the Internet Engineering Task Force (IETF) formed, and it is still the steward of many key protocols that form the basis of much of the internet, particularly around security and load-balancing at massive, infrastructure scale. The World Wide Web Consortium (W3C) was formed in 1994 and works to develop the software standards for the World Wide Web’s core technologies: browsers and servers.

One example we all use every day is email, or as it was once known, “electronic mail”. How addressing and discovery work, the limits and parameters of a universally-recognizable address, and so on are all written up in authoritative specification documents. Colloquially, these documents are often referred to as “the standard,” or “the RFCs” (“Request[s] for Comment,” referring to the collective editorial process by which standards are written). Email is typical, however, in that a patchwork of multiple interlocking protocols is actually required to send and receive messages.
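
As a small illustration of why a shared addressing format matters, here is a sketch using Python’s standard library, which ships one of many independent implementations of the relevant RFCs. Because the format is standardized, completely unrelated implementations agree on how an address is parsed and re-serialized.

```python
# Parse and re-serialize an address per the standardized format (RFC 5321/5322).
from email.utils import formataddr, parseaddr

display_name, address = parseaddr("Ada Lovelace <ada@example.org>")
assert address == "ada@example.org"

# Round-tripping through the standard form yields the same wire format,
# regardless of which compliant library produced it.
assert formataddr((display_name, address)) == "Ada Lovelace <ada@example.org>"
```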

Although this list of protocols needs to move slowly to give the end-consumer stability and assurance, the list is actually in a state of permanent minor flux, as individual protocols are iterated and upgraded, support for older versions fades away, and new protocols are added that take advantage of security or performance enhancements reaching critical mass elsewhere. For decades, the dominant protocols in email have been Simple Mail Transfer Protocol (SMTP), Internet Message Access Protocol (IMAP), and underlying handshake/transport protocols like Transport Layer Security (TLS). As you can see by clicking any of these links, the “standard” (current best practices and specifications) is actually a patchwork of iterating, component specifications with narrower scope, allowing a kind of modular and incremental evolution that lets markets and applications phase components and subsystems in and out over time without interruptions to service or sudden changes in user experience.
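
A sketch of how that patchwork interlocks when sending a single message is below: SMTP carries the mail (RFC 5321), STARTTLS upgrades the connection to TLS (RFC 3207), and the message itself follows the standard format (RFC 5322). The host, port, and credentials are placeholders, not a working configuration.

```python
import smtplib
import ssl
from email.message import EmailMessage

# Build a message in the standard format (RFC 5322).
msg = EmailMessage()
msg["From"] = "ada@example.org"
msg["To"] = "charles@example.net"
msg["Subject"] = "Interoperability"
msg.set_content("Different servers, same protocols.")

context = ssl.create_default_context()
with smtplib.SMTP("smtp.example.org", 587) as server:   # SMTP submission
    server.starttls(context=context)                     # upgrade to TLS
    server.login("ada@example.org", "app-password")      # placeholder credentials
    server.send_message(msg)                              # deliver via SMTP
```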

Another example we use every day is “web standards,” i.e., HTML, HTTP and CSS. These protocols are the standards that let any web server present information according to prescriptive formats which will be displayed in roughly the same way by any compliant web browser. (Here, as in weights and measures or electricity, there are always slippages and margins for error, as any front-end developer can tell you.) This enables a diversity of web servers and a diversity of browsers. There are of course open-source examples of each (the Apache HTTP Server and the Firefox browser being two examples), but there are proprietary versions of them as well.
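
As a small illustration, any compliant client can ask any compliant server for a page and get back a response it already knows how to read; the request and response shapes are fixed by the HTTP standard, and the body is (usually) HTML that any conforming browser would render in roughly the same way. The sketch below uses Python’s standard library against example.com.

```python
import http.client

conn = http.client.HTTPSConnection("example.com")
conn.request("GET", "/")                        # standard HTTP request line + headers
response = conn.getresponse()
print(response.status, response.reason)         # e.g. 200 OK
print(response.getheader("Content-Type"))       # typically text/html; charset=...
body = response.read()                          # the HTML document itself
conn.close()
```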

To many observers, the degree of openness in the codebase matters less than the diversity and complexity of organizations involved in the governance of the protocols: in this regard, control of “browser standards” might be endangered by increasing market power on the part of browser vendors owned by private entities that also have outsized and direct control over operating systems and app store platforms, upon which any competing browser would depend directly. Neither open source nor open standards are a guarantee of a healthy and open market, although both generally contribute to that end.

Standards and Proto-standards

Standards development is generally a slow process taking years, driven by a balance between (rough) consensus among stakeholders refining and iterating requirements, and running code against which those requirements can be measured or considered. Depending on the context, stakeholders can include vendors or commercial actors, regulators, consumer advocacy groups, affected industries, and/or individuals. The terms “running” and “code” both cover a lot of territory, but it would be impossible to arrive at a standard without at least two independent, functioning pieces of code that have been tested, audited, and hardened, ideally by some deployment at scale.

In some cases, this is where an open standards process begins: two maximally independent implementations decide to cooperate for greater adoption and maturity, and seek out a venue for the relevant stakeholders to debate the merits and trade-offs of their current codebases and future variations or possibilities. Different standards bodies can be more or less public in their processes, more or less transparent in their results, and more or less complex in their rules of engagement: indeed, some operate according to a rulebook as complex as that of a parliament, and a style guide as exacting as an academic institution.

For the most open of standards, however, it is possible to work in the open, deliberatively and transparently, long before this “first” step. Working in an industry group, trade association, or other neutral venue can speed up the work toward a standard by front-loading the collaboration, peer review, market testing, and process legitimacy needed to get an idea ready for the market and for standardization sooner. These “pre-standards” venues are like containers where all those participating have signed an IPR agreement up front, meaning their work is unencumbered by patents or royalties and safe from being front-run or patent-trolled.

The products of these pre-standards processes are often called “pre-standard specifications” or “proto-standards” (if they are more ambitious and protocol-shaped). Groups that develop proto-standards often have explicitly-defined processes for how to publish a proto-standard, as well as when and how to hand off a sufficiently matured proto-standard to an SDO process for more formal and authoritative standardization.

We should not overstate the distinction between standards and pre-standards, however: there are many shades of grey in between. “Standards-track” and “non-standards-track” specifications alike can be more mature or legitimate depending on the parties involved, the pertinent SDOs (or lack thereof), and of course the process used to create them. For this reason, work items and working groups at DIF avail themselves of multiple procedural options and tailor their processes to the context, which is why “scoping” and “chartering” processes can take months to hammer out between organizations and their legal departments. This is also why specifications developed at DIF, without being further hardened in a more formal standards body, can sometimes be called “standards,” in the sense that they are adopted in the industry and function as standards. How the market and the relevant industries treat, trust, and rely on a specification is the ultimate judge of when it can be called the authoritative text for a standard process or procedure!

In our next drill-down, we’ll go into more detail on DIF’s processes, how you can get involved, and what decisions go into assuming an active role in a working group or work item.
