Opinion: Standards Use and Abuse
IBM, Oracle, Informix, Sybase and a Cast of Others
"Standards are meant to be abused" - Anonymous
Standards are hard to agree upon and even harder to enforce. It's simply a matter of self-interest: "I" trumps "we". In many cases "I" can depart a wee bit from the standards and nobody will notice, or "they" won't have the time and resources to protest. And really, where can "they" go to get effective redress? Indeed, modern civilization is a reflection of the push and pull between various complexions of "I" and "we".
But we want to look carefully at two cases, one where the "we" standards worked well and one where they did not. Networking and the Internet blossomed because they had a long nurturing period in university and government settings, where IEEE, IETF, ISO and a score of other standards bodies were able to coalesce and incubate standards before the helter-skelter of explosive commercial growth. Before Mosaic and then Netscape's browsers lit a fire under the Web in 1993-94, much of the groundwork in terms of NNTP, Gopher, mail, IP and literally dozens of other network standards, both in hardware and software, had already been set in place.
Since no one vendor dominated the fast-emerging Internet landscape, all parties agreed to agree. And the CERN-based original HTTP/HTML developers led by Tim Berners-Lee, already working in a consortium environment, were not far, conceptually or dispositionally, from setting up the W3C organization. The W3C is intended to guide the Web toward a disciplined, open, standards-based fruition. CERN and the joint physics development there provided a substantial model of what could be achieved. If one goes to the W3C.org site today and peruses some of the standards that have emanated from there - XML, CSS and SOAP, to name but three - one can conclude the organization has had an ongoing positive influence on the development of the Internet and related software technologies.
Just one example: XML has had a profound influence on the Web, especially through XML-based Web Service standards such as SOAP, WSDL, UDDI and others, which are helping to crack one of the toughest nuts in IT and computing - distributed, heterogeneous processing over a network of servers. But XML has had a profound impact on desktop data processing as well. XML is fast becoming the preferred method of storing persistent configuration or .ini data and save files for programs. It is also becoming the medium of exchange for data between programs, particularly for ad hoc, low-volume but highly secure transactions. In sum, W3C and its standards have had a profoundly positive effect on computing.
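As a small illustration of XML as a persistence format, here is a minimal sketch using Python's standard-library ElementTree. All element names and values are hypothetical, but the round trip - serialize settings on save, parse them back on startup - is the pattern described above:

```python
import xml.etree.ElementTree as ET

# Build a small, hypothetical application config as an XML tree
# (element names and values here are illustrative, not from any real product).
config = ET.Element("config")
db = ET.SubElement(config, "database")
ET.SubElement(db, "host").text = "localhost"
ET.SubElement(db, "port").text = "5432"
ui = ET.SubElement(config, "ui")
ET.SubElement(ui, "theme").text = "dark"

# Serialize to a string, as a program would when saving its settings...
doc = ET.tostring(config, encoding="unicode")

# ...and parse it back, as the program would on startup.
loaded = ET.fromstring(doc)
port = int(loaded.find("database/port").text)
theme = loaded.find("ui/theme").text
print(port, theme)  # 5432 dark
```

Unlike a flat .ini file, the same document can later be validated, transformed or exchanged with another program using the same W3C-standard tooling.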
The SQL Debacle
It is ironic that one of the reasons for the success of XML in the standards arena is that another standard, SQL, has largely failed. XML is taking on a number of database interaction and query roles that simply failed to coalesce in ANSI SQL. We shall not assign blame but rather examine how such stalwarts of the IT community as IBM, Informix, Oracle, Sybase and a widespread cast of others could defeat themselves by, in effect, bypassing the very standards they approved.
The problem in the SQL standards arena is that vendors' products still speak in dialects, and the process has proven so ineffective that the standards community has effectively collapsed. Let's look at the first problem by examining the book SQL in a Nutshell from O'Reilly Press. Even a cursory examination shows that the syntax for DML - Data Manipulation Language, comprising the basic SELECT, INSERT, UPDATE, DELETE and other elementary database operations - diverges roughly one time in six, with many omissions and non-standard extras. Even this comparatively low level of divergence makes transferring SQL from one database to another (one of the key benefits of the relational model) very problematic. If DML is partially but annoyingly divergent, then DDL - Data Definition Language, the commands used to define, modify and partially administer databases - has effectively split into dialects the way French differs from Italian and Spanish. And as for stored procedures, triggers and database scripting - there you have no lingual agreement whatsoever.
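The dialect problem shows up even in something as basic as fetching the first few rows of a result. A small sketch using Python's built-in sqlite3 module (the table `emp` and its sample data are hypothetical; only the SQLite form is executed, with the other phrasings shown for comparison in the syntax each product has conventionally used):

```python
import sqlite3

# The same "first 3 rows, ordered by name" query, phrased per dialect.
dialects = {
    "SQLite/MySQL/PostgreSQL": "SELECT name FROM emp ORDER BY name LIMIT 3",
    "SQL Server/Sybase": "SELECT TOP 3 name FROM emp ORDER BY name",
    "Oracle (classic)": "SELECT name FROM (SELECT name FROM emp ORDER BY name) WHERE ROWNUM <= 3",
    "DB2": "SELECT name FROM emp ORDER BY name FETCH FIRST 3 ROWS ONLY",
}

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE emp (name TEXT)")
conn.executemany("INSERT INTO emp VALUES (?)",
                 [("eve",), ("bob",), ("dan",), ("amy",)])
rows = [r[0] for r in conn.execute(dialects["SQLite/MySQL/PostgreSQL"])]
print(rows)  # ['amy', 'bob', 'dan']
```

Four ways to say one thing: a query that is trivial on one product must be rewritten, not just recompiled, to run on another.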
These database lingual discrepancies have exacted a debilitating toll on software development. Isolated islands of information run neck and neck with IT project failure rates for the dubious distinction of being the number one problem in IT development. How many times have you heard the lament - "We have the data, I just can't get my hands on it with this program or system"? Database and application silos dominate the IT horizon. Two full-blown IT industries - ETL (Extract, Transform and Load) and EAI (Enterprise Application Integration) - are dedicated to connecting up those data and application silos. The number of data transfer programs far exceeds the number of useful processing applications simply because, with point-to-point connections, "N" applications may require up to N x (N-1) conversion programs to transfer data between them (so for 6 applications that could mean up to 30 conversion/transfer programs). It has been estimated that as large a chunk of development time is devoted to creating data transfer/exchange programs as to fixing bugs or closing security holes.
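The arithmetic behind the silo problem is worth making explicit, since the growth is quadratic rather than linear (the function name below is just illustrative):

```python
# Point-to-point integration: each ordered pair of applications may need
# its own conversion program (A-to-B and B-to-A are distinct), so N
# applications can require up to N * (N - 1) transfer programs.
def transfer_programs(n_apps: int) -> int:
    return n_apps * (n_apps - 1)

for n in (3, 6, 10):
    print(n, "applications ->", transfer_programs(n), "programs")
# 3 -> 6, 6 -> 30, 10 -> 90
```

That quadratic blow-up is precisely why ETL and EAI hubs exist: a shared intermediate format reduces the count to one adapter per application.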
But the problem of non-standard SQL is not limited to data transfer. Skill transfer; new database innovations in object methods; peer-to-peer and event/agent processing; and cross-database system management have all foundered or been impeded by the lack of good standards in the SQL and database arena.
How Did It Happen?
ANSI - the American National Standards Institute - started development of the SQL standard in 1982. ANSI made sense because it had supported other computing standards such as COBOL and network databases. And many of the key, pioneering players, including DEC, RDBMS originators IBM, Informix, Oracle, Sybase and others, participated. However, between cup and lip some key things slipped over the next decade. By the time the SQL 92 standard was released, the following was occurring:
By SQL 99 the air was out of the balloon, and despite the enormous amount of time, work and effort put in by hundreds of participants, the process of database standardization had effectively dissipated, as can be seen from the ANSI/INCITS listing of standards published since SQL 99:
> CAN/CSA-ISO/IEC 9075-1A-02 27-Aug-2003
Now these are all viable and worthy standards. The problem is that major database topics - object frameworks, stored procedure standards, and the scripting/control of databases, among others - have been left either untouched or still locked in committee and process. The ANSI/INCITS output and its significance pale in comparison to W3C standards during this same period.
Robert Frost said it best - "Good fences make good neighbors". Knowing the boundaries, limits and expected standards of performance makes for good product development among competing vendors. We all know the potential conflict between standards and innovation. But in the dynamic, razzle-dazzle arena of Web technology, the W3C has clearly been beneficial to the fast development and introduction of exciting new technologies. In contrast, the failure of ANSI SQL to produce effective new technology standards has inhibited database development - so much so that XML, XSD and XQuery (the W3C standards) are causing the most excitement in the database market. In addition, new dynamics such as the availability of tiny and/or free SQL databases like MySQL, Pointbase and SQLite are being driven in part by how well they adhere to standards. But unfortunately the opportunity set, or efficient frontier, of what makes a good database is not nearly as well defined as it could be.
> Michael Gorman - a viewpoint on the cause of the demise of ANSI SQL