IPP> Standards Quandary: Speed Vs. Science


rdebry at us.ibm.com
Tue Dec 17 11:25:37 EST 1996




I thought that you all might like to see this article.  Clearly the IETF is
rethinking its process, but I believe we can still make significant headway
by simply agreeing amongst ourselves how it ought to be done, writing some code
to test out our ideas, and then taking the results back to the IETF.  Yes, we
ought to take their thoughts, criticisms, and suggestions into account, but in
the end let's do what we think works best, based on prototyping!!


---------------------- Forwarded by Roger K Debry/Boulder/IBM on 12/17/96 09:22 AM ---------------------------


        BLDVMB.PAULAL @ D03AU005
        12/12/96 04:22 PM

INTER@CTIVE WEEK, November 18, 1996


Standards Quandary: Speed Vs. Science


"The old process is undeniably dead," said Carl Cargill, standards strategist
for Netscape Communications Corp. Standards bodies can't simply take their
time to set conventions in the fast-moving, fast-converging computing and
communications fields. Now, the "new process" often leads to bringing new
technology to market first and letting standards be established afterward.
"On-the-fly" standard-setting is increasingly the rule.


By Peter Lambert and Carol Wilson


Whether it is a new high-speed telephone technology, such as Digital Subscriber
Line, or xDSL, modems; or, a component software technology, such as Object
Request Brokers, or ORBs; or a means of adding three-dimensionality to World
Wide Web pages, setting standards now is a constantly mutating process.  The
"new" way must balance speed to market and pure science, as well as cooperation
and brute force as jockeying for shares of rapidly emerging and growing markets
for "open" technology intensifies.


This means established bodies that in the past have set standards for the way
different types of equipment should work together -- or interoperate -- are
finding their roles are changing.  Such groups as the International
Telecommunications Union, or ITU, and the American National Standards
Institute, or ANSI, may no longer be the natural arbiters of issues involving
bringing products to market and technical arguments among suppliers.  They tend
to be dominated by manufacturers of technology in an era when buyers
increasingly call the shots.  They also often try to accommodate multiple
industries and global regions, and so, by nature, are a tough place to foster
consensus on emerging technology, unless an industry can wait years for it to
arrive.  That's hardly the case any more.


"The new process recognizes that the users have a role -- that there's such a
thing as buyer clout," said Cargill.  "[Buyers] can no longer afford to be
hostage to a single vendor or to the chance that standards bodies will modify
established practice."


In the new world, "the new methodology is essentially rough consensus and
running code, and then let standards bodies ratify it and build on future
extensions," Cargill said.


Even some who worked in that "old process" agree.


"What our industry needs to learn is the Internet model of standards on the
fly," said Gerry Butters, North American president for Lucent Technologies
Inc.  "It's not going to be an easy process, but we need to make it happen."


For example: Netscape now wants to standardize the current practice for how
people are creating applications for computer networks, using its JavaScript
language. It will submit a proposal and host a meeting of the European
Computer Manufacturers Association on Nov. 21 and Nov. 22, "a global body and
one of the best for creating an unambiguous standard." Java inventor Sun
Microsystems Inc. and Java licensee Microsoft Corp. will be among 30
companies expected to attend. Whether a standard will result remains to be
seen, but the politicking has begun.


That Netscape should try to push its technology to become a standard is nothing
new.  Milo Medin, vice president of networks for the fledgling broadband data
service @Home Networks, pointed to "the browser wars" as an example of the
current pace of technological change that makes it imperative to find ways to
speed up standard-setting.


"What Netscape has done with browser plug-ins does away with the slow retail
chain required for floppy disk distribution of new software, enabling users to
extend functionality and install new applications overnight," he said.  "All
this really puts pressure on people to deliver products.  By the time a
technology goes through the formal process, it's obsolete."


Indeed, strategies practiced over the past two years by inventors, including
Netscape, represent an effort to largely avoid premature obsolescence.


Netscape's model of "putting the technology out there for free, then trying to
create value out of the extensions is essentially the model we've adopted,"
said George Paolini, director of corporate communications for Sun business
unit JavaSoft.


In respect to Sun's own Java software for distributing applications across
networks, JavaSoft's strategy has been to "make it ubiquitous first, then
figure out the business model," Paolini said.  "And I think it's now fair to
say that Java is the de facto standard for Internet application development."


Further, just as manufacturers can form consortia around technologies, such as
Java, to speed them to market -- hopefully as open de facto standards -- so can
technology buyers take the reins and force common technology specifications.


Is It Open?


"There's nothing sacramental about the mantle of a standard generated by a
standards body," Medin said.  Instead, the blessing must come by passing a
marketplace litmus test: "Is it open?  Does it work with other products?  Did
more than one company cooperatively push it through?  Is there any barrier to
entry by others?"


If the answers are yes to the first three questions and no to the last, he
said, an industry has its interoperable standard.


Further, Medin, who often is cited as a key champion of the Transmission
Control Protocol/Internet Protocol, or TCP/IP, data transfer protocols that
underpin the Internet today, sees a variety of groups producing standards that
pass that litmus test on a pragmatic timetable, including "forums that tend to
drive buying, like the ATM [Asynchronous Transfer Mode] Forum and IETF
[Internet Engineering Task Force]."


Also worth watching and participating in are "truly collaborative forums of
competitors around technologies like Java and RealAudio and Netscape
LiveMedia," he said.  "Just because you don't go through the IEEE [Institute of
Electrical and Electronics Engineers] or another global standards body, doesn't
mean it's not open."


The Java architecture is indeed open, argued Paolini.  "Java may be a
precedent," he said.  "We took unusual steps in publishing it from the start,
licensed the source code, then collaborated on Applications Program Interfaces
with leaders in specific fields.  That puts us in the position of steward --
not making unilateral decisions, but collaborating" with competitors, including
IBM Corp., Macromedia Corp., Microsoft Corp. and Silicon Graphics Inc.


If that sounds too altruistic to be true, JavaSoft has served itself by
building a going revenue stream out of source-code licensing.  "We still claim
ownership, but the definition is different; there's no way for us to extract
any more value from it than can IBM or any other of our collaborators," said
Paolini.  "If you don't have the buy-in of the collaborators and their weight
in the marketplace, there can be lots of factions that slow things down."


According to Lenny Rosenthal, WebForce Group marketing manager for Silicon
Graphics, "SGI has created the 3-D, multimedia extensions to Java, working with
Sun and Macromedia and others, and Sun has accepted our extensions to the Java
architecture.  With ActiveX, we tried to do that, and Microsoft was not
interested.  They wanted to do it on their own."


Rosenthal also pointed to the Virtual Reality Markup Language, or VRML,
Architecture Group as an example of ad-hoc industry cooperation and speed.
"With about 12 participating companies moving as quickly as possible, Version
2.0 followed Version 1.0 by only about six months, and there are already VRML
browser products out in the market.  Microsoft also made a pretty big effort to
push its own virtual reality language, but it bowed to a united front of 52
other companies."


While Microsoft makes few bones about its general approach -- embedding
advanced functionality into its flagship Windows operating systems -- Microsoft
itself is a Java licensee and has proved its willingness to pursue other give
and take.


Currently the IETF is considering a standard called the Layer 2 Tunneling
Protocol that will enable multiprotocol, remote enterprise network traffic to
"tunnel" through IP across the Internet. By all accounts, the standard
will incorporate "best of" elements from separate Microsoft and Cisco Systems
Inc. tunneling proposals with little fuss.


Similarly, in February 1996, two leading credit card associations exercised
buyer leverage over two powerful technology providers. Separately, Visa Corp.
had been working with Microsoft on a Secure Transaction Technology, or STT,
for electronic transactions, while MasterCard International Inc. had been
working on the Secure Electronic Payment Protocol, or SEPP, toward the same
end.


According to Nick DiGiacomo, president of @YourService, an arm of Science
Applications International Corp., which played an arbitrator's role in the
process, it was Visa and MasterCard muscle that forced a resolution.  DiGiacomo
described the result as "a very open, best-of-breed" Secure Electronic
Transactions, or SET, standard.


De Facto The Matter


Of course, the longer a standards body takes to reach consensus on a standard,
the wider the window for buyers to establish de facto standards by virtue of
purchases, an option open to major telephone, cable and computer technology
buyers alike.


Even with a certified standard in place, alternative solutions can vie to
become the de facto winner in the marketplace.


In the case of high-speed xDSL technologies, since 1993 the T1E1.4
subcommittee of Committee T1, a standards-setting body operated by the
Alliance for Telecommunications Industry Solutions and authorized by ANSI to
set telecom standards, has endorsed Discrete Multitone, or DMT, technology
developed primarily by Amati Communications Inc. as the line coding standard
for xDSL.


With DMT chips still promised but not on the market, however, local telephone
companies in North America and worldwide have been testing xDSL using
Carrierless Amplitude Phase, or CAP, modulated systems. Some of those
companies, notably Bell Atlantic Corp. and Nynex Corp., as well as some
Internet service providers, say they'll continue to use CAP systems if they
prove more cost-effective.


The issue isn't a matter of standards, said Jeff Waldhuter, executive director
of research and development for Nynex, but one of market availability.  Because
DMT chips are just coming onto the market, and could prove more expensive than
the more mature CAP systems, Nynex is unwilling to risk its xDSL deployment
just to support a standard, he said.


Meanwhile, four other telephone companies -- Ameritech Corp., BellSouth Corp.,
Pacific Bell Corp. and SBC Communications Inc. -- are betting on the
standard, ordering DMT-based xDSL systems from Alcatel Network Systems Inc.


That demonstrates how industries no longer are willing to wait for a standard
or even a consensus to emerge before taking matters into their own hands. "I
think in this case, the more aggressive companies, like Bell Atlantic and U S
West [Inc.], aren't willing to wait for the standard to develop," said Kieran
Taylor, xDSL analyst with TeleChoice Inc. "The others are in a position to
wait for the DMT chips to be available and affordable."






Setting The Standards


Formal Standards Bodies


International Telecommunications Union


American National Standards Institute


Institute of Electrical & Electronics Engineers


Society of Cable Telecommunications Engineers


Informal Standards Bodies


Digital Audio-visual Interactive Council


Internet Engineering Task Force


ATM Forum


Ad Hoc Groups


Object Management Group


Virtual Reality Markup Language Forum


Multimedia Cable Network Systems


Java Developer Connection






Too Many Cooks Spoil Standards Broth


The longer a standards body takes to reach consensus on a standard, the wider
the window for buyers to establish de facto standards by virtue of purchases


Illustrative of the current pressurized atmosphere, elbow room is now at a
premium for all the cooks in the cable modem standards kitchen.


The 802.14 Committee of the global Institute of Electrical and Electronics
Engineers, or IEEE, has put two years into seeking manufacturer consensus on
key cable modem interfaces, and expects to issue a draft standard by mid-1997.


However, the Multimedia Cable Network Systems, or MCNS, consortium of the
largest cable operators admits it can't wait that long and will issue its own
set of common specifications for cable modem purchases by December, though
those specs will include "hooks" to accommodate further developments blessed by
groups like 802.14.


"As with any new industry, you've got a lot of vendors jockeying for position,"
said Doug Robertson, director of business development for Motorola Multimedia,
one of five makers of modems upon which MCNS is basing its specifications.
"We're willing to share our intellectual property" through a 1 percent to 2
percent average selling price licensing fee, he said.


According to Milo Medin, vice president of networks for @Home Networks, a
national broadband Internet access service co-owned by three MCNS members, "the
MCNS wouldn't exist now if, by mismanagement, 802.14 hadn't taken way too much
time and made itself irrelevant."


Indeed, that relevance came into question in early October, when MCNS and
802.14 separately adopted incompatible modulation and error correction
specifications for cable data.


Despite the apparent setback, recently installed 802.14 Chairman Robert Russell
told committee members that MCNS' specifications "are based on readily
available technology and protocols to satisfy an immediate need by network
operators."


Yet another body, the Society of Cable Telecommunications Engineers, or SCTE,
is producing its own draft standard, working to take input from 802.14 and
MCNS, as well as from the ATM Forum; the quasi-official standards group, the
Digital Audio Visual Interactive Council; and eventually the IETF, which is
considering an IP-over-cable draft architecture.














Copyright (c) 1996 Interactive Enterprises, LLC. All rights reserved.
Reproduction in whole or in part in any form or medium without express
written permission of Interactive Enterprises, LLC is prohibited. Interactive
Week and the Interactive Week logo are trademarks of Interactive Enterprises,
LLC.










Title:  Standards Quandary: Speed Vs. Science
URL:  http://www.zdnet.com/intweek/print/961118/inwk0027.html


