why the facts can no longer be ignored.
Moreover, certification standards
like CMMI do not work. I have been
part of CMMI-certification drives and
find that real software-development
processes have no relation to what is
ultimately certified. Software development in real life starts with ambiguous
specifications. When a project is initiated and otherwise unrelated employees are assembled into a team, the project
manager creates a process template
and fills it with virtual data for the
quality-assurance review. But the actual development is an uncontrolled
process, where programs are assembled from random collections of code
available online, often taken verbatim
from earlier projects.
Most software winds up with an
unmanageable set of bugs, a scenario
repeated in almost 80% of the projects I’ve seen. In them, software for
dropped projects might be revived,
fixed by a new generation of coders,
and deployed in new computer systems and business applications ultimately delivered to everyday users.
Software developers must ensure
their code puts no lives at risk, and a licensing program should be enforced for all software developers. Proof of professional
discipline and competency must be
provided before they are allowed to
write, modify, or patch any software to
be used by the public.
As suggested by Parnas,1,2 software
should be viewed as a professional
engineering discipline. Science is
limited to creating and disseminating
knowledge. When a task involves creating products for others, it becomes
an engineering discipline and must
be controlled, as it is in every other
engineering profession. Therefore,
software-coding standards should be
included in penal codes and country
laws, as are the standards that guide other
engineering, as well as medical, professions. Moreover, software developers should be required to undergo periodic relicensing, perhaps every five
or 10 years.
Basudeb Gupta, Kolkata, India
1. Parnas, D.L. Licensing software engineers in Canada.
Commun. ACM 45, 11 (Nov. 2002), 96–98.
2. Parnas, D.L. Software engineering: An
unconsummated marriage. Commun. ACM 40, 9
(Sept. 1997), 128.
Unicode Not So Unifying
Poul-Henning Kamp’s attack in “Sir,
Please Step Away from the ASR-33!”
on ASCII as the basis of modern programming languages was somewhat
misplaced. While, as Kamp said, most
operating systems support Unicode, a
glance at the keyboard shows that users are stuck with an ASCII subset (or
It was my dubious honor to learn and
use APL* while at university in the
1970s, which required a special “golf ball”
and stick-on key labels for the IBM
Selectric terminals supporting it. A
vexing challenge in using the language was finding one of the many Greek
or other special characters required to
write even the simplest code.
Also, while Kamp mentioned Perl,
he failed to mention that the regular
expressions made popular by that language—employing many special characters as operators—are virtually unintelligible to all but the most diehard
fans. The prospect of a programming
language making extensive use of the
Unicode character set is a frightening one.
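The readability complaint can be made concrete with a small sketch. The following Python example (the pattern and names are illustrative assumptions, not taken from the letter or from Perl) contrasts a dense, punctuation-heavy pattern with the same pattern written in `re.VERBOSE` mode:

```python
import re

# A dense pattern of the kind the letter describes, where special
# characters serve as operators (illustrative example only):
dense = re.compile(r"^(\d{4})-(\d{2})-(\d{2})$")

# The same pattern in re.VERBOSE mode, where whitespace is ignored
# and comments are allowed -- one way to make such patterns readable:
readable = re.compile(
    r"""
    ^ (\d{4})    # four-digit year
    - (\d{2})    # two-digit month
    - (\d{2}) $  # two-digit day
    """,
    re.VERBOSE,
)

# Both compile to equivalent matchers:
print(dense.match("2011-03-01").groups())     # ('2011', '03', '01')
print(readable.match("2011-03-01").groups())  # ('2011', '03', '01')
```

Both forms match exactly the same strings; only the notation differs, which is the letter's point about operator-dense syntax.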
William Hudson, Abingdon, U.K.
*APL stands for “A Programming Language,” so “the
APL programming language” deconstructs as “the A
Programming Language programming language.”
The Merchant Is Still Liable
In his Viewpoint “Why Isn’t Cyberspace More Secure?” (Nov. 2010), Joel
F. Brenner said that in the U.K. the
customer, not the bank, usually pays
in cases of credit-card fraud. I would
like to know the statistical basis for
this claim, since for transactions conducted in cyberspace the situation in
both the U.K. and the U.S. is that liability generally rests with the merchant,
unless it provides proof of delivery or
has used the 3-D Secure protocol to
enable the card issuer to authenticate
the customer directly. While the rates
of uptake of the 3-D Secure authentication scheme may differ, I have difficulty believing that difference translates
into a significant related difference in
levels of consumer liability.
The process in the physical retail
sector is quite different in the U.K. as
a result of the EMV, or Europay, MasterCard, and VISA, protocol, or “Chip &
PIN,” though flaws in EMV and hardware mean, in practice, the onus is
still on the bank to demonstrate its
customer is at fault.
The U.K. Financial Services Authority took
over regulation of this area November 1,
2009, because many found the situation,
as I described it, objectionable. In practice,
however, it is unclear whether the FSA’s
jurisdiction has made much difference.
While the burden of proof is now on the
bank, one source (see Dark Reading, Apr.
26, 2010) reported that 37% of credit-card
fraud victims get no refund. The practice
in the U.S. is not necessarily better but is
Joel F. Brenner, Washington, D.C.
Format Migration or
David S.H. Rosenthal’s response (Jan.
2011) to Robin Williams’ comment
“Interpreting Data 100 Years On” said
he was unaware of a single format
widely used that has actually become
obsolete. Though I understand the
sentiment, it brought to mind Apple’s
switch from PowerPC to Intel architecture about six years ago. Upgrading the
computers in my company in response
to that switch required migrating all
our current and legacy data to the new
format used by Intel applications at
the time. Though we didn’t have to do
it straightaway, as we could have kept
running our older hardware and software, we had no choice but to commence a process to migrate over time.
This decision directly affected only
my company, not the entire computing world, but when addressing data
exchange and sharing, it was an additional factor we had to consider.
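One concrete reason the PowerPC-to-Intel switch forced data migration is byte order: PowerPC is big-endian, while Intel x86 is little-endian. The following minimal Python sketch (the value and format codes are illustrative assumptions, not from the letter) shows why bytes written under one convention must be converted before being read under the other:

```python
import struct

value = 0x12345678

# PowerPC stores multi-byte integers big-endian (most significant
# byte first); Intel x86 stores them little-endian.
big_endian = struct.pack(">I", value)
little_endian = struct.pack("<I", value)

# The same number produces two different byte sequences on disk:
print(big_endian.hex())     # 12345678
print(little_endian.hex())  # 78563412

# Reading big-endian bytes under little-endian conventions silently
# yields the wrong number -- hence the need to migrate legacy data:
misread = struct.unpack("<I", big_endian)[0]
print(hex(misread))  # 0x78563412
```

File formats that record their byte order explicitly, or migration tools that swap bytes on conversion, are the usual remedies; data written without such provisions is exactly the kind that must be migrated when the architecture changes.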
Rather than facing some general obsolescence, we may all inevitably have to
address the format obsolescence that is a
natural consequence of IT’s historically unforgiving evolution.
Bob Jansen, Erskineville,
Communications welcomes your opinion. To submit a
letter to the editor, please limit your comments to 500
words or less and send to email@example.com.
© 2011 ACM 0001-0782/11/0300 $10.00