University of Bielefeld -  Faculty of technology
Networks and distributed Systems
Research group of Prof. Peter B. Ladkin, Ph.D.
This page was copied from: http://catless.ncl.ac.uk/Risks/16.92.html



The Risks Digest

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Volume 16, Issue 92

Thursday 16 March 1995


Contents

o Health card rips off ATM for $100,000
Roy Beimuts
o A340 shenanigans
Les Hatton
o Mistake of platform-specific instructions
Stanton McCandlish
o The Manchurian Printer
Simson L. Garfinkel
o Re: Scientology Blackmail Risk
Lance A. Brown
Jon Green
o Re: Internet-Finland Privacy
Michael Jennings
o Jumping to conclusions? (Lifeguard)
Peter da Silva
o Re: Microsoft and Lotus spreadsheet errors
Bear Giles
o Society and the Future of Computing
Phil Agre
---------------------------------------------

Health card rips off ATM for $100,000

"AGR CANADA, Research Station, Melfort, Sask." <SYSTEM@skrsme.agr.ca>
15 Mar 1995 12:18:46 -0500 (EST)

From Saskatoon StarPhoenix, March 14:

"BC man sentenced for ATM scam"

PORT HARDY, B.C. (CP) -- A Vancouver Island man who used his health card to steal $100,000 from a bank machine has been given a year in jail. Richard Lee Mose, 22, of Port Alice, B.C., was found guilty of theft in Campbell River court. A Bank of Nova Scotia official said a processing problem allowed Mose to use a non-bank card on an automatic teller. With his health-care card, Mose completed several transactions over several hours and got $109,000. But the bank machine recorded the information stored on the health card, and police traced it back to Mose.

Roy Beimuts, Melfort, Saskatchewan beimutsr@em.agr.ca
---------------------------------------------

A340 shenanigans

Les Hatton <les_hatton@prqa.co.uk>
15 Mar 1995 15:56:26 +0000

The BBC news at 08.30 reported a slight problem which occurred on the morning of 15 Mar 1995 with the ultra high-tech, packed full of software and therefore utterly wonderful Airbus A340.

Apparently on the final part of its approach to Gatwick, both of the pilots' screens went blank, to be replaced by a polite little message saying "Please wait ...". Somewhat unnerved, the pilots requested that the plane turn left, but it turned right instead. They then tried to get it to adopt a 3 degree approach to the runway, but it chose a 9 degree plummet instead. At this point, from the report, they appeared to gain manual control and landed safely. It is not clear who will pick up the dry-cleaning bill.

Vis-a-vis this sort of thing, I was at a talk recently, given by the CAA (UK Civil Aviation Authority), at which it was stated that in the past generation of civil aircraft, most of the software problems were reported in the Flight Management System. Not surprisingly, this was the most complex part of the aircraft software system. Not any more it isn't. During the talk, it was also admitted that in the newer generation of aircraft such as the A340, other software systems, including active systems, are "at least as complicated". So what next?

I suppose it follows on nicely from the story in the October 1994 Risks whereby a Japanese Air Force T-4 jet trainer ejected one of its pilots. Perhaps it didn't like him. :-)

Les Hatton, Ph.D. C.Eng, Director of Research & Engineering, Programming
Research Ltd, England les_hatton@prqa.co.uk +44 (0) 1 932 888 080
---------------------------------------------

Mistake of platform-specific instructions

Stanton McCandlish <mech@eff.org>
Wed, 15 Mar 1995 17:29:33 -0500 (EST)

One thing that's bugged the hell out of me for several years is the failure of many people giving instructions on finding net resources to do so in a generic manner. Here's a great example:

  1. Get into the WWW.
  2. Enter the letter G so you can enter the address for the test site.
  3. At the prompt, enter the following:
    http://sunsite.unc.edu/jembin/mb.pl
This was from an announcement off the Net Happenings list. Anyone following these instructions will get nowhere unless they are using lynx (unless there's another browser that uses the same keystrokes to specify a URL, which I doubt). The first item implies that the instructions are generic, for whatever means one uses to browse WWW sites (and, incidentally, indicates a fundamental misunderstanding of the technology, since the WWW is not a "thing" that one "gets into", but is a method, a process, a standard).

All the author had to do was put:

URL:http://sunsite.unc.edu/jembin/mb.pl
At any rate, the application-dependent specificity of the instructions, though not a problem for anyone who's been using Internet tools for very long, is potentially very confusing and frustrating to less experienced users. And these users exist in ever greater numbers, as systems like AOL and Delphi become increasingly Internetly. I don't mean to single out AOL for any more maligning, but as a forum sysop there besides my other online services duties, I can say from firsthand experience that the technical skill level of AOL users is, on average, staggeringly lower than that of the typical shell account user, or almost anyone else online for that matter.

Typical questions I receive from AOL users are: "What happened?" "How do I read this file?" "If I want to download an ASCII file do I have to use the ASCII protocol, or can I use ZModem?" "What do I join?" "Help!" "What is a 'GIF graphics viewer'?" Those aren't snippets. Those are entire message bodies. Such people, confronted with the site instructions I quoted from Net Happenings, are never going to reach the site in question, will probably pound people like me with more questions like the above example in an effort to figure out what to do, and may eventually give up in disgust (a few might actually RTFM and get a clue, but not everyone is a born geek.)

So, there are two RISKS to keep in mind here:

  1. If you use platform-specific instructions, you are doing other readers a disservice, and should at the very least note that they are platform-specific (or application-specific, or whatever, as apropos). Better yet, just use generic instructions that anyone can use.
  2. If you need help with something, but don't *include enough information* for others to actually help you, you're unlikely to get a useful response. Include info such as: what kind of system you have, what its relevant configuration is, what went wrong in detail, what all the items (files, whatever) are specifically that you are referring to, etc.
Stanton McCandlish, Online Services Mgr, Electronic Frontier Foundation
mech@eff.org http://www.eff.org/ http://www.eff.org/~mech/ http://www.eff.org/1.html
---------------------------------------------

The Manchurian Printer

Simson L. Garfinkel <simsong@pleasant.cambridge.ma.us>
Wed, 15 Mar 1995 21:28:37 -0500

The Manchurian Printer, (C) 1995, Simson L. Garfinkel
[The Boston Sunday Globe, March 5, 1995, Focus Section, Page 83]

Simson L. Garfinkel

Early this month, Hewlett-Packard announced a recall of 10,000 HP OfficeJet printer fax copiers. The printer's power supplies may have a manufacturing defect that could pose an electrical shock hazard. HP says that it discovered the problem with its printers during routine testing; HP was lucky: printers can be very dangerous devices. A typical laser printer, for example, can draw hundreds of watts of power, generate internal temperatures high enough to burn a wayward human hand, and even, under the right circumstances, start a fire.

Most manufacturers, of course, try to design their printers to minimize such risks. Increasingly, however, there is a chance that companies might intentionally design life-threatening flaws into their products so that the flaws can be exploited at a later time. These fatal flaws might be intentionally built into equipment manufactured overseas, as a kind of "insurance policy" in the event of a war between that foreign country and the United States. The flaws might form the basis for a new kind of corporate warfare. Or the flaws might be hidden by disgruntled employees contemplating extortion or revenge.

Indeed, U.S. military planners are increasingly worried about this sort of possibility, which they place under the heading "Information Warfare." Nevertheless, although the threat of Information Warfare is very real, an even bigger danger is that the Department of Defense will use this threat to convince the new Congress to repeal the Computer Security Act of 1987. This would effectively allow the National Security Agency to declare martial law in cyberspace, and could send the civilian computer industry into a tailspin.

To understand what the military is afraid of, imagine the Manchurian Printer: a low-cost, high-quality laser printer, manufactured overseas, with a built-in secret self-destruct sequence. For years these printers could lie dormant. But send them a special coded message---perhaps a long sequence of words that would never normally be printed together---and the printer would lock its motors, overheat, and quickly burst into flames. Such an attack might be the first salvo in an out-and-out war between the two countries. Alternatively, an enemy company might simply use the printers to start selective fires, damage economic competitors, take out key personnel, and cause mischief.

Unlike the movie the Manchurian Candidate, the technology behind the Manchurian Printer isn't science fiction. Last October, Adobe accidentally shipped a "time bomb" in Photoshop version 3.0 for the Macintosh. A time bomb is a little piece of code buried inside a computer program that makes the software stop running after a particular date. Adobe put two time bombs into its Photoshop 3.0 program while the application was under development. The purpose behind the time bombs was to force anybody who got an advance, pre-release copy of the program to upgrade to the final shipping version. But when it came time to ship the final version of Photoshop 3.0, Adobe's engineers made a mistake: they only took out one of the bombs.
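A minimal sketch, in Python, of how such a date-based time bomb works (illustrative only; the expiry date, message, and function are hypothetical, not Adobe's actual code):

  import sys
  from datetime import date

  # Hypothetical expiry date compiled into a pre-release build.
  EXPIRY = date(1994, 11, 1)

  def check_time_bomb():
      # Refuse to run once the system clock passes the hard-coded date.
      if date.today() > EXPIRY:
          sys.exit("This pre-release copy has expired.")

  check_time_bomb()
  print("application running normally")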

An engineer inside Adobe learned about the problem soon after the product was shipped, and the company quickly issued a recall and a press release. Adobe called the time bomb a "security code time constraint" and said that "although this is an inconvenience to users, the security constraint neither damages the program or hard drive, nor does it destroy any files."

It only takes a touch of creativity and a bit of paranoia to think up some truly malicious variants on this theme. Imagine that a company wants to make a hit with its new wordprocessor: instead of selling the program, the company gives away free evaluation copies that are good for one month. What's unknown to the users of this program is that while they are typing in their letters, the program is simultaneously sniffing out and booby-trapping every copy of Microsoft Word and WordPerfect that it finds on their systems. At the end of the month, all of their wordprocessors stop working: instead of letting them edit their memos, they print out ransom notes.

Any device that contains a microprocessor can be equipped with such a booby-trap. Radios, cellular telephones, and computers that are connected to networks are particularly vulnerable, since an attacker can send them messages without the knowledge or consent of their owners. Some booby-traps aren't even intentional. What makes them particularly insidious is that it is almost impossible to look at a device and figure out if one is present or not. And there is no practical way to test for them, either. Even if you could try a million different combinations a second, it would take more than 200 years to find a sequence that was just 8 characters long.
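The arithmetic behind that estimate is easy to verify; a quick sketch in Python, assuming roughly 95 printable characters per position and one million guesses per second (both assumptions, not figures from the article):

  # Brute-force search space for an 8-character trigger sequence.
  alphabet = 95            # printable ASCII characters (assumption)
  length = 8
  rate = 1_000_000         # sequences tried per second (assumption)

  combinations = alphabet ** length              # about 6.6e15
  years = combinations / rate / (365.25 * 24 * 3600)
  print(f"{combinations:.2e} sequences -> about {years:.0f} years")
  # Prints roughly 210 years, consistent with "more than 200 years".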

* * *

Information Warfare isn't limited just to things that break or go boom. The Department of Defense is also worried about security holes that allow attackers to break into commercial computers sitting on the Internet or take over the telephone system.

"This nation is under IW attack today by a spectrum of adversaries ranging from the teenage hacker to sophisticated, wide-ranging illegal entries into telecommunications networks and computer systems," says a report of the Defense Science Board Summer Study Task Force on Information Architecture for the Battlefield, and issued last October by the Office of the Secretary of Defense.

"Information Warfare could pervade throughout the spectrum of conflict to create unprecedented effects. Further, with the dependence of modern commerce and the military on computer controlled telecommunication networks, data bases, enabling software and computers, the U.S. must protect these assets relating to their vulnerabilities," the report warns.

Information warfare changes the rules of war fighting, the report warns. A single soldier can wreak havoc on an enemy by reprogramming the opposing side's computers. Modern networks can spread computer viruses faster than missiles carrying biological warfare agents, and conceivably do more damage. Worst of all, the tools of the information warrior are readily available to civilians, terrorists and uniformed soldiers alike, and we are all potential targets.

Not surprisingly, the unclassified version of the Pentagon's report barely mentions the offensive possibilities of Information Warfare---capabilities that the Pentagon currently has under development. Nevertheless, these capabilities are alluded to in several of the diagrams, which show a keen interest by the military in OOTW---Operations Other Than War.

"They have things like information influence, perception management, and PSYOPS---psychological operations," says Wayne Madsen, a lead scientist at the Computer Sciences Corporation in northern Virginia, who has studied the summer study report. "Basically, I think that what they are talking about is having the capability to censor and put out propaganda on the networks. That includes global news networks like CNN and BBC, your information services, like CompServe and Prodigy," and communications satellite networks. "When they talk about 'technology blockade,' they want to be able to block data going into or out of a certain region of the world that they may be attacking."

The report also hints at the possibility of lethal information warfare. "That is screwing up navigation systems so airplanes crash and ships run aground. Pretty dangerous stuff. We could have a lot of Iranian Airbuses crashing if they start screwing that up," Madsen says. Indeed, says Madsen, the army's Signal Warfare center in Warrenton, Virginia, has already invited companies to develop computer viruses for battlefield operations.

Our best defense against Information Warfare is designing computers and communications systems that are fundamentally more secure. Currently, the federal organization with the most experience in the field of computer security is the National Security Agency, the world's foremost spy organization. But right now, NSA's actions are restricted by the 1987 Computer Security Act, which forbids the agency from playing a role in the design of civilian computer systems. As a result, one of the implicit conclusions of the Pentagon's report is to repeal the 1987 law, and untie the NSA's hands. Indeed, the Pentagon is now embarking on a high-level campaign to convince lawmakers that such a repeal would be in the nation's best interests.

This argument confuses security with secrecy. It also ignores the real reasons why the Computer Security Act was passed in the first place.

In the years before the 1987 law was passed, the NSA was on a campaign to expand its power throughout American society by using its expertise in the field of computer security as a lever. NSA tried to create a new category of restricted technical information called "national security related information." They asked Meade Data Corporation and other literature search systems for lists of their users with foreign-sounding names. And, says David Banisar, a policy analyst with the Washington-based Electronic Privacy Information Center, "they investigated the computers that were used for the tallying of the 1984 presidential election. Just the fact that the military is looking in on how an election is being done is a very chilling thought. After all, that is the hallmark of a banana republic."

The Computer Security Act was designed to nip this in the bud. It said that standards for computer systems should be set in the open by the National Institute of Standards and Technology.

Unfortunately, the Clinton Administration has found a way to get around the Computer Security Act. It's placed an "NSA Liaison Officer" four doors down from the NIST director's office. The two most important civilian computer standards to be designed in recent years---the nation's new Escrowed Encryption Standard (the "Clipper" chip) and the Digital Signature Standard---were both designed in secret by the NSA. The NSA has also been an unseen hand behind the efforts on the part of the Clinton Administration to make the nation's telephone system "wiretap friendly."

Many computer scientists have said that the NSA is designing weak standards that it can circumvent, so that the nation's information warfare defenses do not get in the way of the NSA's offensive capability. Unfortunately, there's no way to tell for sure. That's the real problem with designing security standards in secret: there is simply no public accountability.

In this age of exploding laser printers, computer viruses, and information warfare, we will increasingly rely on strong computer security to protect our way of life. And just as importantly, these standards must be accountable to the public. We simply can't take our digital locks and keys from a Pentagon agency that's saying "trust me."

But the biggest danger of all would be for Congress to simply trust the administration's information warriors and grant their wishes without any public debate. That's what happened last October, when Congress passed the FBI's "Communications Assistance for Law Enforcement Act" on an unrecorded voice vote. The law turned the nation's telephone system into a surveillance network for law enforcement agencies, at a cost to the U.S. taxpayer of $500 million.

=========WHAT FOLLOWS ARE CAPTIONS FOR THE ART===========
Photo: Box of Microsoft Word 6.0

Even though it's illegal, a lot of people like to "try out" software by making a copy from a friend before they plunk down hundreds of dollars for their own legal copy. Computer companies say that this is a form of software piracy: many who try never buy. More than 2 billion dollars of software is pirated annually, according to the Business Software Alliance.

One way that companies like Microsoft and Novell could fight back is by booby-trapping their software. Sure, customers wouldn't like it if that stolen copy of Microsoft Word suddenly decided to erase every letter or memo that they've written in the past month, but what legal recourse would they have?
=====================
Photo: Cellular Telephone

Is your cellular phone turned on? Then your phone is broadcasting your position every time it sends out its electronic "heartbeat." Some law enforcement agencies now have equipment that lets them home in on any cellular telephone they wish (similar technology was used recently to catch infamous computer criminal Kevin Mitnick). Perhaps that's the reason that the Israeli government recently ordered its soldiers along the border to stop using their cellular telephones to order late-night pizzas: the telephone's radio signal could become a homing beacon for terrorists' missiles.
===================
Photo: Floppy Disk

Beware of disks bearing gifts. In 1989, nearly 7000 subscribers of the British magazine PC Business World and 3500 people from the World Health Organization's database received a disk in the mail labeled "AIDS Information Introductory Diskette Version 2.0". People who inserted the disks into their computers and ran the programs soon discovered that the disks actually contained a so-called Trojan horse that disabled the victims' computers and demanded a ransom.
==================
Photo: A computer with a screen from America Online, and a modem

Several years ago, users of Prodigy were shocked to find that copies of documents on their computers had been copied into special "buffers" used by Prodigy's DOS software. Prodigy insisted that the copied data was the result of a software bug, and it wasn't spying on its customers. But fundamentally, if you use a modem to access America Online, Prodigy or Compuserve, there is no way to be sure that your computer isn't spying on you while you surf the information highway.
==================
HP's recall affects only OfficeJet printers with serial numbers that begin US4B1-US4B9, US4C1-US4C9, US4BA-US4BU, or US4CA-US4CK. Worried about your OfficeJet? Call HP at (800) 233-8999.
===============
Simson L. Garfinkel writes about computers and technology from his home in Cambridge, Massachusetts.
---------------------------------------------

Re: Scientology Blackmail Risk (Vilkaitis, RISKS-16.91)

"Lance A. Brown" <lab@biostat.mc.duke.edu>
Wed, 15 Mar 1995 10:58:37 -0500

Vilkaitis is not correct. Postings on alt.security.pgp stated that Finnish authorities secured a warrant to seize the equipment the Finnish Anonymous Server runs on. The owner of the Server negotiated a deal with the authorities where he released the identity of _one_ user of the Server and the authorities didn't seize the equipment.

My understanding of the behind-the-scenes goings-on is that the Church of Scientology is bringing copyright charges against one of its former ministers who is now a vocal critic of the CoS on the Internet. The sequence of events, as I understand it, is that someone used the Finnish Anonymous Server to post allegedly copyrighted material on USENET. The CoS asked the FBI to talk to Interpol, who talked to the Finnish Police about getting the ID information of the anonymous poster. Once this ID information was released by the owner of the Server, it was immediately handed over to CoS people.

[Also noted by Kevin.P.Maguire@jpl.nasa.gov (Kevin Maguire) and "Matti E. Aarnio [OH1MQK]" <mea@mea.cc.utu.fi>. PGN]

Re: Scientology Blackmail Risk (Vilkaitis, RISKS-16.91)

Jon Green <jonsg@diss.hyphen.com>
Wed, 15 Mar 1995 09:40:35 +0000 (GMT)
[... more as noted by Lance Brown deleted ... PGN]
Nonetheless, this does represent a worrying precedent. There are persistent rumours that the entire user base of at least one anonymizing service has been compromised by covert action by a security agency, and that's just the start. As has been pointed out elsewhere, any agency monitoring international communications (NSA in the US and GCHQ in the UK, to name two) should have little trouble matching anon ID with real ID if the message is in plaintext and the server in another country. Matching messages where the first leg is PGP-encoded (and the server decodes before retransmission) would be more difficult, but by no means impossible.

The only sensible conclusion is that anon remailers provide anonymity from your peers, not from the law. If you use them illegally, you may well be identified. Them's the breaks.

jonsg@hyphen.com jon@sundome.demon.co.uk
---------------------------------------------

Re: Internet-Finland Privacy (RISKS-16.91)

Michael Jennings <M.J.Jennings@amtp.cam.ac.uk>
15 Mar 1995 17:16:33 GMT

>Case #1 ... A Swedish journalist-researcher "reveals" that an Anonymous
>Case #2 ... Finnish Police receive a request from U.S. law enforcement

There have been suggestions on the net (in alt.privacy.anon-server, I think) that these two events may well have been related: specifically, that the Church of Scientology might have been indirectly responsible for the 'This anon server is used by pedophiles: shock, horror' stories in the first place, as an attempt to discredit the anon server in order to make the police more likely to raid it for them or get it shut down. This is only speculation, but it is consistent with their style. It is their standard policy to attempt to discredit their opponents through character assassination at the same time as they attack them through legal means.

For instance, one of the recent posters of copyrighted material to alt.religion.scientology was described in passing in a Scientology press release as someone who conducted execution-style killings of his pets in front of his children. Paulette Cooper, who wrote a book entitled _The Scandal of Scientology_, found a circular (supposedly written by `a concerned neighbor') being passed around her apartment block suggesting her `removal from our residence, and if possible, have her put under appropriate psychiatric care.' Several critics of Scientology have been accused of (and sometimes tried for) crimes that they did not commit, after having (apparently) been framed by Scientologists.

Many people have used the anon server to post critical articles about Scientology. I suspect the church would like it discredited. (Massive amounts of information about this and the church in general can be found at http://falcon.cc.ukans.edu/~sloth/sci/sci_index.html#diary)

Michael Jennings, Department of Applied Mathematics and Theoretical Physics
The University of Cambridge. mjj12@damtp.cambridge.ac.uk
---------------------------------------------

Jumping to conclusions? (Lifeguard)

Peter da Silva <peter@nmti.com>
Wed, 15 Mar 1995 09:44:33 -0600

> "Anybody who shoots at you from any direction would be immediately located
> and subject to return fire," says Thomas Karr, head of the Lab team.

I don't read this as "the weapon would automatically return fire" but "the police officer would be able to return fire".
---------------------------------------------

Re: Microsoft and Lotus spreadsheet errors (Bellovin, RISKS-16.90)

Bear Giles <bear@tigger.cs.colorado.edu>
Wed, 15 Mar 1995 16:56:25 -0700

>They ended up doing the calculation, and storing a compressed table giving
>the difference between the calculated values and the legal ones. Never mind
>reality -- custom ruled.

I can beat that. I recently ported a mess of old FORTRAN meteorological code to C++, and I extensively expanded the validation suite in the process.

Along the way, I learned that a number of my newly ported functions were returning values a hair off (typically < 0.5%, when the number of significant digits should have resulted in errors two orders of magnitude smaller). I traced the problem to the fact that I was using the best currently known physical constants, while the standard reference tables were using values from the 1960s. Reasonable, since the reference was published in 1965, but I had to address the differences in the results.
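The kind of comparison involved is simple to sketch; the following Python snippet (the formula and constant values are merely illustrative, not the actual code or tables) evaluates the same routine with an older and a newer constant and checks the relative difference against a tolerance, which is roughly what the validation suite does:

  # Hypothetical example: ideal-gas density of dry air computed with an
  # older table value of the gas constant and with a more recent value.
  def air_density(pressure_pa, temperature_k, r_dry):
      return pressure_pa / (r_dry * temperature_k)

  R_DRY_OLD = 287.04    # illustrative "1965 table" value, J/(kg K)
  R_DRY_NEW = 287.05    # illustrative "current" value
  TOLERANCE = 0.005     # the 0.5% relative error mentioned above

  old = air_density(101325.0, 288.15, R_DRY_OLD)
  new = air_density(101325.0, 288.15, R_DRY_NEW)
  rel_diff = abs(new - old) / old
  print(f"relative difference = {rel_diff:.6%}",
        "(within tolerance)" if rel_diff <= TOLERANCE else "(FAIL)")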

After consulting with the working meteorologists, I eventually put the 1960s-era physical constants back into the software. As a practical matter, the resolution of the available data is still coarse enough that the small discrepancies don't matter, and it is easier for them to compare the final results than for FORTRAN programmers to understand scientific C++ classes.

But it still grates. At least I was able to eliminate a number of duplicate functions. Previously, someone had made a token effort to include a validation suite (with typically <10 test cases per function), but they made no effort to identify slow and/or inaccurate functions. A tradeoff of speed for accuracy is often justified, but who would ever want to use a function which is significantly slower and less accurate than another one?

Bear Giles bear@fsl.noaa.gov
---------------------------------------------

Society and the Future of Computing

Phil Agre <pagre@weber.ucsd.edu>
Wed, 15 Mar 1995 11:52:07 -0800

The conference on Society and the Future of Computing (SFC'95) will be held from June 11th to 14th in Durango, Colorado. This conference is an initiative of the US Public Policy Committee of the Association for Computing Machinery (USACM). Its focus is on opportunities for socially beneficial applications of computing technology: visions of what's possible ten years from now and agendas for computer science research that can make those visions come true.

Conference speakers include:

Gary Chapman, University of Texas, Austin
John Cherniavsky, National Science Foundation
Peter J. Denning, George Mason University
Linda Garcia, Office of Technology Assessment
S. Joy Mountford, Interval Research Corporation
Don Norman, Apple Computer, Inc.
Roy Pea, Northwestern University
Paul Evan Peters, Coalition for Networked Information
Virginia E. Rezmierski, University of Michigan
Leslie Sandberg, Institute for Telemedicine
Paul Young, National Science Foundation
Full information about the conference is available from the conference web pages at:
http://www.lanl.gov/LANLNews/Conferences/.sfc95/sfcHome.html
or send an e-mail message that looks like this:
To: rre-request@weber.ucsd.edu
Subject: archive send sfc-95
Poster sessions will be an important part of the conference. For full information on poster submissions, see the conference web pages or contact Doug Schuler <douglas@scn.org>. The deadline is April 1st.

Student scholarships are available as well; information is available from sfc95-students@lanl.gov.

The basic idea is to gather 250 people in a nice place to work hard and have fun learning how to take social issues into account when setting agendas for computer science research. We hope you can join us.

Phil Agre, UCSD
---------------------------------------------


