University of Bielefeld - Faculty of Technology
Networks and Distributed Systems
Research Group of Prof. Peter B. Ladkin, Ph.D.
This page was copied from: http://catless.ncl.ac.uk/Risks/16.13.html
ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator
RISKS is about to enter its customary summer slowdown period, including the next month or so during which I will have only sporadic net access. Please do not expect to get rapid response to your requests or frequent issues. However, please keep the incisive contributions coming, and I will get to them whenever possible. I wish you all a peaceful and productive summer season. PGN
As reported on BBC Radio, Wednesday 8 June 1994: A train travelling through the Channel Tunnel from England to France was evacuated after an emergency light came on in the driver's cabin. The drivers of the 10 lorries [Amer.: trucks (JC)] being carried on the train were evacuated to the English end of the Tunnel through the access tunnel. This was the first emergency in the Channel Tunnel, which officially opened last month. It turned out that the emergency was a false alarm.

John Colville, visiting Brunel University (until June 1994; from the University of Technology, Sydney)
John.Colville@brunel.ac.uk, +44 895 274000 ext 2561
The Worcester (MA) Telegram of Saturday, May 28, carries a front-page article regarding "a mistake by the city Election Commission [which] nearly cost [name deleted] his wife." A week before, a city resident "opened his mail to find a computer-generated census form from the city." Apparently he didn't look at the form closely, and when he handed it to his wife for her signature, she noticed that the name of the person listed as living with him at the same address was not hers. She had just returned from one of many extended visits with an out-of-town daughter.

"'Who is this Beverly that's living with you?' his wife, Barbara, demanded. 'Come on, Barbara ... nobody is living with me,' he shot back. 'Well, they got it right here,' she countered, pointing to the name Beverly A. [different last name] on the form. 'If I find out, I'm going to get a divorce.'"

After a call to the city Election Commission, he discovered that the presence of the Other Woman was an error that may be attributed to the fact that he lives in an apartment building for senior citizens, and the Other Woman lives in another apartment in the same building. A spokesman for the Election Commission said that if the victim "crosses out the incorrect information and mails the form back to City Hall, it will be corrected in the computer."

The victim "says he has no intention of signing and sending in the erroneous form - and if his name is removed from the voter registration list, which is the penalty for failing to respond to the census, 'somebody else is going to have a lot more trouble.'" "'I'm not going to send it in. They make too many mistakes, and I'm not going to rectify their mistakes,' he said. 'I can't see why people have to keep paying for their mistakes all the time.'" He says this is the "last straw."

The RISKS? If the City Election Commission relies solely on returned census forms, accidental and intentional inaccuracies will remain. Given the current economic climate in the Northeast, it's not likely local government will verify the information on every returned census form containing corrections. Another risk is the perceived apathy on the part of the public: if this fellow refuses to correct and return the form, how is the Election Commission going to correct the error? What if others follow his lead? A third risk was mentioned by the Other Woman, who had not received a census form this year: "The census form is supposed to be confidential, but went to the wrong place."
Several perfect examples of software risks here. First is running a security mailing list with software that has security problems. Then there is the ever-popular quick change to complicated software that generates unexpected (and annoying) results. Finally, there were the users sending messages to the list complaining about the duplicate messages, which were promptly forwarded to the list, in triplicate.

From the keyboard of bugtraq-owner@cscns.com:

> Sorry. I made some changes to the list code because of a serious
> security hole found within the list software (a popular one).
> This apparently caused things to go bonkers. Please stay patient.
> The author of the mailing list software will release information
> regarding the security vulnerability within the next few days.
> --Scott

Hope you found this as amusing as I did.

Jim Patterson, Photon Research Assoc., jhp@photon.com
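The triplicate complaints are a textbook mail loop. A minimal sketch of one standard defence, assuming the X-Loop marker-header convention (used by procmail and several list managers); the list address and message handling below are invented for illustration, not taken from the software in the incident:

    # Sketch: refuse to redistribute a message that has already passed
    # through this list, using an X-Loop marker header. Illustrative only.
    from email.message import EmailMessage

    LIST_ADDR = "bugtraq@example.com"   # hypothetical list address

    def should_distribute(msg: EmailMessage) -> bool:
        # A message stamped with our own X-Loop value has looped: drop it.
        return LIST_ADDR not in (msg.get_all("X-Loop") or [])

    def explode(msg: EmailMessage, subscribers):
        if not should_distribute(msg):
            return []                 # silently discard the looping message
        msg["X-Loop"] = LIST_ADDR     # stamp before redistribution
        return [(addr, msg) for addr in subscribers]

With such a stamp, complaints that quote the list's own traffic die at the exploder instead of fanning out again.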
I'm currently involved in a software development project, writing software to assist medical diagnosis from brain scans. Although I've ploughed through around 50 pages of documentation now, without actually coding a single line, I decided it was time to whip up some "vapourware" snapshots of the screen I was intending to use. These I created in a paint program, and whisked off to the supervisor for a critical review, etc. Paper can only say so much. The response I got was largely as expected: some suggestions for layout, color schemes, font sizes, etc. Until I got to section 6:

----
5  yellow cross-hairs stand out better - is the red arrow there for some
   reason? Not apparent what (or if).

6  Not knowing what all the options were, I tried playing with a few. I
   certainly ended up with some images I did not expect. I figured I should
   be able to get back to where I started by a judicious choice of
   operations. Alas - not so judicious. I certainly did not expect to get
   the text in mirror-image form, even if the picture was. I also thought
   'normal' would redraw the original image, but this was not so. Is there
   some easy option to get back to where you started?
----

I was quite taken aback. I had only submitted a GIF, but my supervisor was suddenly telling me about all the "options", etc. Gollies, a GIF with a built-in OS-independent GUI... It turns out the supervisor had been using "xv" to view the GIF, had accidentally clicked the right mouse button on the picture at some stage, opening the "xv" edit screen, and had assumed that the new window was part of the submission, etc. (Gosh, that _IS_ overwork.)

The RISK? Having a totally foreign program pretending to be mine, without any intent. Was "xv" too user-friendly? Gollies, I was subsequently told that if I hadn't questioned the response, it would have been assumed I was working extra hard.. :) We all had a good chuckle over it, and went back to work :)

[Hmm, I suppose that someone is going to come along and refer to "xv" as the FIFTEEN program, which is NINE more than "vi", which is sometimes mistakenly referred to as the SIX EDITOR (according to a sighting reported to me by Caveh Jalali)! PGN]
Regarding a recent RISKS posting of the article "THE GHOST IN THE MODEM" by Richard Sclove and Jeffrey Scheuer of The Loka Institute: I fail to see how the rather insightful analysis of the impacts of government meddling in favor of highway-based transportation supports the conclusions:

> Three lessons for the construction of the information superhighway
> suggest themselves:
>
>  o _No Innovation Without Evaluation_: ... federal government should
>    mandate evaluated social trials ...
>
>  o _No Innovation Without Regulation_: We should conserve cultural
>    space ... modest tax on electronic home shopping and consumer
>    services ...
>
>  o _No Innovation Without Participation_: ... the Danish government
>    appoints panels of everyday citizens ...

These conclusions would lead us down the same road of the government using regulatory and taxing powers to distort the application and development of technology. A common mistake is for people to observe or foresee a danger and demand that government do something about it. They fail to take into account the risks of getting government involved, with its tax and regulation sledgehammers. This is appropriate when the dangers to be mitigated are great enough, but it is not something to be suggested lightly. In many cases, people may be better off accepting the risk of the danger than putting up with the effects of government involvement. The kind of preemptive government involvement advocated in this case is particularly questionable, since the dangers aren't really known, and premature government meddling may actually make things worse (as the authors argue may have happened in the case of transportation).

Scott Dickson, scott@ontek.com
The proper solution is not to increase the number of modes, but to decrease the number of modes -- to ONE. `Less advanced' aircraft, which I believe include Boeing aircraft, use their sophisticated flight control systems to make the plane behave like a simple plane with simple controls. `Fly by wire', to their designers, means `make the best, and best-behaved, simulation of a plane with direct linkage from controls to control surfaces.' `More advanced', in the Airbus case, means `having a greater chance for error.' This has been debated in the aircraft industry, but the FAA has not seen fit to refuse certification to Airbus. My own belief, unsupported by professional qualification of any kind, is that this is a serious mistake, perhaps unavoidable due to the political nature of the Airbus Consortium.

If I understand correctly, Airbus was forced to use these multimode control systems because some of its aircraft use sidestick controllers. Sidestick controllers provide very little range of motion compared to the traditional yoke (a wheel on a moving stick/post), and the pilot effectively flies using pressure rather than position of the sidestick. This is fine during normal flight, but during takeoff and landing the control surfaces must be moved through large excursions, and the additional control modes are needed to allow this.

There is another serious problem with the control mechanism described: the autopilot used one set of control surfaces (stabilizer trim) while the pilot continued to operate another (the elevators). In effect, the aircraft and its inherent flight laws provide the summing of the control inputs. For such an arrangement to work at all, the summing process must remain linear, must not interfere with the flight of the aircraft in other ways, etc. Clearly, this did _not_ happen. An unqualified person such as myself has to wonder what happens when the stabilizer is canted one way and the elevator is canted the other. The result, it seems to me, should be excess drag and disrupted airflow, leading to reduction or loss of control response to both inputs.

There is a third problem: the pilot has no indication through his controls that the autopilot -- in effect, the aircraft's control laws -- is actively working against him. In an emergency, there are only two things he is likely to pay attention to: the primary instruments and the controls under his hands and feet. If he were to put the plane to the edge of a stall, either the natural buffet of the stall or an artificial stick-shaker designed to simulate the buffet would alert him. _That_ is probably the proper place to inform the pilot that he is working the aircraft against the edge.

I don't fly much, but if I find myself booked on an Airbus plane I will look for an alternate booking.

Mark Terribile, mat@mole-end.matawan.nj.us, Somewhere in Matawan, NJ
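The summing argument can be made concrete with a toy linearized pitch model. This is a sketch under stated assumptions only: the coefficients are invented, small-angle linearity is assumed, and it is not a model of any real aircraft. It merely shows how an autopilot-trimmed stabilizer and an opposing pilot elevator input can cancel in net pitching moment while both surfaces fight each other:

    # Toy linear model: net pitching moment = stabilizer term + elevator term.
    # Coefficients are invented for illustration, not real aerodynamic data.
    C_M_STAB = -1.2   # moment per degree of stabilizer trim
    C_M_ELEV = -0.6   # moment per degree of elevator deflection

    def net_pitch_moment(stab_deg, elev_deg):
        # Small-angle assumption: the two contributions simply add.
        return C_M_STAB * stab_deg + C_M_ELEV * elev_deg

    # Autopilot trims 5 degrees one way; the pilot holds 10 degrees of
    # opposite elevator. The net moment is zero, yet both surfaces are
    # hard over against each other: extra drag, disturbed airflow, and
    # no tactile cue through the controls that the trim is fighting him.
    print(net_pitch_moment(5.0, -10.0))   # -> 0.0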
Earlier I wrote re Clipper LEAF spoofing:

>> IMHO this is almost meaningless since *both* ends will need to do this
>> (AFAIR each side sends a LEAF. If only one LEAF is spoofed, it will
>> just be necessary for a legal tapper to use the other one).

and received the following correction:

> According to a talk here at MIT given by two NSA mathematicians the
> day the NYT broke the story (a happy coincidence :-), this is not
> true. Using Capstone/Tessera, only one LEAF is sent, by the
> originating party.

I wrote:

>> Dr. Blaze also mentioned that it would take about 20 minutes to come
>> up with a valid checksum. Much easier would simply be to record a
>> valid LEAF from another chip and use that.

and was again corrected:

> Can't do that, as Perry said. The LEAF is a function of the session
> key, so it needs to be spoofed for each session.

What appears to happen (after reading the draft of Dr. Blaze's paper, but with some inference - any mistakes are mine) is that the receiving Baltimore Clipper decodes the LEAF sent by the originator using the family key, replaces the sender's unit-key-encrypted session key with its own clear session key, mixes in the originator's unit id (also in the LEAF), calculates a new checksum, and compares this with the one received. The obvious implication is that what Dr. Blaze was altering may not necessarily have been the checksum (IMHO it was the unit ID), but probably was the last sixteen bits of the LEAF. I leave the implications to those who have the good fortune to have received samples.

But then again it is all moot, since IMHO the gov (or at least the Clips; Billary may not have been completely filled in) do not intend to use the LEAF anyway.

Padgett

[Various similar comments are not included here.]
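The checksum attack is easiest to see in miniature. The sketch below is a toy model, not the real (classified) LEAF format: the hash, field sizes, and family key are stand-ins; only the 16-bit acceptance check mirrors the description above. Because the receiver compares just 16 bits, random trial LEAFs succeed after about 2**16 attempts:

    # Toy LEAF check: the receiver recomputes a 16-bit checksum over values
    # it knows and compares it with the one carried in the LEAF. Everything
    # except the width of that comparison is a stand-in.
    import hashlib, os

    FAMILY_KEY = b"family-key-shared-by-all-chips"   # illustrative value

    def checksum16(session_key, unit_id):
        digest = hashlib.sha256(FAMILY_KEY + session_key + unit_id).digest()
        return int.from_bytes(digest[:2], "big")     # 16-bit check value

    def receiver_accepts(unit_id, check, session_key):
        return check == checksum16(session_key, unit_id)

    # Forgery: try random LEAFs until the 16-bit comparison happens to pass.
    session_key = os.urandom(10)
    trials = 0
    while True:
        trials += 1
        unit_id = os.urandom(4)
        check = int.from_bytes(os.urandom(2), "big")
        if receiver_accepts(unit_id, check, session_key):
            break
    print("forged after", trials, "trials (expect about 2**16 = 65536)")

The forged LEAF passes the receiver's check, yet decrypting it under the family key yields garbage instead of the escrowed session key, which is precisely what defeats the wiretap.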
A. Padgett Peterson (padgett@tccslr.dnet.mmc.com) wrote:

> IMHO this is almost meaningless since *both* ends will need to do
> this (AFAIR each side sends a LEAF. If only one LEAF is spoofed,
> it will just be necessary for a legal tapper to use the other
> one).

As far as I am concerned, this is why the flaw Dr. Blaze discovered is so damning. If the effect of an authorized wiretap on an EES phone is that the correspondents [?] key is removed from escrow, this destroys the entire purpose of the system. For example, a call in a mobile system would involve a chip in the phone and a chip in the base station. If the result of a legal wiretap on a LEAF-blower-equipped mobile phone is that the keys of all the base stations in the vicinity are turned over to the tappers, the escrow system protects no one.

> Dr. Blaze also mentioned that it would take about 20 minutes to
> come up with a valid checksum. Much easier would simply be to
> record a valid LEAF from another chip and use that.

This is a double misapprehension. First, the twenty-minute delay (divided by the number of chips used) is only associated with some modes of use. Second, the problem is not in collecting LEAFs, but in finding one that will be accepted by your correspondent for THIS call. In some cases that will be doable in advance, in others it will be easy, and in a few cases it may have to be done "the hard way" after the call has been initiated.

> The most important element is that the SKIPJACK algorithm is in no
> way affected by this and is as strong as ever; only the
> government's ability to use the LEAF may be compromised.

Not quite true. Being able to "peer through" the crypto protection is usually the first step in a complete crack. In this case it opens up (actually makes more effective) avenues of attack which, while they seem computationally infeasible today, may shorten the effective lifetime of the EES system.

Robert I. Eachus
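The "twenty minutes, divided by the number of chips" remark is just the arithmetic of searching a 16-bit space. In this back-of-envelope sketch, only the 2**16 search space comes from the LEAF design; the per-trial rate is an assumption chosen to land near the quoted figure:

    # Back-of-envelope: time to exhaust a 16-bit checksum space.
    SEARCH_SPACE = 2 ** 16      # from the LEAF's 16-bit checksum
    TRIALS_PER_SEC = 55         # assumed validation rate of one chip

    def minutes_to_forge(chips=1):
        # Chips can search disjoint parts of the space in parallel.
        return SEARCH_SPACE / (TRIALS_PER_SEC * chips) / 60.0

    print(minutes_to_forge(1))  # ~19.9 minutes with one chip
    print(minutes_to_forge(4))  # ~5.0 minutes with four chips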
I have created a WWW Virtual Library page on safety-critical systems, including links to on-line RISKS information, under:

  http://www.comlab.ox.ac.uk/archive/safety.html

Relevant information with URL links for possible inclusion, preferably in HTML format, is welcome.

Jonathan Bowen, Oxford University Computing Laboratory
http://www.comlab.ox.ac.uk/oucl/people/jonathan.bowen.html
Subject: "Network Security Secrets" by Stang BKNTSCSE.RVW 940324 International Data Group 155 Bovet Road, Suite 310 San Mateo, CA 94402 USA tel 415-312-0650 fax: 415-286-2740 "Network Security Secrets", Stang, 1993, 1-56884-021-7, U$49.95/C$64.95 norman@digex.com It's hard to have confidence in a book on data security that starts off with two examples of "back doors" -- neither of which have the slightest connection to back doors. Then, there is the example of radiation risks which cites a study probably more related to chemical exposure. Do not lose heart, however. These are apparently aberrations in what is otherwise a very practical and down-to-earth security manual. That opening chapter on specific examples starts a section on risk analysis. Security mavens may find it lacking in rigour and overlong in verbiage, but most micro/LAN/office managers have little formal training in the formal aspects of data security and need the example after example approach. The questionnaires and quizzes probably drive the point home as well as, or better than, Stang's love of charts. Part two is a bit weaker. It opens with a questionable chapter on the "players" in the game: data security versus the hackers. There is a chapter on the security aspects of network design. Given Stang and Moon's background in the NCSA the two virus chapters are no surprise, and generally good. Part three gives more detail (*lots* of detail in the tabular reviews of chapter 12) on security solutions, policies and products. Not exhaustive: the password chapter, in looking at security *breaking* programs, makes no mention of the HACK program for obtaining any passwords used over Ethernet or the KNOCK program which exploits a bug in versions of NetWare which allows anyone to gain SUPERVISOR access without passwords. Still, there is much practical help for the LAN manager here. Part four looks at specific network operating systems and the security features and functions thereof. As well as comparisons of the different systems, there are chapters collecting the security commands and concepts of each. These are handy, but not necessarily more so than the original documentation. NetWare "effective rights", for example, continue to bedevil LAN managers using Novell's software. All the parts are included here, but the calculation of effective rights is not deal with. (There is also one "oops". The chapter on UNIX security contains a description of the VMS "WANK" worm.) Part five looks at ways to implement security. An unusual, but very valuable, section in the chapter on training is an extensive list of training available in a variety of security related areas. (Most of the virus courses seem to be given by an outfit called Norman Data Defense Systems. Oh well.) Part six briefly describes the shareware files on the included disks. A series of appendices primarily give contact information. It'll be difficult to get a busy LAN manager to sit still long enough to read this. But it'll be worth it. copyright Robert M. Slade, 1994 BKNTSCSE.RVW 940324 DECUS Canada Communications, Desktop, Education and Security group newsletters Editor and/or reviewer ROBERTS@decus.ca, RSlade@sfu.ca, Rob Slade at 1:153/733 BCVAXLUG ConVAXtion, Vancouver, BC, Oct. 13 & 14, 1994 contact vernc@decus.ca
Bruce Sterling has released 'The Hacker Crackdown' electronically. It is available via WWW at

  http://www.eecs.nwu.edu/hacker_crackdown/index.html

(This is a mirror of http://www.scrg.cs.tcd.ie/scrg/u/bos/hacker/hacker.html)

[But be sure to read the preface about the copyright and why and how Bruce is also making the book available on-line. I have excerpted the beginning of the preface, because it is too long to include in full. PGN]

The Hacker Crackdown

Preface to the Electronic Release of The Hacker Crackdown

Out in the traditional world of print, The Hacker Crackdown is ISBN 0-553-08058-X, and is formally catalogued by the Library of Congress as "1. Computer crimes -- United States. 2. Telephone -- United States -- Corrupt practices. 3. Programming (Electronic computers) -- United States -- Corrupt practices." `Corrupt practices,' I always get a kick out of that description. Librarians are very ingenious people. The paperback is ISBN 0-553-56370-X.

If you go and buy a print version of The Hacker Crackdown, an action I encourage heartily, you may notice that in the front of the book, beneath the copyright notice -- "Copyright (C) 1992 by Bruce Sterling" -- it has this little block of printed legal boilerplate from the publisher. It says, and I quote:

"No part of this book may be reproduced or transmitted in any form or by any means, electronic or mechanical, including photocopying, recording, or by any information storage and retrieval system, without permission in writing from the publisher. For information address: Bantam Books."

This is a pretty good disclaimer, as such disclaimers go. I collect intellectual-property disclaimers, and I've seen dozens of them, and this one is at least pretty straightforward. In this narrow and particular case, however, it isn't quite accurate. Bantam Books puts that disclaimer on every book they publish, but Bantam Books does not, in fact, own the electronic rights to this book. I do, because of certain extensive contract maneuverings my agent and I went through before this book was written. I want to give those electronic publishing rights away through certain not-for-profit channels, and I've convinced Bantam that this is a good idea.

Since Bantam has seen fit to peaceably agree to this scheme of mine, Bantam Books is not going to fuss about this. Provided you don't try to sell the book, they are not going to bother you for what you do with the electronic copy of this book. If you want to check this out personally, you can ask them; they're at 1540 Broadway, NY NY 10036. However, if you were so foolish as to print this book and start retailing it for money in violation of my copyright and the commercial interests of Bantam Books, then Bantam, a part of the gigantic Bertelsmann multinational publishing combine, would roust some of their heavy-duty attorneys out of hibernation and crush you like a bug. This is only to be expected. I didn't write this book so that you could make money out of it. If anybody is gonna make money out of this book, it's gonna be me and my publisher.

My publisher deserves to make money out of this book. Not only did the folks at Bantam Books commission me to write the book, and pay me a hefty sum to do so, but they bravely printed, in text, an electronic document the reproduction of which was once alleged to be a federal felony. Bantam Books and their numerous attorneys were very brave and forthright about this book.
Furthermore, my former editor at Bantam Books, Betsy Mitchell, genuinely cared about this project, and worked hard on it, and had a lot of wise things to say about the manuscript. Betsy deserves genuine credit for this book, credit that editors too rarely get. [...]
Copy by Michael Blume