
JILT 1996 (2) - Peter Seipel

E Mackaay, D Poulin and P Trudel (Eds)

The Electronic Superhighway - The Shape of Technology and Law to Come

Computer/Law Series 18, The Hague: Kluwer Law International 1995
208pp - £62
ISBN 90 411 0135 7

Reviewed by
Professor Peter Seipel
University of Stockholm
peter.seipel@juridicum.su.se




Date of Publication: 7 May 1996

Citation: Seipel, P (1996), 'E Mackaay et al's The Electronic Superhighway - The Shape of Technology and Law to Come', Book Review, 1996 (2) The Journal of Information, Law and Technology (JILT). <http://elj.warwick.ac.uk/elj/jilt/bookrev/2seipel/>. New citation as at 1/1/04: <http://www2.warwick.ac.uk/fac/soc/law/elj/jilt/1996_2/seipel/>


Introduction

The era of discovery of computer law was the 1960s. White areas on the map were reconnoitred, described and related to the co-ordinate system of the law. The questions were straightforward: Should computer programs be regarded as unpatentable mental steps? Did storage on a computer tape mean that data were not visible representations of works? What clauses ought to be included in a computer purchase contract? Was it theft to steal computer time? Was mass storage of data in itself to be regarded as a new form of surveillance of the individual? Did computer programs constitute a special category of decision-making in public administration? Even conservative observers of these endeavours had to concede that automatic computing machines had brought about a number of legal complications. However, from the perspective of the conservatives, the new land - created by technological uplift - would soon be fully mapped and safely segmented into the stable categories of traditional law.

But there were early signs of a slightly different future. For example, in the early 1960s the Council on Library Resources in the U.S.A. sponsored a study of 'concepts and problems of libraries of the future' (Licklider 1965). The future was defined as the year 2000. Information technology of that time certainly imposed limitations (unimaginable to today's young Internet surfer) but the remote target date freed the study group 'to concentrate upon what man would like the nature of his interaction with knowledge to be'. The group managed to find a number of promising 'schemata' among early techniques and devices, for example random-access memory, cathode-ray-oscilloscope displays and light pens, list structures, procedure-oriented and problem-oriented languages, xerographic output units, and time-sharing computer systems with remote user stations. The group speculated about the possibility of people sitting at a desk, writing and drawing on a surface with a stylus and thereby communicating interactively with transformable information in a large computer memory. They discussed tools and techniques to support the knowledge work not only of the individual but also of teams of many people. The study group had a grim vision of the future of the traditional library: 'Indeed, if human interaction with the body of knowledge is conceived of as a dynamic process involving repeated examinations and intercomparisons of very many small and scattered parts, then any concept of a library that begins with books on shelves is sure to encounter trouble' (Licklider 1965, pp. 2, 5, 9, 100). But in the subject index of the report, among terms such as 'cognitive structure' and 'topological space analogy', one looks in vain for terms such as 'copyright', 'privacy' or 'law'.

Now we have moved close to the target date of the early library study group. We have learned that even the imaginative minds of three and four decades ago were not able to comprehend fully the scope and impact of the development of IT and its applications in society. We have come to understand - at least some of us have - that law and IT are intertwined in more complex ways than the early explorers could grasp, and that it is hardly possible to discuss IT in society without also devoting attention to its legal aspects. Briefly, the new information infrastructure comprises important legal elements and relationships. They contribute to the shaping of the information infrastructure and are themselves influenced and transformed by its other elements, for example the interconnectivity of data networks and the automation of various transactions. So, now we find ourselves on the dreamed-of electronic superhighway. The light pen turned out to be a mouse and the computer mutated into microchips, notebooks, message pads, intelligent networks, and what not. The 'topological space analogy' became 'cyberspace' and the interconnected teams of knowledge users became global, open and virtual. The only thing that appears unchanged is the persistent legal uncertainty. A recently published book presenting the proceedings of a 1994 conference in Montreal, Canada on 'The electronic superhighway - the shape of technology and law to come' bears witness to this.

The collection of contributions is divided into three parts, viz. 'Electronic superhighways - Environment', 'Electronic superhighways - Uses' and 'What law for superhighways?' It follows from the nature of the book - the proceedings of a one-day conference - that the topics are treated somewhat patchily and in varying detail. As the editors put it, the papers are an attempt to capture the flavour of a debate in midstream. Thus, someone looking for a complete textbook on the law of the Internet should look elsewhere. But someone interested in a discussion of the dynamics of IT law and the factors which contribute to shaping it will enjoy the book. However, it is up to the reader to see the contours of the whole and to bring the sometimes disparate elements together.

 

Part 1

Guy Basque introduces the Internet: its history, organisation (or lack of organisation), services, security (or lack of security), and so forth. The thirteen pages contain only a nutshell account, but it is well written and emphasises important aspects, for example the significance of the Internet as a means of communication and information exchange.

Larry Press discusses the two cultures of the Internet and interactive TV. Among other things, his contribution serves as a reminder that we are dealing not only with the development of technologies and applications but also with matters of sociology and with different, sometimes conflicting, traditions and orientations among the players. Press examines applications and data types, access ethic, economic orientation, and methods of capital formation. He confesses to a degree of discomfort with the interactive TV community as the controller of a technology that may have such a profound impact on society and our world view.

The third contribution to the presentation of the superhighway environment is by André H. Caron and concerns the attitudes of users to interactive TV and related services. The study describes usage of the so-called Vidéoway system in Montreal, Canada, first introduced in 1990. It offers glimpses of how future, more advanced interactive information services will be received by the general public.

 

Part 2

The second part of the book is devoted to uses of the superhighways. It contains - perhaps surprisingly, since use descriptions are the key to the legal issues - only two contributions. But these two, 'The Internet and Legal Information' by Tom R. Bruce and 'The Communication Highway and Academic Publishing' by Jean-Claude Guédon, have the advantage of both dealing with subjects of specific interest to the probable readers of the book and serving to illustrate broader issues and vistas.

Bruce's account of distributed hypertext in the field of law may be applied to other areas of knowledge as well, and the idea of 'add-on scholarship' may be extended to other kinds of 'add-on' phenomena. Bruce himself comments upon, among other things, teaching and symposia. His emphasis is on the growing importance of editorial work, i.e. work which aims at building up and making available 'contextual information about data resources which are only one mouse click away for the end user and very, very far apart by measures of physical distance or institutional culture.' One may wonder how much progress has so far been made on the 'superhighway' in terms of such road maps and traffic signs. It seems safe to assume that, at least from a theoretical point of view, quite little. Compare the following 30-year-old reminder from the previously cited study of the libraries of the future: 'It is of paramount importance not to think of relevance as a vague, unanalysed relation, but rather to try to distinguish among definite types and degrees of relevance. With such development, the concept of relevance networks might progress from its present unelaborated form to a systematic, analytic paradigm for organisation of the body of knowledge' (Licklider 1965, p. 63). How much of the hypertext structure of today's Internet can be said to be based on such a 'systematic, analytic paradigm'?

Guédon's article continues Bruce's discussion with an emphasis on scholarly electronic publishing [1]. Among other things, it explains how the data networks contribute to changing the whole established framework of knowledge work having to do with communication, retrieval, access, archives, legitimacy, economic investment and economic remuneration, and so forth. For example, it is worth thinking about what it would (will?) mean for the traditional publishers of scholarly journals to have to survive on fees from actual use of their articles, with no subscription fees from libraries. In this context Guédon cites statistics indicating that 80% of all articles are read, on average, by no more than two people and are never quoted.

One important part of Guédon's article treats what he calls 'the fluidization of documents'. Briefly, we are dealing with the transformation of traditional print into hypertextual and multimedia forms of information handling which allow the communication of, for example, dynamic data from scientific experiments and completely new forms of bibliographic information. In addition, authorship is changing in nature and may turn more and more into direct participation in dialogues, criticism, and collaboration on the network. In his concluding remarks Guédon proposes that we are now recreating some of the dimensions of knowledge work during the Middle Ages, such as glosses and debates (disputationes). Is law helpful in this process? Should it be? Part three of the book (making up about half of its contents) deals with such issues. Again, it may be noted that the text is not intended as a guidebook on the law of the Internet. Rather, the articles continue the discussion of many of the topics in the preceding parts. But the perspective is now distinctly legal, and IT and its uses are analysed from the point of view of the law.

 

Part 3: Karim Benyekhlef's Contribution

Karim Benyekhlef paints a panorama of legal issues in connection with what he calls dematerialised transactions on electronic pathways. The discussion is, however, narrower in scope than Guédon's on the 'fluidization of documents', since Benyekhlef limits himself to electronic transactions involving consumer-users of electronic services. Nevertheless, the range of issues is broad enough: evidence, consumer protection, civil liability, and protection of the right to privacy. To bring the themes together Benyekhlef attempts to structure the phenomenon of telematics, i.e. the coupling of computer and telecommunications technology. This means, above all, a brief presentation of the actors and of different types of telecommunications services. The discussion which follows focuses to some extent on Canadian law, for example certain provisions on evidence in the Quebec Civil Code. Nevertheless, even this discussion is of interest to an international audience, since the issues are certainly not peculiarities of national or regional regulation. For example, the Swedish Code of Judicial Procedure contains no rules on the production of evidence corresponding to the common law concepts of 'hearsay' and the 'best evidence rule'. Nevertheless, there are certainly problems of Swedish law - regarding, for example, the presumption of reliability of computer-generated evidence - which can easily be placed within the framework of Benyekhlef's discussion.

Naturally, the panorama promised by the title of the article does not permit much detail. For example, the one-and-a-half-page piece on liability issues can serve as little more than a reminder that this highly complex and important field exists and that it is going to become increasingly controversial as communication and information services proliferate and diversify.

Essential parts of the presentation are devoted to international data flows and certain issues of international private law and attempts at harmonising national laws. It may be said that these issues are of such fundamental concern that they ought to have been the subject of a separate article in the book. It is a serious question how much room there will be for more or less disparate national solutions to the issues of legal regulation of global data networks.

In his concluding remarks Benyekhlef brings up another question of regulatory strategy, viz. to what extent new legal norms ought to be independent of the state of the art of technology at a given point in time. This question can also be raised in relation to valid law: to what extent must we do away with parts of the law that depend on outmoded concepts of information handling? Copyright law, not least, gives rise to such concerns.

 

Part 3: Pamela Samuelson's Contribution

Pamela Samuelson's contribution 'Copyright, Digital Data, and Fair Use in Digital Networked Environments' is only ten pages long - but ten succinct and thought-provoking pages. At the outset she observes that copyright has survived previous new technologies, but that those technologies did not threaten the viability of the core concepts of copyright law. We are now facing a different situation. She goes on to describe how the digital medium may upset - should one not say has upset? - the complex classification system that copyright laws have developed. Works can be more than one kind of work. 'Bits are just bits', she states, '(and h)ow the bits are processed will determine what kind of work will be perceived to exist'. Imagine, for example, digital geographic data being used to produce a visible representation - or a lecture, or music, or a tactile experience (a small sketch of this point follows at the end of this section). For Samuelson the key phrase is 'the plasticity of works', which, of course, is the 'fluidization' and the 'dematerialization' that we met above, seen in yet another perspective. Samuelson emphasises the possibilities of digital data uses rather than the threats of copyright infringement. She concludes that fair use and fair dealing doctrines may become increasingly helpful as flexible instruments with which to balance the interests of authors and of owners of copies when the latter wish to 'enjoy the plasticity of works in the digital medium'. The final part of Samuelson's article raises the question of whether combinations of technological and contractual means for protecting digital works will not reduce the role of copyright to that of justifying such practices. In this context I am reminded of a study that I conducted in 1973-74 of the Swedish software industry. Of the responding software enterprises, 82% stated that they relied on contract clauses and 72% on practical security measures to protect their software, whereas only 22% claimed that they relied on copyright protection (Seipel 1975).

The reasons were probably to be found in uncertainty as to whether the Swedish Copyright Act applied at all to computer software. Today that kind of fundamental uncertainty has disappeared but, as Samuelson's article bears witness, other kinds of uncertainty have emerged. At the same time a new breed of 'practical security measures' - encryption, digital signatures, 'header contracts', automated reporting procedures etc. - has brought about new possibilities of controlling access to and use of works on digital networks. Thus it is not far-fetched to speculate that in digital network environments the principal role of copyright might become to protect encrypted works against decryption. But it is also easy to join Samuelson in her concluding remark that it is obviously too early to tell what impact digital technologies will have on copyright law.
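To make Samuelson's 'bits are just bits' point concrete, here is a minimal sketch of my own - not drawn from the book, with invented data and names - showing how one and the same byte sequence can be processed either as a small grayscale image or as a series of audio samples:

    # The same bytes, 'perceived to exist' as two different kinds of work.
    import math

    # Hypothetical digital data, e.g. elevation samples from a geographic survey.
    data = bytes(int(127 + 120 * math.sin(i / 5)) for i in range(64))

    # Processed one way, the bits form an 8x8 grayscale image (a visible work).
    image_rows = [list(data[r * 8:(r + 1) * 8]) for r in range(8)]

    # Processed another way, the same bits become audio amplitudes ('music'),
    # rescaled from byte values (0-255) to the range [-1.0, 1.0].
    audio_samples = [b / 127.5 - 1.0 for b in data]

    print('First image row:    ', image_rows[0])
    print('First audio samples:', [round(s, 2) for s in audio_samples[:8]])

Which kind of work 'exists' here - a map, a picture, a piece of music - is determined entirely by the processing applied, which is precisely the classificatory difficulty that Samuelson describes.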

 

Part 3: Richard Rosenberg's Contribution

So what about the impact of electronic networks on free speech and on related rights and values? These issues are dealt with by Richard S. Rosenberg and Pierre Trudel in the two final contributions. Rosenberg harshly criticises the practices of some university administrators who have, for example, restricted access to certain so-called newsgroups considered to distribute obscene or otherwise offensive information.

Some of the issues are legal-technical - for example, the question of whether steps taken by someone to prepare electronic newsgroups for accessibility are to be viewed as an act of publishing. Some questions concern overall strategies for electronic media as compared with print media. For example, is it acceptable to invoke the fact that certain forms of electronic communication can be controlled more effectively than journals and books as a reason for stricter control? Finally, some issues are political and moral and concern the formulation and interpretation of general principles of free speech. The three categories of issues tend to blend with one another and, to complicate things further, they also often involve conflicting rights and issues of their reconciliation with one another. In Rosenberg's article one finds many practical illustrations and experiences of such complications. He also suggests a set of basic principles, administrative and social, to deal with them. The question is: to what extent do they have to be complemented with legal norms in order to be effective?

 

Part 3: Pierre Trudel's Contribution

Pierre Trudel's article deals with the protection of rights and values in open-network management. Once again we meet a version of 'the fluidization phenomenon' - this time it has to do with 'the tendency to federate the various known contexts of information transmission and exchange'. To sum up: there is no longer either a conversation or a mass transmission - there can be both at once. One of the associated difficulties has to do with the potential of local networks to serve as gateways to a borderless, global networked environment. In consequence, the search for legitimate grounds for intervention becomes problematic, and so-called 'site policies' of network managers may have to be defended in a framework quite different from the one in which they were drawn up.

Trudel seeks to clarify the emergent norms (concerning attacks on reputation, threats to privacy, harassment, obscenity etc.) in electronic environments by analysing 'contexts' which make obvious the roles of the participants and the relations between them. For example, there are centrally controlled networks and there are relations between users and network managers. This is certainly helpful - although again 'fluidization' tends to blur the categories. In Trudel's words: 'In relation to present legislation on electronic media, such as television and radio, the open network environment exhibits greater volatility of roles played by the main actors. In effect, the roles played in computer communications are highly interchangeable. It is no longer possible to formulate a set of rights and obligations for each role on the assumption that the roles played by the various participants in communication are always constant'. For example, everyone can become an 'information supplier'. And a role which gives a right to monitor information may presuppose that the monitoring party also accepts the role of being held liable for harm caused by the information at issue.

Trudel's article, just like the others, ends in tentative and open-ended conclusions. He sees a need for a systematic analysis of the reasons for controlling the circulation of certain information. He recommends that increased attention be devoted to self-generated norms of conduct, and he refers to the different models of rule design and application suggested by Henry H Perritt: the authoritarian model, where the network service supplier sets rules; the democratic model, where rules are established on a voluntary basis; and the formal legal model, where contractual arrangements, statutes and administrative regulations define acceptable conduct (Perritt 1993). Trudel underlines that these models never exist in pure form and advocates flexible strategies.

 

Conclusion

To conclude, 'The Electronic Superhighway - The Shape of Technology and Law to Come' is well worth spending time on. It does give a good view of the 'new frontier, a community in which traditional rules do not exist or are open to question' and it does stimulate 'a debate on the society we want to live in' (as the editors put it in their Introduction). Its weakness - a common one in conference proceedings - is that it cannot compete with more systematic and comprehensive treatments of the subject. But, then, there are not many of these around, and all serious attempts to develop the law of the electronic frontier are most welcome.

 

Bibliography

Licklider, J C R (1965), Libraries of the Future (Cambridge: The MIT Press).

Seipel, P (1975), 'Software Protection and Law', Data, No 6, p 43.

Perritt, H H Jr (1993), 'Dispute Resolution in Electronic Network Communities', 38 Villanova Law Review 349, at 354.

 

Footnotes

 

[1] For those who take a special interest in this subject, Volume 11, Number 4 (October-December 1995) of the journal 'The Information Society' (ISSN 0197-2243) is worth consulting. It is a special issue on electronic journals and scholarly publishing.
