Tracking and Sniffing

Internet technology has spawned a plethora of devices and software for identifying, tracking, collating and retrieving information, often including personal information. The most common of these are rather endearingly referred to as cookies, spiders and web bugs, although other names exist.

Cookies and Web Bugs

Cookies are data files which reside on a user's computer hard drive. They are deposited on the hard drive and retrieved when the user visits the same website again. The information stored is used to convey the user's preferences and, in some cases, his/her personal details. Cookies are, however, also used for legitimate purposes, including enabling "shopping carts" when buying online.

Session cookies are temporary in nature and only retain data about a user while he or she is present on that particular website. They are thus less invasive. However, stored cookies are more permanent and retain data from repeated visits to a particular website.
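
To put the distinction in concrete terms, the following is a minimal sketch (in Python, using only the standard library) of how a server might construct a session cookie and a stored cookie. The cookie names and values are invented for illustration and are not drawn from any site discussed here.

```python
from http.cookies import SimpleCookie

# Session cookie: no Expires/Max-Age attribute, so the browser discards
# it when the browsing session ends.
session_cookie = SimpleCookie()
session_cookie["session_id"] = "abc123"          # hypothetical value
session_cookie["session_id"]["path"] = "/"

# Stored (persistent) cookie: Max-Age keeps it on the user's hard drive
# across visits, which is what lets a site recall preferences later.
stored_cookie = SimpleCookie()
stored_cookie["preferences"] = "en-NZ"           # hypothetical value
stored_cookie["preferences"]["path"] = "/"
stored_cookie["preferences"]["max-age"] = 60 * 60 * 24 * 365   # one year

# Each would be sent to the browser as a Set-Cookie header.
print(session_cookie.output())
print(stored_cookie.output())
```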

Web bugs are similar. They are small pieces of programming code embedded as tiny graphics files, effectively invisible to the human eye. They allow others to monitor who is accessing a website and to record details of the Internet protocol (IP) address – the user's unique identifier.
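
By way of illustration only, the sketch below shows the basic mechanics of a web bug server: it returns a transparent 1x1 GIF and records the IP address of whoever requests it. The address tracker.example, the port and the log format are assumptions made for the example and do not come from any of the cases discussed.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# A minimal transparent 1x1 GIF -- the "tiny graphics file" referred to above.
PIXEL = (b"GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00\xff\xff\xff"
         b"\x21\xf9\x04\x01\x00\x00\x00\x00"
         b"\x2c\x00\x00\x00\x00\x01\x00\x01\x00\x00"
         b"\x02\x02\x44\x01\x00\x3b")

class WebBugHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The requesting IP address (the "unique identifier" referred to
        # above) is visible to the server on every request for the image.
        ip, _port = self.client_address
        print(f"web bug requested by {ip} for {self.path}")
        self.send_response(200)
        self.send_header("Content-Type", "image/gif")
        self.send_header("Content-Length", str(len(PIXEL)))
        self.end_headers()
        self.wfile.write(PIXEL)

if __name__ == "__main__":
    # A page embedding <img src="http://tracker.example/bug.gif">
    # (a hypothetical address) would cause every visitor's browser to
    # request the image from this server.
    HTTPServer(("127.0.0.1", 8000), WebBugHandler).serve_forever()
```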

eBay

A useful source of information on the technical issues can be found in the US decision eBay, Inc v Bidder’s Edge, Inc (US District Court, ND Cal, No. C-99-21200 RNW, 24 May 2000). See online at http://legal.web.aol.com/decisions/dldecen/ebayorder.pdf

eBay operates a massive live auction site with some 7 million registered users. It adds approximately 400,000 new items each day.  Bidder’s Edge used a variety of technical tools (variously called software robots, robots, spiders and web crawlers) to access auction information on eBay’s site, for transmission to its own customers.

eBay used a series of robot exclusion headers, implementing the "Robot Exclusion Standard" by means of a data file called "robots.txt". However, certain operators, including Bidder's Edge, disregarded the standard and avoided the technical "blocks". In doing so they also ignored the warning on eBay's website that web crawlers and the like were unwelcome.
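
The Robot Exclusion Standard is purely advisory, which is why it could be disregarded: a crawler must choose to read robots.txt and honour it. The sketch below, which uses a hypothetical robots.txt rather than eBay's actual file, shows how a compliant crawler would perform that check with Python's standard library.

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt of the kind a site wishing to keep crawlers
# out might publish (the actual eBay file is not reproduced in the judgment).
ROBOTS_TXT = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# A well-behaved crawler checks before fetching each page; nothing in
# the standard itself prevents a crawler from simply ignoring the answer.
for path in ("/", "/listings/item123"):
    allowed = parser.can_fetch("ExampleCrawler", path)
    print(f"{path}: {'allowed' if allowed else 'disallowed'}")
```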

Bidder's Edge used proxy server software which allowed outgoing requests to be routed through unprotected proxy servers so that they appeared to originate from the proxy server, thereby getting past the website's filter or block. Bidder's Edge accessed the eBay site approximately 100,000 times a day, giving an indication of the magnitude of the exercise or problem, depending on which way you look at it.
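
The filter or block in question is, in essence, a list of source addresses the server refuses to serve. The simplified sketch below (all addresses are invented) illustrates why routing requests through proxy servers defeats such a block: each request arrives bearing the proxy's address rather than the blocked crawler's.

```python
# Hypothetical, simplified illustration of an IP-based block of the kind
# referred to above; all addresses are invented.
BLOCKED_ADDRESSES = {"203.0.113.50"}   # the crawler's real address

def accept_request(source_ip: str) -> bool:
    """The server only ever sees the address a request arrives from."""
    return source_ip not in BLOCKED_ADDRESSES

# A request made directly from the blocked crawler is refused...
print(accept_request("203.0.113.50"))   # False

# ...but the same request routed through an unprotected proxy arrives
# bearing the proxy's address, so the block is never triggered.
print(accept_request("198.51.100.7"))   # True
```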

eBay sought a preliminary injunction, relying primarily on the tort of trespass.  It argued, by analogy, that the defendant’s activities were the rather frightening equivalent of sending an army of 100,000 robots a day to check the prices in a competitor’s store.

Without necessarily accepting the analogy, the Court noted at page 12:

“If eBay were a brick and mortar auction house with limited seating capacity, eBay would appear to be entitled to reserve those seats for potential bidders, to refuse entrance to individuals (or robots) with no intention of bidding on any of the items, and to seek preliminary injunctive relief against non-customer trespassers eBay was physically unable to exclude.”

The Court then went on to conclude:

“The analytical difficulty is that a wrongdoer can commit an ongoing trespass of a computer system that is more akin to the traditional notion of a trespass to real property, than the traditional notion of a trespass to chattels, because even though it is ongoing it will probably never amount to a conversion.  The Court concludes that under the circumstances present here, BE’s ongoing violation of eBay’s fundamental property right to exclude others from its computer system potentially causes sufficient irreparable harm to support a preliminary injunction.”

This approach was followed in Oyster Software v Forms Processing (2001 WL 1736382, ND Cal, December 6, 2001).

However, more recently it was not followed in the Central District of California in Ticketmaster v Tickets.com. The decision is available online at http://www.netcoalition.com/keyissues/2003-06-12.430.pdf

Also see article by Frankfurt, Kurnit, Klein and Setz PC on 17 July 2003 at http://www.worldebusinesslawreport.com/index.cfm?action=login&c=17801&id=2157

The plaintiff failed in part because of a lack of proof of harm, an issue that turns on whether using up existing computer capacity is sufficient to found the action. Judge Harvey at p308 of the text “internet.law.nz” (Lexis Nexis, 2003) suggests that it is not unreasonable to conclude that the rationale in eBay could be extended to New Zealand. Applying the flexible constructs of the tort of trespass, this seems correct.

Pharmatrak

The US Court of Appeals for the First Circuit has just found that web bugs and cookies may violate the Electronic Communications Privacy Act (ECPA).  See article by McCarter and English LLP in World eBusiness Law Report, 13 June 2003 at http://www.worldebusinesslawreport.com/index.cfm?action=login&c=17801&id=2040

Pharmatrak used stored cookies to collect data on users visiting the websites of various pharmaceutical companies. Its web bugs identified each user's internet address and sent the user a cookie. Even though the reports it produced were generic (non-personalised) in nature, Pharmatrak's servers were found to contain at least some personal information (on some 232 users among the millions tracked).

Even so, the Court found that, although consent had been given by Pharmatrak's clients, the collection of this data fell outside the consent provided. As such, the Court found that the ECPA may have been breached in that there had been an "interception" of an electronic communication. However, as it also needed to be shown that the interception was intentional, the matter was remanded to the trial judge to determine whether it had occurred inadvertently or intentionally. The full decision in In Re Pharmatrak, Inc (US Court of Appeals, First Circuit, No. 02-2138) is available online at http://caselaw.lp.findlaw.com/scripts/getcase.pl?navby=search&case=/data2/circs/1st/022138.html

Scraping

Scraping involves gathering information from websites and re-using it. In American Airlines Inc v Farechase Inc (District Court, Tarrant County, Texas, No. 067-194022-02, 8 March 2003; a copy of the temporary injunction is available at http://www.eff.org/Cases/AA_v_Farechase/20030310_prelim_inj.pdf), the defendant was enjoined from accessing American Airlines' website. The defendant had accessed and obtained information from various travel websites – in particular travel schedules and fare information – a practice known as "screen scraping". As in the eBay case, the court found that this practice was unacceptable in that it placed an undue burden on American Airlines' computer system.
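
In technical terms, screen scraping simply means fetching the ordinary HTML pages served to browsers and extracting the wanted fields from the markup. The sketch below parses an invented fragment of fare-style markup, not any airline's actual pages, to show the basic mechanics.

```python
from html.parser import HTMLParser

# An invented fragment of the kind of markup a fare page might serve;
# no actual airline site is reproduced here.
PAGE = """
<table>
  <tr class="fare"><td>AKL-WLG</td><td>$129</td></tr>
  <tr class="fare"><td>AKL-SYD</td><td>$349</td></tr>
</table>
"""

class FareScraper(HTMLParser):
    """Collects the text of each cell in rows marked as fares."""
    def __init__(self):
        super().__init__()
        self.in_fare_row = False
        self.rows, self.current = [], []

    def handle_starttag(self, tag, attrs):
        if tag == "tr" and ("class", "fare") in attrs:
            self.in_fare_row = True
            self.current = []

    def handle_endtag(self, tag):
        if tag == "tr" and self.in_fare_row:
            self.rows.append(self.current)
            self.in_fare_row = False

    def handle_data(self, data):
        if self.in_fare_row and data.strip():
            self.current.append(data.strip())

scraper = FareScraper()
scraper.feed(PAGE)
print(scraper.rows)   # [['AKL-WLG', '$129'], ['AKL-SYD', '$349']]
```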

The Court found:

“FareChase intentionally and without authorization from American continues to interfere with American’s possessory interest in its own computer system.  Fare Chase’s conduct intermeddles with and interferes with American’s personal property.  Such conduct constitutes a trespass.”

The Court also found that this conduct not only placed an unacceptable strain on American’s computer system but also breached the website’s terms and conditions.

Also see article by Skadden, Arps, Slate, Meagher and Flom LLP in World eBusiness Law Report, 24 April 2003 at  http://www.worldebusinesslawreport.com/index.cfm?selectedpub=1,8&action=dsp_item&id=1891

In New Zealand, the organisation Trade Me took similar action and managed to get a competitor TradeWise to stop scraping auction information from its site.  See article by Russell McVeagh in World eBusiness Law Report, 9 May 2003 at http://www.worldebusinesslawreport.com/index.cfm?action=login&c=17801&id=1925

In the EU, member states are required to introduce an “opt-out” option on or before 31 October 2003.  This means that users must be given the necessary information and allowed to opt out of having cookies on their system. See Linklaters newsletter – Spam Busters? The Implementation of the Directive on Privacy and Electronic Communications 20 June 2003 at http://www.linklaters.com/newsanddeals/newsdetail.asp?newsid=1469&navigationid=6

Web bugs can usually only be detected by examining the HTML code and are very difficult to disable. Indeed, most users are unaware of their very existence. They do, however, raise real privacy concerns, given that they are adapted to intercept and monitor communications and have as much invasive capability as robots and other data-gathering tools.
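
Because detection requires inspecting the HTML itself, one common heuristic is to look for images declared as a single pixel square. The sketch below applies that heuristic to an invented page fragment; real web bugs vary, so the check is indicative only.

```python
from html.parser import HTMLParser

# Invented page fragment: one ordinary image and one 1x1 tracking pixel.
PAGE = """
<img src="/images/logo.png" width="120" height="60">
<img src="http://tracker.example/bug.gif" width="1" height="1">
"""

class WebBugFinder(HTMLParser):
    """Flags images declared as 1x1 -- a common web-bug signature."""
    def __init__(self):
        super().__init__()
        self.suspects = []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        a = dict(attrs)
        if a.get("width") == "1" and a.get("height") == "1":
            self.suspects.append(a.get("src"))

finder = WebBugFinder()
finder.feed(PAGE)
print(finder.suspects)   # ['http://tracker.example/bug.gif']
```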

Privacy Concerns

From a privacy point of view, the main concern about cookies and bugs is that few Internet users understand exactly what they are, and most certainly do not consent to their presence on their computers. Further, most users are not properly informed of the use to which the data is to be put.

Cookies and bugs raise significant human rights/privacy issues if they store information about a user's private conduct. Commercial issues come to mind. However, there are other, more fundamental, issues. By way of example, leaving aside plainly illegal conduct such as accessing child pornography, should an Internet user be entitled to access, say, lawful pornography using his/her own computer terminal at home? And should each step be recorded by an anonymous cookie set up to recognise that person's "viewing" interests, but without informed consent? Finally, should that person then be subjected to unsolicited and mostly unwanted banner advertisements which appear on their monitor and are frustratingly hard to delete? Most observers would say no!

Assuming again that the conduct in accessing that material is legal, is there any distinction between the situation outlined above and that in Lawrence (supra)? Does either the state or a commercial enterprise have the right to interfere with, monitor and record that person's activities? I would suggest that in principle the answer is again no.

The threat to privacy has not gone unnoticed. In Private Word, Issue No. 48, April-June 2003, the Privacy Commissioner notes that the 1980 OECD Guidelines on Privacy may well be ineffective in dealing with spiders and crawlers, having been prepared before these devices were invented. It is also noted that such tools are capable of subjecting personal data to fresh surveillance against criteria different from those for which the data was originally collected, criteria which were possibly unknown or even non-existent at the time of collection.

In the same article, Justice Kirby is reported to have suggested that better encryption of personal data is necessary along with cross-checking measures, through some human agency rather than an automated process.
