Digital Wallets and Whistle Blowing



            Whistle blowing has many connotations and is often confusing to those who are faced with critical decisions.  Some may view a whistle blower as a public protector, while others may view him as a pest.  Whistle blowing can be defined as notification, outside the accepted channels, of immoral, illegal, or dangerous activities.[1]  Such an action typically arises from a conflict between social responsibility to the public and loyalty to an employer or client.  The information to be divulged should involve a significant moral issue and be previously unknown to those to whom it is reported.[2]  There can be repercussions for the whistle blower if these conditions are not met, and even when they are, loss of employment is always a possibility.  After discussing some of the background of digital wallets, we consider a scenario in which an employer is not directly involved, but in which the very act of going to the public has negative consequences as well as positive ones.  Possible alternatives are then analyzed according to the Software Engineering Code of Ethics.



The next big step in the e-commerce world is the common use of digital wallets.  The purpose of a digital wallet is to make online purchases simple.  A digital wallet stores information such as credit card numbers, billing information, shipping information, and other business transaction information.  This enables the user to check out online very quickly.  The digital wallet can be located on a user’s computer, PDA, or even a smart phone.  When users make a purchase, if the seller supports their version of the digital wallet, they can complete the transaction with ease and security.  The transaction is done without having to enter any detailed information.  For individuals who make many online transactions this is a very useful tool.  Besides credit and billing information, the digital wallet can also store other information such as phone numbers, addresses, and contact information (Iliumsoft).  This tool is just like an ordinary wallet, secure and private, except all of the information is stored on a computer.
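The kind of record a digital wallet keeps can be sketched as a simple data structure.  The field names and values below are purely illustrative, not any vendor's actual format:

```python
from dataclasses import dataclass, field

@dataclass
class WalletEntry:
    """One stored payment profile.  Field names are illustrative only."""
    card_number: str
    billing_address: str
    shipping_address: str
    phone: str

@dataclass
class DigitalWallet:
    """Holds the user's entries locally, like an ordinary wallet."""
    owner: str
    entries: list = field(default_factory=list)

    def add(self, entry: WalletEntry) -> None:
        self.entries.append(entry)

    def checkout_info(self) -> WalletEntry:
        # At checkout the wallet supplies the stored details so the
        # user does not have to re-enter them by hand.
        return self.entries[0]

wallet = DigitalWallet(owner="J. Harrington")
wallet.add(WalletEntry("4111 1111 1111 1111", "12 Elm St", "12 Elm St", "555-0100"))
```

At checkout, `wallet.checkout_info()` hands the stored profile to the merchant's form, which is what makes the one-click purchase possible, and also what makes the stored data so sensitive.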


There are several manufacturers of digital wallets, including Microsoft’s “Passport” and Ilium Software’s “eWallet”.  Passport, released in 1999, is a little different from other digital wallets.  Most digital wallets store the information on the user’s computer, just as a physical wallet holds its owner’s information.  Passport, however, keeps the information in a database on a Passport server.  According to Microsoft, the credit card information resides in a database that is not connected to the Internet and is encrypted with the Triple DES (Data Encryption Standard) algorithm.  Storing the information in a database that is not connected to the Internet gives Passport added security.


            Passport is used strictly for online transactions.  Users log into Passport via the Internet by supplying a username and password.  Microsoft’s website says, “Passport is the largest online authentication system in the world,” with more than 200 million user accounts worldwide (Microsoft).


Since a digital wallet stores very private information, security is a significant issue.  Microsoft’s Passport had a security hole that allowed people to hack into the Passport site and obtain sensitive information about a user, including credit card numbers, bank account information, home address, and phone number.  A hacker could gain access through a user’s Hotmail account using a special program.  By getting into the Hotmail account, the hacker could position himself between the person’s computer and the website to intercept the information.  When a user signs in to Passport, the user is given five minutes to make a transaction before having to log in again.  This opened a window for the hacker: if a user logged in and did not complete the transaction right away, or forgot to log out, the hacker could harvest sensitive information through the open account (Berger).
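The weakness of such a time-based window can be sketched in a few lines.  This is a simplified model, not Passport's actual session logic: validity depends only on the token and the elapsed time, so nothing distinguishes the legitimate user from an attacker who captures the token inside the window.

```python
SESSION_WINDOW = 5 * 60  # the five-minute window described above, in seconds

sessions = {}  # token -> login timestamp

def login(token: str, now: float) -> None:
    """Record the moment a session token was issued."""
    sessions[token] = now

def is_valid(token: str, now: float) -> bool:
    # Validity is decided from the token and elapsed time alone.
    # Nothing ties the session to the legitimate client, so anyone
    # holding the token inside the window passes this check.
    start = sessions.get(token)
    return start is not None and now - start < SESSION_WINDOW

t0 = 0.0
login("alice-token", t0)
assert is_valid("alice-token", t0 + 120)      # real user, two minutes in
assert is_valid("alice-token", t0 + 240)      # attacker with a stolen token: also accepted
assert not is_valid("alice-token", t0 + 400)  # window expired
```

Binding the session to something the attacker cannot replay, or ending it the moment the transaction completes, would close this particular window.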


A Scenario[3]

John Harrington is an average fellow.  Like many consumers, he likes to rent a video occasionally, play basketball on Saturday, and visit the park on Sunday.  John also happens to be a software engineer for a small software development company.  Lately, John has developed an interest in network programming.  His background is mostly in database and user interface programming, and he wanted to learn more.


So, John bought a highly recommended book about network programming and started reading in his free time.  One of the examples in the book showed a step-by-step method for building a packet sniffer that would enable him to see his network traffic.  John, excited at the new project, started following the example and implementing it as he went.  He completed the example and was pleased with himself because of the few problems he encountered during the exercise.  Then, being the curious type, John left the packet sniffer running on a spare computer hooked into his home network to see what data the program would collect.  John soon forgot about his packet sniffer’s running in the background. 
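The book's example is not reproduced here, but the core of any packet sniffer is parsing raw bytes into header fields.  A minimal sketch of that step for an IPv4 header follows; actually capturing packets requires opening a raw socket (which needs administrator privileges), so this sketch only decodes bytes that a capture would supply:

```python
import socket
import struct

def parse_ipv4_header(packet: bytes) -> dict:
    """Decode the fixed 20-byte IPv4 header at the start of `packet`.

    A real sniffer would read `packet` from a raw socket; here we
    only perform the decoding step on bytes already in hand.
    """
    (version_ihl, tos, total_len, ident, flags_frag,
     ttl, proto, checksum, src, dst) = struct.unpack("!BBHHHBBH4s4s", packet[:20])
    return {
        "version": version_ihl >> 4,            # high nibble: IP version
        "header_len": (version_ihl & 0x0F) * 4, # low nibble: header length in 32-bit words
        "total_len": total_len,
        "ttl": ttl,
        "protocol": proto,                      # 6 = TCP, 17 = UDP
        "src": socket.inet_ntoa(src),
        "dst": socket.inet_ntoa(dst),
    }

# A hand-crafted header: version 4, IHL 5, TTL 64, TCP, 10.0.0.1 -> 10.0.0.2
sample = struct.pack("!BBHHHBBH4s4s", 0x45, 0, 40, 1, 0, 64, 6, 0,
                     socket.inet_aton("10.0.0.1"), socket.inet_aton("10.0.0.2"))
fields = parse_ipv4_header(sample)
```

Run in a loop over a raw socket, a decoder like this lets every packet crossing the interface be logged, which is exactly what John's spare computer was quietly doing.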


Later that evening, John was completing an online purchase of the latest fad in portable data storage, RAM on a stick, facilitated by his digital wallet.  He then logged onto some Internet games for recreational distraction.  The next morning, John sat down at his spare computer and remembered that he had left the packet sniffer running.  He browsed through the recorded data and was surprised when he noticed the logs of his online transaction for the portable data storage device.  The digital wallet server had sent an unencrypted command to his digital wallet telling it to transmit its important customer information back to the server.  With his curiosity piqued, John wrote a program that would simulate a digital wallet server and send that command to his digital wallet.  To his dismay, the digital wallet transmitted the information without verifying the source of the command.
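The flaw John stumbled on, a client that obeys a sensitive command without checking who sent it, can be sketched as a pair of handlers.  The command name, data, and trusted-host value below are invented for illustration:

```python
SECRET = {"card_number": "4111 1111 1111 1111", "address": "12 Elm St"}

def handle_command(command, source):
    """Flawed handler: obeys the (invented) SEND_CUSTOMER_INFO command
    without verifying that `source` is the genuine wallet server."""
    if command == "SEND_CUSTOMER_INFO":
        return SECRET          # leaked to any caller at all
    return None

# John's simulated server: an arbitrary, unauthenticated source
leaked = handle_command("SEND_CUSTOMER_INFO", source="johns-test-box")

def handle_command_fixed(command, source, trusted=("wallet.example.com",)):
    """Safer variant: refuse commands from unverified origins first."""
    if source not in trusted:
        return None            # unknown origin: disclose nothing
    return handle_command(command, source)
```

The fix is a single authentication check before acting, which is part of what makes the company's "not worth addressing" response so striking.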


John collected some more data and contacted the company that created his digital wallet software with his findings.  After John voiced his concerns to a representative of the company, he was told, “We are aware of that capability, but it is minor and not worth addressing.”  John was understandably upset.  What the company considered minor could cost their users unimaginable amounts of money.  The representative assured him that such an occurrence was highly unlikely.  Flabbergasted, John hung up the phone and was unsure what to do.


Analysis of the issues

John tried to think about the issue in an organized way.  First, he considered whom this might affect.  He identified four basic stakeholders in the issue:  John himself, the company, the users of the digital wallet, and the general public.  John is also one of the users.  The most vulnerable of the stakeholders are the users, who are not aware of how vulnerable they are.  John has an entire range of possible actions, from ignoring the situation to complete public disclosure of the bug.  We will consider the following four alternatives within this range:

  1. Stay quiet and hope no one finds the problem,
  2. Inform an appropriate business agency of the problem,
  3. Go public without revealing the exact details of the problem, and
  4. Go public and reveal the exact details of the problem.


The first alternative is, in a sense, the status quo: do nothing more.  If John hadn’t been snooping around in the first place, there would be no issue.  John recalls that a friend had told him of some ethical guidelines that could be helpful in weighing the alternatives.  So he checked online and found the Software Engineering Code of Ethics.  Principles 1.02 and 1.05 call for weighing “the interests of the company with the public good” and cooperating “in efforts to address matters of grave public concern,” and John has already notified the company privately of the problem and its consequences.  By not going public, he would be protecting the company, and also protecting the public by not disclosing the flaw, at least unless someone else discovers the bug and violates the privacy of the users.  Perhaps the company will change its mind and address the bug before anything happens.  (But how could he know for sure?)  The problem with this alternative is that John has been told that the company is not interested in fixing the situation, and users’ privacy and economic safety are still at risk.  These issues are just too important to be ignored.  This means the company should be viewed as a violator of the Code of Ethics, and according to Principle 6.13, significant violations of the Code should be reported to the appropriate authorities.  Thus, the alternative of remaining quiet has to be rejected, and John feels he must become a whistle blower, but at what risk?


The second alternative is for John to inform an appropriate business agency, such as the Better Business Bureau.  By informing such an agency he would be trying to follow the Software Engineering Code of Ethics: Principle 1.04 states that one should tell the appropriate people about any potential danger posed by software.  He found a problem in the digital wallet and informed the company; since the company did not respond, another step is needed.  By going to a business agency he would be trying to help every party involved in the situation.  By not going public with the issue, he would not expose the vulnerability of the system to everyone, and he would also help the company by not exerting public pressure to fix the bug.  Having identified and defined the problem, he could tell the agency about the bug so it could investigate, and the agency would then be aware of the potential for similar problems in related products.  This again helps the public by encouraging stricter standards for new software.  By informing a business agency about the problem, John still retains his option to go public in case the agency does nothing about the problem.  This option seems like a good compromise, but is it the best alternative?  John continued considering the other possibilities.


A third alternative is for John to go public with the fact that a problem exists without revealing its details.  Following this course, the exact vulnerabilities of the system would not be made public (and thus easily exploitable), while the users would still be alerted that the software they use is not totally secure.  The users could then judiciously restrict their use of the software, and the public could put pressure on the company to fix the problem.  However, because of John’s reluctance to reveal the exact details of the problem, the public might not take his alert seriously, and he could be accused of having ulterior motives.  Also, even though the exact details would not be released, the system would be known to have vulnerabilities, and inevitably some hackers would try to crack it.  Finally, after all is finished, the company still might not fix the problem, dismissing John’s claims again.


This action upholds the two clauses of the Code that concern alerting “appropriate persons or authorities” to software with dangerous effects (1.04, 6.13), provided the public counts as appropriate persons.  Considering that John first tried contacting the company, as called for by 6.12, the public is a fair candidate because contacting the company proved unproductive.


A fourth alternative is for John to go public and completely reveal the details of the problem.  With this information, hackers would be able to commandeer the digital wallet system, which would force the company to fix the problem or lose its entire customer base.  If the company failed to respond quickly, however, users could lose large amounts of money and have their privacy violated.  If users were aware of the problem they could stop using the digital wallet software, although some merchants would still lose money.  Finally, as a personal consequence, John’s reputation as a software engineer could be damaged, as some professionals might see his actions as too extreme.  While John would seem to be upholding 1.04 and 6.13 as mentioned above, he could be doing more harm than good for the users by revealing all the details.


After weighing the pros and cons of these four alternatives, we think that John should choose the second alternative and notify an appropriate business agency.  The problem is too important to walk away from, and going to the public at this point opens up new problems.  By choosing this action, John will bring the leverage of the agency to bear on the company while insulating himself from the repercussions associated with the act of whistle blowing.  It will also put the users in the least danger in terms of financial and privacy loss.[4]  If nothing happens through the agency, John still retains the option to go public.  That threat may be useful in pressuring the company to fix the bug.



            Whistle blowing is a tricky endeavor.  Usually the whistle blower has good intentions, but sometimes more harm than good can come from the act.  In this paper, we portrayed the case of a software engineer faced with such a decision.  Either his action or his inaction could cause significant financial damage to the users (including businesses) of a software package.  In trying to determine the best course of action, we analyzed four alternatives spaced along the spectrum from inaction to full disclosure.  While no one general solution exists for all cases of whistle blowing, in our specific case the best course seemed to be the one that was a compromise between the two extremes and that insulated both the public and the individual from the consequences of whistle blowing.



Berger, Matt.  “Hole Found in Microsoft’s Passport Wallet.”  InfoWorld, November 4, 2001.


Gotterbarn & Deimel.  “SDI”


Ilium Software: eWallet.


Microsoft .Net My Services.  “.Net Passport Overview.”  March 20, 2002.


The Software Engineering Code of Ethics and Professional Practice (5.2).



Copyright 2002 D. Atkins, D. Fetters, and D. Hulse. This case may be published without permission and at no cost as long as it carries the copyright notice.

[1] Adapted from Gotterbarn & Deimel

[2] Ibid

[3] Based on an actual case that involved the previously mentioned security hole in Microsoft’s Passport.

[4]  This is consistent with both the utilitarian and deontological philosophical viewpoints.