Surfing the internet, I came across the site goole.com. Click it at your own risk. It raises an ethical question: should websites that serve results to a wide audience, and can therefore expect an occasional typo from every user, own certain nearby domains? How far should they go?
Please consider the golden rule here. The majority of spelling errors occur within an edit distance of two. Peter Norvig found that as many as 98.9 percent of errors fall within that range, with roughly eighty percent at a distance of one. A quick review of Levenshtein distance in code can be found here; it is the minimum number of insertions, deletions, and substitutions needed to turn one string into another (the Damerau variant also counts transpositions). Apache also produces a version.
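As a quick sketch (not the implementation linked above), the distance can be computed with the classic dynamic-programming recurrence:

```python
def levenshtein(a: str, b: str) -> int:
    """Minimum number of insertions, deletions, and substitutions
    needed to turn string a into string b."""
    # prev[j] holds the distance between the first i-1 chars of a
    # and the first j chars of b (the previous row of the DP table).
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        curr = [i]
        for j, cb in enumerate(b, start=1):
            curr.append(min(
                prev[j] + 1,               # delete ca from a
                curr[j - 1] + 1,           # insert cb into a
                prev[j - 1] + (ca != cb),  # substitute (free if equal)
            ))
        prev = curr
    return prev[-1]

print(levenshtein("google", "goole"))  # 1: a single deletion
```

By this measure goole, gogle, and googls all sit at distance 1 from google, which is exactly the neighborhood the statistics above describe.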
Let’s consider Google, the obvious target of the link above. If someone were looking for a handout and became angry, they could easily try to turn the site into a virus-ridden hell-fest for any unsuspecting victim. People make mistakes. Therefore, it is a decent proposal to at least try to protect the user by owning some of these domains.
Ownership has quite a few pros, some of which are more commercial than ethical:
- Attains credibility by attempting to protect users
- Acknowledges human fallibility
- Alerts ill-doers that the company takes a stance against ill will
- Protects against likeness and image issues
The issue with ownership is that going so far may create an expectation of going even further. If a company such as Google purchased Goole, would it then need to purchase Gogle and Googls at a Levenshtein distance of 1? What about goilele? Perhaps the user then fails to take matters into their own hands and correct their mistakes. Even worse, what if expectations of a payout follow, and failure to pay creates more virus-ridden hell-fests? If this were the case, the acknowledgement may even negate some of the pros.
- Generating expectations of protection, creating complacency
- Creating an incentive to use such sites for ill will
- Generating an opportunity to achieve a payout without effort
- Cost (especially given the sheer number of possible domain names)
- Errors may not matter for a small, specialized, or obscure website
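To put the cost objection in perspective, here is a rough sketch in the spirit of Norvig's edits1 function, omitting transpositions and restricted to lowercase ASCII (a simplification; real domain labels also allow digits and hyphens), that counts the distinct strings one edit away from a name:

```python
import string

def one_edit_variants(word: str) -> set[str]:
    """All distinct strings at Levenshtein distance exactly 1 from word,
    over the lowercase ASCII alphabet only (a simplification)."""
    letters = string.ascii_lowercase
    splits = [(word[:i], word[i:]) for i in range(len(word) + 1)]
    deletes = {l + r[1:] for l, r in splits if r}
    inserts = {l + c + r for l, r in splits for c in letters}
    replaces = {l + c + r[1:] for l, r in splits if r for c in letters}
    # Drop the word itself (replacing a character with itself is no edit).
    return (deletes | inserts | replaces) - {word}

print(len(one_edit_variants("google")))
```

Even under these simplifying assumptions, a six-letter name has hundreds of single-edit neighbors, and the distance-2 set grows multiplicatively from there, which is the crux of the cost objection.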
The pros and cons are actually more numerous than those mentioned, but the case is still interesting. There should be some attempt to protect a site. Not everyone will hit the target page every time; errors are human. Since most errors fall within two Levenshtein distance units, obtaining the domains within that range most likely to catch user mistakes seems appropriate. Goole may be so close to Google that it would make a solid purchase. This weighs the need to protect oneself against errors, but why not show an alert or post a blank page instead of a redirect? This approach avoids extreme costs by considering appropriateness, acknowledges human error, and meets most of the criteria laid out here.