Digital Choke Daynotes

Daynotes: a daily journal of our activity

Send us email

Digital Choke: an action that is sometimes needed for your computer; also a short techno-story available here.

"Daynotes" were popularized by an Internet Web site called the "Daynotes Gang", a collection of the daily technical and personal observations of the famous and others. That group started on September 29, 1999, and has grown to an interesting collection of individuals. Readers are invited and encouraged to visit those sites for other interesting daily journals. You can send your comments to us by clicking on any mailbox icon.


Anti-Spam Server

What I Did On My Summer Vacation (July 2003)


Email – Not Absolutely, Positively Delivered
Rick Hellewell, July 30, 2003

While email has become an essential part of business and personal communication, relying on it for guaranteed delivery has become more difficult. Messages may go undelivered for many different reasons. Email users (senders and receivers alike) may need to shift away from the assumption that an email message will always be delivered.

The chief reason for this shift in email delivery reality is the increased amount of "spam", or unsolicited email. The increase in the amount of spam arriving on the corporate (or personal) email doorstep has resulted in the increased use of some sort of email filtering process. Businesses (and the individual user is included in this discussion as a "business") are using email-filtering software to reduce the amount of spam, which has been estimated to be 30-50% of all email processed by a business. (Similar amounts of spam are received by individuals; many have their own personal email filtering process.)

The use of email filtering software is not foolproof. Content blocking rules may catch a message that is not what it seems. Consider the example of a message containing common words such as "adult" or "exhibition". While these words might be common in an adult-content message, and therefore should be blocked in a business environment, the same words could also occur in valid email. Filtering software attempts to handle this situation by using dictionary-based analysis. One dictionary category might contain "adult" words, another "shopping" words, and another "weapons" words. Each word in a dictionary category has a score of 0 to 100. The aggregate score of all message words that were found in a dictionary is used to determine whether the message belongs to that category. So a rule based on dictionary analysis might specify that a score of over 300 for the words found in a shopping dictionary would indicate that the message is shopping/marketing spam, and should be blocked.
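The scoring process described above can be sketched in a few lines of code. This is only an illustration of the general technique, not any particular product's implementation; the dictionaries, per-word scores, and the 300-point threshold below are all invented for the example.

```python
# Illustrative sketch of dictionary-based spam scoring.
# The dictionaries, word scores, and threshold are invented examples.
import re

# Each category maps words to a score from 0 to 100.
DICTIONARIES = {
    "shopping": {"order": 40, "discount": 60, "offer": 50, "free": 70},
    "adult":    {"adult": 80, "exhibition": 30},
}

def category_score(message, dictionary):
    """Sum the scores of all dictionary words found in the message."""
    words = re.findall(r"[a-z]+", message.lower())
    return sum(dictionary.get(word, 0) for word in words)

def classify(message, threshold=300):
    """Return the categories whose aggregate score exceeds the threshold."""
    return [name for name, words in DICTIONARIES.items()
            if category_score(message, words) > threshold]

msg = "Special offer! Free discount on your order. Free offer, order now."
print(classify(msg))  # the repeated shopping words push the score over 300
```

Note that an order-confirmation message full of words like "order" and "discount" would score just as highly as a marketing blast, which is exactly how the false positives discussed below occur.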

In many cases, these types of rules function properly to catch that type of shopping/marketing spam message. But there will be occasions when a valid message (such as a verification of an Internet order) is caught by that type of rule. Such a message may contain many shopping-type words, either due to the nature of the message, or the additional "canned" marketing text that is commonly added to a purchase verification message. When a valid message is caught this way, it is called a "false positive".

An analogy: some people think catalogs received via postal mail are great, while others detest them. How is the postal service to judge whether a catalog should be delivered? Its users have decided that it is important to block catalogs with offensive (adult) content, so some blocking process needs to be in place. The postal service implements a catalog analysis rule that states that catalogs containing offensive pictures are not allowed. This is a good rule, and it works until the ABC Tool Company decides to include a picture of a bikini-clad model next to a picture of their tools in the catalog. The rule finds that picture to be offensive, so the catalog is blocked. The shop foreman depends on that catalog, and it doesn't get delivered. That blocking rule just produced a false positive.

The challenge is to create email-filtering rules that will catch the obvious spam (and the types of mail that are not wanted), while still allowing valid messages to be delivered. The goal should be to reduce, as much as possible, the number of false positives in an email filtering solution. This can be quite difficult. If a rule is too strict, there will be more false positives. If a rule is too lenient, offensive or unwanted mail will be delivered.

It would seem inevitable that there needs to be a fundamental shift in users' perception and use of email. The increased use of email filtering by businesses and individuals may mean that users cannot assume that every single email message is going to be delivered. If a person is going to send out important information via email, they may also need to ensure that the message was actually received by the intended recipient. This may require sending another short email (such as "I sent you the information you requested about our current project. Please let me know if you didn't receive it."). A follow-up phone call may even be required to ensure that important, valid messages are delivered.

Users may need to be more aware of the content of their messages, and the possibility that innocuous content may be caught as a false positive. For instance, it is common in draft documents to put a marker such as "xxx" in the place of words or phrases (such as "the project will be managed by xxx"). If there are enough occurrences of "xxx" in the message, it may cause the message's aggregate score to be over the limit of an email-filtering rule. (The "xxx" word is commonly found in many adult or pornographic messages.) The sender may need to remember to use a different value, such as "zzz".
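To make the "xxx" scenario concrete, here is a small sketch of how repeated draft markers could trip a scoring rule. The per-occurrence score of 80 and the limit of 300 are invented for illustration; real filters use their own dictionaries and thresholds.

```python
# Illustrative sketch: repeated "xxx" draft markers can trip a filter.
# The per-occurrence score (80) and the rule limit (300) are invented.
import re

def placeholder_score(message, marker="xxx", per_hit=80):
    """Score a message by counting occurrences of a flagged word."""
    return per_hit * len(re.findall(rf"\b{marker}\b", message.lower()))

draft = ("The project will be managed by xxx. Status reports go to xxx, "
         "and xxx will approve the budget. Contact xxx with questions.")

score = placeholder_score(draft)
print(score, score > 300)  # four occurrences score 320, over the 300 limit
```

Swapping the marker for "zzz", as the article suggests, would leave this particular rule's score at zero.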

This adjustment will be difficult for many users. People are used to assuming that their email was delivered and read. Many people understand the need for content filtering of messages; they are tired of all the junk mail they get. But they also want their message to be delivered, no matter what.

The problem of spam, including offensive mail, is large, and it will be difficult to solve completely. The solution may require a combination of technical means ("do not mail" lists, a technical redesign of the email infrastructure) along with legislation and enforcement. But users need to adjust to the reality that email delivery is not guaranteed. Email needs to be carefully written to take into account possible (or even probable) email filtering. And users need to understand that some email might be caught as a false positive, even though the email filtering administrator's best efforts will keep false positives to an absolute minimum.

Copyright (c) 2000-2003    Two Bridges Group,   All Rights Reserved