This is a note in progress on sensors and processing for airport security. I will be working on it for several days or weeks. Expert advice on queuing theory and other topics would obviously improve it.

Subtle and rare threats arise in the great society. A bridge modulated at the right frequency may collapse. Software may not recognize the year 2000. A pungent mold may grow in a new building. An obscure sect in a remote corner of the world may be offended by our tolerance of homosexuals. (Do we just tolerate homosexuals or do we revel in our gay friends? Some of them are a laugh riot. Others are moody. Some have fashion sense. Some have none. They are surprisingly like us, mostly joyful. We adore them as we adore all constituents of the great society. This does seem to offend in some quarters, even within the great society.)

The stability of the great society requires us to anticipate, detect and mitigate rare threats. There were nearly 700 million passenger emplanements in the United States in 2000. None of these passengers were successful terrorists.

Let us approximate the probability that a random individual drawn from an airport in the United States is a terrorist as one in a billion. How should we discover this one in a billion event? How should we balance the cost of failure to find the terrorist against the opportunity cost of time and equipment wasted in looking? If we do not look, airborne terror is certain. If we look so hard that we cannot fly, the connections between cities and communities in our great society are fractured. If we delay 2 million passengers a day by one hour, we are wasting 228 person-years per day, 4 lifetimes a day, over a thousand lifetimes per year.
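
A back-of-the-envelope check of this arithmetic (the roughly 57-year lifespan implicit in "4 lifetimes a day" is an assumption):

    # Cost of delaying 2 million passengers per day by one hour each.
    passengers_per_day = 2_000_000
    delay_hours = 1.0
    hours_per_year = 24 * 365                              # 8760

    person_years_per_day = passengers_per_day * delay_hours / hours_per_year
    lifetimes_per_day = person_years_per_day / 57          # assumed 57-year span
    lifetimes_per_year = lifetimes_per_day * 365

    print(f"{person_years_per_day:.0f} person-years per day")   # ~228
    print(f"{lifetimes_per_day:.1f} lifetimes per day")         # ~4.0
    print(f"{lifetimes_per_year:.0f} lifetimes per year")       # ~1462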

Why is it necessary to delay passengers for two hours for a five-minute security check? Two justifications suggest themselves:

  1. Queue processing capacity cannot keep up with passenger arrivals, and
  2. The behavior of the passengers during the two-hour delay is part of the test.

With regard to queuing, making passengers wait does not improve queue throughput. So long as the queue stays full the rate of passenger screening is constant. By adjusting the screening staff to the queue length, airport operations ensures that the airport's cost per screen is constant. Staffing generous enough to eliminate the line raises the possibility that the operating cost per screen becomes unacceptable.
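
A sketch of this tradeoff in the textbook M/M/c queue (the Erlang C formula). The arrival and service rates below are illustrative assumptions, not measurements; the point is that throughput is capped by staff times service rate no matter how long the line is, and that a line short enough to vanish means paying for idle screeners.

    from math import factorial

    def erlang_c(arrival_rate, service_rate, servers):
        """Probability an arriving passenger must wait (M/M/c queue)."""
        a = arrival_rate / service_rate                  # offered load in Erlangs
        if a >= servers:
            return 1.0                                   # unstable: queue grows without bound
        tail = (a ** servers / factorial(servers)) * (servers / (servers - a))
        return tail / (sum(a ** k / factorial(k) for k in range(servers)) + tail)

    def mean_wait(arrival_rate, service_rate, servers):
        """Expected minutes in line before screening begins."""
        p_wait = erlang_c(arrival_rate, service_rate, servers)
        return p_wait / (servers * service_rate - arrival_rate)

    lam = 10.0       # passengers arriving per minute (assumed)
    mu = 1 / 5.0     # one screener processes a passenger every 5 minutes (assumed)
    for c in range(51, 61):
        print(f"{c} screeners: {mean_wait(lam, mu, c):6.2f} minute mean wait")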

Queue level cost control is used most effectively in fast food restaurants. A fast food worker may earn one quarter of the median national hourly wage, but if the worker stands idle waiting for customers then the idle time becomes a multiplicative factor in the labor cost per order. Ironically, for typical fast food customers the wait time may be more expensive than the food itself. This inefficiency is unfortunate for the restaurant, which ideally would capture the full cost to the customer as income.
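
An illustrative comparison, with every number invented for the sake of the example:

    # Who pays for an idle-free queue? (All figures assumed.)
    wage = 5.15              # dollars per hour paid to the worker
    service_min = 2.0        # minutes of labor per order
    customer_rate = 20.0     # dollars per hour a customer's time is worth
    wait_min = 10.0          # minutes the customer stands in line

    labor_cost = wage * service_min / 60
    wait_cost = customer_rate * wait_min / 60
    print(f"labor per order ${labor_cost:.2f}, customer wait ${wait_cost:.2f}")
    # The $3.33 wait can cost more than the meal; the restaurant captures none of it.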

As a result of recent events, the ticket cost of air travel is now much less than the opportunity cost to travelers of traveling. In embarking on a trip a traveler calculates the cost of transportation, the opportunity cost of the time in transit and the expected rewards of making the trip. The traveler makes the trip if this calculation is sufficiently positive. An airline increases revenues by raising the ticket cost, but must be careful to keep the traveler's net value of the trip positive.
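
A minimal sketch of the traveler's calculus, again with invented numbers:

    def net_trip_value(reward, ticket, transit_hours, value_of_time):
        """The traveler flies only if this is positive (all inputs assumed)."""
        return reward - ticket - transit_hours * value_of_time

    # A $300 ticket, 6 hours door to door, time worth $50/hour, trip worth $700:
    print(net_trip_value(700, 300, 6, 50))   # 100 > 0: the traveler flies
    # Add a two-hour security delay and the margin vanishes:
    print(net_trip_value(700, 300, 8, 50))   # 0: the marginal traveler stays home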

The overhead cost of preparing for work and getting to the airport and the opportunity cost of part-time work mean that security personnel levels cannot be rapidly adjusted to queue status. Thus the queue tends to grow until passengers decide not to travel. The queue eventually falls as the day ends. Since unused capacity in quiet travel times does not help with the lost revenue from the busy times, capacity is reduced.
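
A toy simulation of this dynamic under an assumed arrival profile, a fixed screening capacity, and a balking threshold that stands in for "passengers decide not to travel":

    # All numbers invented; the shape, not the scale, is the point.
    capacity_per_hour = 900      # fixed staff, cannot flex with the queue
    balk_threshold = 1500        # line length at which passengers give up

    arrivals = [200, 400, 1200, 1500, 1000, 600, 400, 400, 500,
                600, 900, 1300, 1400, 900, 500, 300, 200, 100]

    queue, lost = 0, 0
    for hour, a in enumerate(arrivals):
        queue += a
        if queue > balk_threshold:
            lost += queue - balk_threshold   # demand destroyed, not served
            queue = balk_threshold
        queue = max(0, queue - capacity_per_hour)
        print(f"hour {hour:2d}: queue {queue:4d}, lost so far {lost}")

Total daily capacity exceeds total daily demand, yet the peaks still destroy trips: the queue builds through each rush and drains only when the day ends.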

The second reason to keep passengers around for a while is to observe them for a longer time. This may not be a bad idea. The basic problem with airport security is the assumption that a firewall is possible. The test at the checkpoint is wildly ineffective in reading the human heart, is easily subverted under the stress of the queue and creates enormous economic inefficiencies.

A terrorist looks like this:

A nonterrorist looks like this:

Suppose we are given a stack of a billion lottery tickets and are told that exactly one of them is a winner. What are the odds that we can find the winning ticket? What search strategy should we follow?

The tickets are not ordered; there is no a priori predictor of which ticket will be the winner. Do we have any alternative to a sequential scan of all the tickets?

Suppose that we have one test that is absolutely accurate. Let N be the number of tickets and assume that all tickets are equally likely to be winners. The expected time to detection is (N+1)T/2, where T is the time required per test. The standard deviation is T*Sqrt((N^2-1)/12), which is of order N. Basically one expects to search a large fraction of the tickets.
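
A quick Monte Carlo check of these formulas (a smaller N keeps the run short):

    import random

    N, T, trials = 10_000, 1.0, 20_000
    times = [random.randint(1, N) * T for _ in range(trials)]   # tests to reach a uniform winner

    mean = sum(times) / trials
    std = (sum((x - mean) ** 2 for x in times) / trials) ** 0.5
    print(f"mean {mean:.0f} vs (N+1)T/2 = {(N + 1) * T / 2:.1f}")
    print(f"std  {std:.0f} vs T*sqrt((N^2-1)/12) = {T * ((N**2 - 1) / 12) ** 0.5:.1f}")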

Suppose that we have two tests. The first test takes time t1 and has probability Pp1 of falsely declaring a losing ticket a winner and probability Pn1 of falsely declaring a winning ticket a loser. All declared winners are examined again with an absolutely accurate second test taking time t2. With probability Ps, tickets initially declared losers are also examined with the second test. The probability that the winning ticket is not discovered is then independent of the number of tickets tested and is Pn1*(1-Ps). This is the probability of failure. The time required to search all the tickets is N*(t1+Pp1*t2+(1-Pp1)*Ps*t2). The probability of error drops linearly in Ps and the search time grows linearly in Ps. For Ps=1 every ticket receives the accurate test and the search time is essentially that of the one-test case, plus the overhead of the first test. If we require that the probability of failure be much lower than Pn1, Ps must approach 1 and the search time approaches the one-test case. Independent of all other factors, the overall testing time remains linear in the number of tickets.
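
A simulation sketch of the cascade, with assumed values for t1, t2, Pp1, Pn1 and Ps; it should reproduce the failure probability Pn1*(1-Ps) and the search time N*(t1+Pp1*t2+(1-Pp1)*Ps*t2):

    import random

    def screen_all(n, t1, t2, pp1, pn1, ps, winner):
        """One pass of the two-test cascade; returns (found, total_time)."""
        total, found = 0.0, False
        for ticket in range(n):
            total += t1
            is_winner = ticket == winner
            # Test 1 errs with probability pn1 on the winner, pp1 on losers.
            declared = random.random() > pn1 if is_winner else random.random() < pp1
            # Declared winners always get test 2; declared losers with probability ps.
            if declared or random.random() < ps:
                total += t2
                if is_winner:
                    found = True                 # test 2 is absolutely accurate
        return found, total

    n, t1, t2, pp1, pn1, ps = 10_000, 0.01, 1.0, 0.02, 0.10, 0.5
    trials = 500
    misses = sum(not screen_all(n, t1, t2, pp1, pn1, ps, random.randrange(n))[0]
                 for _ in range(trials))
    print(f"miss rate {misses / trials:.3f} vs Pn1*(1-Ps) = {pn1 * (1 - ps):.3f}")
    print(f"predicted search time {n * (t1 + pp1 * t2 + (1 - pp1) * ps * t2):,.0f}")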

Our search for terror is more complex. While we know that the probability is low, we do not know that terror is certain. The pool of potential terrorists is a steady, ergodic stream; troublemakers arrive roughly as a Poisson process, although bunching may occur. We measure the rate at which we process samples rather than the time to process a fixed sample set. As before, both the error rate and the testing rate are linear in Ps, the probability of retesting negative results. No speedup seems possible.
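
The same tradeoff expressed as rates, with the same assumed parameters:

    # Screening as a rate problem: samples per minute versus misses per minute.
    t1, t2 = 0.01, 1.0            # minutes per test (assumed)
    pp1, pn1 = 0.02, 0.10         # stage-one error probabilities (assumed)
    trouble_fraction = 1e-9       # fraction of the stream that is trouble

    for ps in (0.0, 0.25, 0.5, 0.75, 1.0):
        time_per_sample = t1 + (pp1 + (1 - pp1) * ps) * t2
        rate = 1 / time_per_sample                       # samples per minute
        miss_fraction = pn1 * (1 - ps)                   # of troublemakers missed
        misses_per_min = rate * trouble_fraction * miss_fraction
        print(f"Ps={ps:.2f}: {rate:6.1f} samples/min, "
              f"miss fraction {miss_fraction:.3f}, misses/min {misses_per_min:.2e}")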

Quantum processors promise improved search efficiency (Grover's algorithm finds the winner among N tickets in on the order of Sqrt(N) steps), but exotic hardware is not required here. Since all tests are assumed independent, the search process is embarrassingly parallelizable. Ideal parallelization involves distributing inexpensive resources to reduce the cost of screening.
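
A sketch of the embarrassingly parallel version with a process pool; the predicate looks_suspicious below is a placeholder, not a real screening test:

    from concurrent.futures import ProcessPoolExecutor

    def looks_suspicious(ticket):
        return ticket % 1_000_003 == 0           # placeholder stage-one test

    def screen_chunk(chunk):
        """Stage-one screening of an independent block of tickets."""
        return [t for t in chunk if looks_suspicious(t)]

    if __name__ == "__main__":
        n, step = 10_000_000, 1_000_000
        chunks = [range(i, min(i + step, n)) for i in range(0, n, step)]
        # Independent tests: wall time scales as N / (number of workers).
        with ProcessPoolExecutor() as pool:
            flagged = [t for hits in pool.map(screen_chunk, chunks) for t in hits]
        print(f"{len(flagged)} tickets flagged for the accurate second test")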

The ideal system must use the natural resources of the source space to achieve parallelism. One does not seek to find the winning ticket so much as one seeks to cause the winning ticket to shout "here I am." Can we create an environment in which this occurs?

If we are seeking a one in a billion event it might help to establish a baseline of normality. Then the one in a billion event is the event that breaks this baseline. If we record which doors tend to open when, the trajectories that people and groups tend to take through spaces, the likelihood of a car being lost versus being a security probe, and so on, would the unusual event tend to stand out early? To test these theories two things are needed: flexible sensors that record data to be correlated and correlation tools that find unusual system states. We seek the measure space in which the unusual activity is obvious.
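
As a toy version of such a correlation tool, one might fit a per-door, per-hour baseline and score new observations by their surprise (negative log likelihood) under a Poisson model. Everything below is an illustrative sketch, not a fielded system:

    import math
    from collections import defaultdict

    class DoorBaseline:
        """Learn how often each door opens at each hour; flag surprises."""

        def __init__(self):
            self.counts = defaultdict(int)    # (door, hour) -> openings seen
            self.days = 0

        def observe_day(self, day_events):
            """day_events: iterable of (door, hour) openings for one day."""
            for door, hour in day_events:
                self.counts[(door, hour)] += 1
            self.days += 1

        def surprise(self, door, hour, openings):
            """-log P(openings) under a Poisson fit; large means anomalous."""
            lam = (self.counts[(door, hour)] + 0.5) / self.days   # smoothed rate
            return lam - openings * math.log(lam) + math.lgamma(openings + 1)

    b = DoorBaseline()
    for _ in range(100):                              # 100 ordinary days
        b.observe_day([("loading_dock", 9)] * 3)      # opens ~3 times at 9 a.m.
    print(b.surprise("loading_dock", 9, 3))           # routine: ~1.5
    print(b.surprise("loading_dock", 3, 3))           # 3 a.m.: ~17.7, stands out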

Does a suicide bomber behave in unusual ways as he moves from checkpoint to target? Is his gait measured? Is he unusually quiet? Does he touch himself more often, patting his tools? Does he talk to fewer people? Does he sweat a lot? These are measurable quantities. They can even be measured in ways that do not involve image analysis and that are not immediately tagged to the individual (the lack of visual specificity may be important for privacy concerns). We could monitor these behavioral cues in airports. If we made airports pleasant work and shopping spaces, we could even monitor these factors while delaying passengers. In this way, the cost to passengers might be reduced by making wait time productive, and the value returned to the airlines might be more reflective of the real cost to passengers.
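
These weak cues matter only in combination. A naive-Bayes sketch, with entirely assumed likelihood ratios, shows how far independent cues can move a one-in-a-billion prior:

    import math

    prior = 1e-9
    cues = {                        # P(cue | threat) / P(cue | normal), assumed
        "measured gait": 3.0,
        "unusually quiet": 2.0,
        "frequent self-touching": 4.0,
        "talks to no one": 2.5,
        "heavy sweating": 3.0,
    }

    log_odds = math.log(prior / (1 - prior))
    for name, lr in cues.items():
        log_odds += math.log(lr)    # independent cues add in log-odds space
        posterior = 1 / (1 + math.exp(-log_odds))
        print(f"after '{name}': posterior {posterior:.2e}")

Five such cues raise the posterior only to about 2e-7: nowhere near proof, but perhaps enough to route a passenger to the slow, accurate second test while everyone else walks through.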