Description: |
Classical mechanics seems to deal with limitless precision. For example, a particle may be located precisely at x at time t; the widths dx and dt associated with these values are taken to approach 0. We argue that this view is carried over into probabilistic scenarios, such as a photon moving in the x direction which reflects or refracts. Classically there is limitless precision, so the photon is an incident photon at one precise x and then, instantaneously, is either a reflected photon (presumably at the same x) or a refracted one. This leads to the equation 1 = probability to reflect + probability to refract (1), which holds for a single photon and may also be written in terms of fluxes (which are average values). For this equation to make sense, one requires limitless precision, so that the states (incident, reflected, refracted) are precise and exist at precise times. We argue (as we have before) that this is the inherent reason why (1) alone is not sufficient to solve for the relative probabilities of reflection and refraction.

We suggest that the interaction at an n1-n2 interface (indices of refraction) is complicated: there is no sharp precision in x and t, nor even a clear picture of what is instantaneously occurring. As a result, we suggest using averages to describe the situation, with p and E being average values during the interaction. Nonrelativistically p = m₀v, and given the quantum form exp(-iEt + ipx) one cannot have a precise velocity; we suggest that v is an average and hence that p is an average. Thus impulses, like speeds, may be described in terms of averages. This is why Snell's law may be obtained using only these average values, which in classical mechanics are taken to be precise deterministic values. The blurriness of the interaction suggests there is no instantaneous division between the incident photon and the reflected/refracted photon, or even between the reflected and the refracted photon. As a result, the scenario is blurry and must be described probabilistically. One may note that during the interaction p and E could be changing, so these are treated as average values. Thus the idea is to use average values of p and E when describing complicated interactions, but to describe x and t through probabilities governed by these average values.

The question then becomes: how does one find these probabilities? We argue (as before) that a general approach is to maximize Shannon's entropy (subject to periodic constraints) to obtain exp(-iEt) and exp(ipx). E and p are the Lagrange multipliers and so, as in classical statistical theory, are averages. This equilibrium approach, with its blurriness in x and t but average values of p and E, allows one to describe a complicated interaction through changes in the average values (e.g. the magnitude or direction of the p vector), together with the addition of exp(ipx) terms for what would normally be deterministic paths.
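A minimal sketch of the entropy-maximization step, written here with a generic constraint assumed for illustration (the specific periodic constraints are those of the earlier papers): maximizing Shannon's entropy of a density in x subject to normalization and one fixed average yields an exponential whose Lagrange multiplier is set by that average,

\[
  S[\rho] = -\int \rho(x)\,\ln\rho(x)\,dx ,
  \qquad
  \int \rho(x)\,dx = 1 ,
  \qquad
  \int \rho(x)\,f(x)\,dx = \bar{f} ,
\]
\[
  \frac{\partial}{\partial \rho}\Big[ -\rho\ln\rho - \alpha\,\rho - \lambda\,\rho\,f(x) \Big] = 0
  \;\Longrightarrow\;
  \rho(x) \propto e^{-\lambda f(x)} .
\]

Here \alpha and \lambda are the multipliers for normalization and for the constraint and, as in classical statistical mechanics, they are fixed by average values. The passage from this real exponential to the oscillatory exp(ipx), with p the multiplier of a periodic constraint (and exp(-iEt) analogously in t with E), follows the earlier papers and is not rederived here.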
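As an illustration of the claim that Snell's law needs only average values, one standard route (assumed here for illustration; it need not be the exact derivation of the earlier work) takes the in-medium photon momentum magnitude to be p = n\hbar\omega/c and conserves the average momentum component parallel to the interface:

\[
  p_1 \sin\theta_1 = p_2 \sin\theta_2 ,
  \qquad
  p_i = \frac{n_i\,\hbar\omega}{c}
  \;\Longrightarrow\;
  n_1 \sin\theta_1 = n_2 \sin\theta_2 .
\]

Only the average values p_1, p_2 and the geometry enter; nothing about the instantaneous state of the photon during the interaction is required.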