Setterfield again

Issues related to how the world came about can be discussed here.

Moderator: webmaster

tuppence
Moderators
Posts: 1017
Joined: Thu Aug 19, 2004 03:12 pm

Setterfield again

Postby tuppence » Sun Jan 16, 2005 11:04 pm

The following post by Admiral Valdemar was in the 'science defined' thread. It is not part of that thread's topic and has therefore been posted here with my response, starting a new thread. Folks, I am VERY serious about keeping on topic. Feel free to start a new topic if you have something to say which is not on the topic of the thread you are in, but as long as I am moderator, the threads are going to stick to their topics. Thanks. Here is this one, starting with Admiral Valdemar's:

Admiral Valdemar wrote:
tuppence wrote:It is very easy to show a speed of light change right where you are.

Put a straw in a glass of water. It appears that the straw has disconnected somewhat at the surface of the water when you look at it from the side. That is due to a change in light speed due to the medium it is passing through.


Which'd mean something if it wasn't for the fact that c is defined in a vacuum.

*Snip rest of stuff*


Wrong again, I'm afraid.

Ever heard of supernova SN 1987A? This quite nicely puts the age of the universe at no less than 170,000 years, thanks to the spectral signature of cobalt exhibited by that event. Had light been slowing down en route, then the decay rate would have been linearly proportional to c. But it isn't, because the decay rate is not linear.

I'm going to enjoy seeing how you weasel out of this, given the decay rates are consistent with those witnessed in the lab. I know, maybe God intervened to change this too!



Admiral, you are again showing your ignorance.

The vacuum of space is not a 'nothingness'. It has not been called a 'seething vacuum' for nothing.

The following is from one of Barry's papers that you have not bothered reading:

THE VACUUM

During the 20th century, our knowledge regarding space and the properties of the vacuum has taken a considerable leap forward. The vacuum is more unusual than many people realise. It is popularly considered to be a void, an emptiness, or just 'nothingness.' This is the definition of a bare vacuum [1]. However, as science has learned more about the properties of space, a new and contrasting description has arisen, which physicists call the physical vacuum [1].

To understand the difference between these two definitions, imagine you have a perfectly sealed container. First remove all solids and liquids from it, and then pump out all gases so no atoms or molecules remain. There is now a vacuum in the container. It was this concept in the 17th century that gave rise to the definition of a vacuum as a totally empty volume of space. It was later discovered that, although this vacuum would not transmit sound, it would transmit light and all other wavelengths of the electromagnetic spectrum. Starting from the high energy side, these wavelengths range from very short wavelength gamma rays, X-rays, and ultra-violet light, through the rainbow spectrum of visible light, to low energy longer wavelengths including infra-red light, microwaves and radio waves.

THE ENERGY IN THE VACUUM

Then, late in the 19th century, it was realised that the vacuum could still contain heat or thermal radiation. If our container with the vacuum is now perfectly insulated so no heat can get in or out, and if it is then cooled to absolute zero, all thermal radiation will have been removed. Does a complete vacuum now exist within the container? Surprisingly, this is not the case. Both theory and experiment show that this vacuum still contains measurable energy. This energy is called the zero-point energy (ZPE) because it exists even at absolute zero.

The ZPE was discovered to be a universal phenomenon, uniform and all-pervasive on a large scale; precisely because it is so uniform, its existence was not suspected until the early 20th century. In 1911, while working with a series of equations describing the behaviour of radiant energy from a hot body, Max Planck found that the observations required a term in his equations that did not depend on temperature. Other physicists, including Einstein, found similar terms appearing in their own equations. The implication was that, even at absolute zero, each body would retain some residual energy. Experimental evidence hinting at the existence of the ZPE soon built up, although its fluctuations do not become significant enough to be observed until the atomic scale is reached. For example [2], the ZPE can explain why cooling alone will never freeze liquid helium: unless pressure is applied, ZPE fluctuations prevent helium's atoms from getting close enough to permit solidification. In electronic circuits another problem surfaces, because ZPE fluctuations cause a random "noise" that places limits on the level to which signals can be amplified.

The magnitude of the ZPE is truly large. It is usually quoted in terms of energy per unit of volume, which is referred to as energy density. Well-known physicist Richard Feynman and others [3] have pointed out that the amount of ZPE in one cubic centimetre of the vacuum "is greater than the energy density in an atomic nucleus" [4]. Indeed, it has been stated that [5]: "Formally, physicists attribute an infinite amount of energy to this background. But, even when they impose appropriate cutoffs at high frequency, they estimate conservatively that the zero-point density is comparable to the energy density inside an atomic nucleus." In an atomic nucleus alone, the energy density is of the order of 10^44 ergs per cubic centimetre. (An erg is the work done when a force of one dyne, the force that gives a mass of 1 gram an acceleration of 1 centimetre per second per second, acts over a distance of 1 centimetre.)

Estimates of the energy density of the ZPE therefore range from at least 10^44 ergs per cubic centimetre up to infinity. For example, Jon Noring made the statement that "Quantum Mechanics predicts the energy density [of the ZPE] is on the order of an incomprehensible 10^98 ergs per cubic centimetre." Prigogine and Stengers also analysed the situation and provided estimates of the size of the ZPE ranging from 10^100 ergs per cubic centimetre up to infinity. In case this is dismissed as fanciful, Stephen M. Barnett from the University of Oxford, writing in Nature (March 22, 1990, p.289), stated: "The mysterious nature of the vacuum [is] revealed by quantum electrodynamics. It is not an empty nothing, but contains randomly fluctuating electromagnetic fields with an infinite zero-point energy." In actual practice, recent work suggests there may be an upper limit for the estimation of the ZPE at about 10^114 ergs per cubic centimetre (this upper limit is imposed by the Planck length, as discussed below).

In order to appreciate the magnitude of the ZPE in each cubic centimetre of space, consider a conservative estimate of 10^52 ergs/cc. Most people are familiar with the light bulbs with which we illuminate our houses. The one in my office is labelled as 150 watts. (A watt is defined as 10^7 ergs per second.) By comparison, our sun radiates energy at the rate of 3.8 x 10^20 watts. In our galaxy there are in excess of 100 billion stars. If we assume they all radiate at about the same intensity as our sun, then the amount of energy expended by our entire galaxy of stars shining for one million years is roughly equivalent to the energy locked up in one cubic centimetre of space.
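As a quick back-of-the-envelope check of that comparison, here is the arithmetic in Python, using only the round figures quoted in the paragraph above (the star count, the solar output figure as given there, and the 10^52 ergs/cc estimate), not independently sourced values:

Code: Select all

# Back-of-the-envelope check of the galaxy-versus-one-cubic-centimetre
# comparison, using the round figures quoted in the paragraph above.
ERGS_PER_WATT_SECOND = 1.0e7        # 1 watt = 10^7 ergs per second
solar_output_watts = 3.8e20         # solar output figure as quoted in the passage
stars_in_galaxy = 1.0e11            # "in excess of 100 billion stars"
seconds_per_year = 3.156e7
years = 1.0e6                       # one million years

galaxy_output_ergs = (solar_output_watts * ERGS_PER_WATT_SECOND
                      * stars_in_galaxy * seconds_per_year * years)

zpe_estimate_ergs_per_cc = 1.0e52   # the "conservative estimate" used above

print(f"galaxy output over one million years: {galaxy_output_ergs:.1e} ergs")
print(f"ratio to 10^52 ergs per cc:           "
      f"{galaxy_output_ergs / zpe_estimate_ergs_per_cc:.1f}")   # roughly 1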

THE "GRANULAR STRUCTURE" OF SPACE

In addition to the ZPE, there is another aspect of the physical vacuum that needs to be presented. When dealing with the vacuum, size considerations are all-important. On a large scale the physical vacuum has properties that are uniform throughout the cosmos, and seemingly smooth and featureless. However, on an atomic scale, the vacuum has been described as a "seething sea of activity" [2], or "the seething vacuum" [5]. It is in this realm of the very small that our understanding of the vacuum has increased. The size of the atom is about 10^-8 centimetres. The size of an atomic particle, such as an electron, is about 10^-13 centimetres. As the scale becomes smaller, there is a major change at the Planck length (1.616 x 10^-33 centimetres), which we will designate as L* [6]. In 1983, F. M. Pipkin and R. C. Ritter pointed out in Science (vol. 219, no. 4587) that "the Planck length is a length at which the smoothness of space breaks down, and space assumes a granular structure."

References for this part of the paper:

[1]. Timothy H. Boyer, "The Classical Vacuum", Scientific American, pp.70-78, August 1985.

[2]. Robert Matthews, "Nothing like a Vacuum", New Scientist, pp. 30-33, 25 February 1995.

[3]. Harold E. Puthoff, "Can The Vacuum Be Engineered For Spaceflight Applications? Overview Of Theory And Experiments", NASA Breakthrough Propulsion Physics Workshop, August 12-14, 1997, NASA Lewis Research Center, Cleveland, Ohio.

[4]. Harold E. Puthoff, "Everything for nothing", New Scientist, pp.36-39, 28 July 1990.

[5]. Anonymous, "Where does the zero-point energy come from?", New Scientist, p.14, 2 December 1989.

[6]. Martin Harwit, "Astrophysical Concepts", p. 513, Second Edition, Springer-Verlag, 1988.


The Setterfield response to questions about SN1987A are here:

http://www.setterfield.org/Astronomical ... supernovas

Again, Admiral, you ought to stick to areas you know something about instead of parroting others.
born again Christian, non-denominational. Young universe creationist.

tuppence
Moderators
Posts: 1017
Joined: Thu Aug 19, 2004 03:12 pm

Postby tuppence » Sun Jan 16, 2005 11:15 pm

In keeping with trying to get material in each thread to conform to the thread topic, I am deleting this post of mine from the other thread and putting it here. It actually came just before the material in the previous post:

**********

tuppence wrote:It is very easy to show a speed of light change right where you are.

Put a straw in a glass of water. It appears that the straw has disconnected somewhat at the surface of the water when you look at it from the side. That is due to a change in light speed due to the medium it is passing through.

That is EXACTLY the same reason we have been able to measure a change in c for the last several hundred years. Things are happening to the 'fabric of space' which have caused this change.

In the meantime, Planck's constant and the mass of the electron have been measured as changing in direct relation to the changes in c.

All of this has absolutely nothing to do with 'one man's crusade' -- it has to do with the data.

By the way, Admiral, it's easier to research data than to pull your hair out; usually less painful, too.

justforfun stated the evidence for evolution stands up. WHAT evidence??? I would love to see some!


You wrote, "If one man calls you an ass, ignore him. If 100 people call you an ass, buy a saddle." That is a terribly ignorant and mocking statement. It assumes truth is decided by majority rule. History certainly does not bear that out! Now, if a man I know and respect for his knowledge and honesty calls me an ass, then no matter what the hundred other folk say -- even if they think I am brilliant and right and such -- I will consider buying a saddle!

And yes, often it is laypeople arguing on forums. But I am not one and neither is my husband and we have plenty of data and documentation to back up what we say here.


Admiral, the earth is not 6000 years old; it is closer to 8000. Nor has c stopped changing: the vacillation that is occurring can now be seen in it, and in a number of other constants, now that the decay curve has flattened out pretty well. You are spending your time ignoring data, and that is doing nothing to help you at all.

Here is a little raw data for you: The three measurements of the atomic constants listed are from 1969, 1980, and 2002, in that order. The trend is then listed as up or down. I think you will find that there have been changes!


COMMITTEE ON DATA FOR SCIENCE AND TECHNOLOGY (CODATA) RECOMMENDED VALUES
The figures in parentheses are the uncertainties in the last digit(s) of each value.

Planck's Constant (x 10^-34 Joule-seconds)
1969: 6.626196 (50)
1980: 6.6260755 (40)
2002: 6.6260693 (11)
Trend: Down

Electron Mass (x 10^-31 kilograms)
1969: 9.109558 (54)
1980: 9.1093897 (54)
2002: 9.1093826 (16)
Trend: Down

Proton Mass (x 10^-27 kilograms)
1969: 1.67265392 (11)
1980: 1.67262310 (10)
2002: 1.67262171 (29)
Trend: Down

Gyromagnetic Ratio (x 10^4 seconds^-1 Tesla^-1)
1969: 26751.270 (82)
1980: 26751.5255 (8)
2002: 26751.5333 (23)
Trend: Up

Magnetic Flux Quantum (x 10^-15 Weber)
1969: 2.0678538 (69)
1980: 2.06783461 (61)
2002: 2.06783372 (18)
Trend: Down

Josephson Constant (x 10^14 Hertz/Volt)
1969: 4.8359740 (11)
1980: 4.8359767 (14)
2002: 4.83597879 (41)
Trend: Up

Electron Charge-to-Mass Ratio (x 10^11 Coulomb/kg)
1969: 1.7588028 (54)
1980: 1.75881962 (53)
2002: 1.75882012 (15)
Trend: Up

Electron Charge/h (x 10^14 Amps/Joule)
1969: 2.41798805 (12)
1980: 2.41798836 (72)
2002: 2.41798940 (21)
Trend: Up

Rydberg Constant (meters^-1)
1969: 10973731.2 (11)
1980: 10973731.5 (0)
2002: 10973731.5 (0)
Trend: Constant


In other words, Admiral, come back when you have something besides your declarations to offer.

Andreas, the possible change in the fine structure constant has nothing to do with c. One of the factors in the fine structure constant is the product hc. h, Planck's constant (see the chart above), is proportional to 1/c. Therefore hc is a constant, and no change in c or h will be reflected in any change in the fine structure constant.

The other way of putting it is that any measured changes in h or c are independent of the fine structure constant.
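To make the argument in the two paragraphs above concrete, here is a small numerical sketch (alpha is written in its standard form e^2/(2*eps0*h*c); the 10% scale factor is purely hypothetical and is only there to illustrate the claimed h-proportional-to-1/c behaviour):

Code: Select all

# Numerical sketch of the claim above: if h varies in proportion to 1/c, the
# product h*c -- and hence the fine structure constant, written here as
# alpha = e^2 / (2 * eps0 * h * c) -- does not change.
# The 10% "scale" factor below is purely hypothetical and illustrative.
e = 1.602176634e-19       # elementary charge, C
eps0 = 8.8541878128e-12   # vacuum permittivity, F/m
h = 6.62607015e-34        # Planck's constant, J s
c = 2.99792458e8          # speed of light, m/s

def alpha(h_val, c_val):
    return e**2 / (2 * eps0 * h_val * c_val)

scale = 1.10                              # hypothetical 10% change in c
print(alpha(h, c))                        # ~7.297e-3, i.e. ~1/137
print(alpha(h / scale, c * scale))        # identical, because h*c is unchanged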

Is Barry's work starting to gain acceptance? Yes. It is. He has been invited to submit work to a peer reviewed journal. This was a shock to us, albeit a very pleasant one. His paper is in the works.

Going further down (which is actually up...grin) the list of posts here, what I am seeing is just a lot of declarations, mocking, and nay-saying, without any referencing or data at all. You evolutionists, get with it! Quit with the words only stuff and get some science together! You have declared yourselves so many times that it is getting boring to read it again and again.

If we are so stupid and ignorant, you MUST have the data which proves yourselves to be correct.
born again Christian, non-denominational. Young universe creationist.

tuppence
Moderators
Posts: 1017
Joined: Thu Aug 19, 2004 03:12 pm

Postby tuppence » Sun Jan 16, 2005 11:24 pm

The following was also transferred as I am sure the person did not know what I was doing when he posted:

Mister Emu wrote:
stops changing at 1960 when more accurate lightspeed tests came about.


I realize that I probably have no right to be discussing such things (I'm in high school and know little about physics, having yet to even take the HS course on it), but why doesn't someone who wishes to disprove the c-decay work just use one of the older lightspeed tests? If it consistently shows up within the ranges that were previously recorded (with the same instruments), then you could say no decay has occurred. If, on the other hand, the reading is consistently lower (with the same instrument) than the previous one (however long ago), then could you not say with at least a modicum of certainty that c has decayed?


In response to this I would say "well done" to Mister Emu! And to let him know where to find the data, here are a couple of links:

http://www.setterfield.org/report/report.html -- this is the original invited white paper done for Stanford Research Institute International and published by Flinders University in Australia in 1987. It is a monster to get through for most of us, but if you just scroll down and catch the tables as they are found, you will find the data and where it came from.

http://www.setterfield.org/data.htm -- this is the part of the Discussion section of the Setterfield website in which the data itself is discussed.

I think that will be of help for anyone interested.
born again Christian, non-denominational. Young universe creationist.

Andreas
Sunday School Teacher
Posts: 48
Joined: Tue Dec 07, 2004 06:28 am

Re: Setterfield again

Postby Andreas » Mon Jan 17, 2005 09:55 am

tuppence wrote:Admiral, you are again showing your ignorance.

The vacuum of space is not a 'nothingness'. It has not been called a 'seething vacuum' for nothing.



Admiral Valdemar is perfectly right; this should be clear even if you stick to your husband's hypothesis. No one questions that the speed of light in matter (c') can take almost any value. It's an interesting topic, but it has nothing to do with our discussion. The speed of light in vacuum (c) is just another term for the maximum possible velocity in our universe, and this speed is the same everywhere, in matter and in vacuum. On the other hand, c = c' holds only in vacuum.
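For concreteness, the refraction effect mentioned with the straw involves the propagation speed in the medium, c' = c/n, where n is the refractive index; a tiny illustration follows (the value 1.33 for water is the usual approximate figure, not taken from this thread):

Code: Select all

# The straw illusion involves the propagation speed of light in a medium,
# c' = c / n, where n is the refractive index (about 1.33 for water).
# The vacuum value c itself is not affected by the medium.
c_vacuum = 299792.458          # km/s, value in vacuum
n_water = 1.33                 # approximate refractive index of water

c_in_water = c_vacuum / n_water
print(f"speed of light in water: about {c_in_water:,.0f} km/s")   # ~225,000 km/s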


tuppence wrote:Andreas, the possible change in the fine structure constant has nothing to do with c. One of the factors in the fine structure constant is the product hc. h, Planck's constant (see the chart above), is proportional to 1/c. Therefore hc is a constant, and no change in c or h will be reflected in any change in the fine structure constant.


A possible change of c, as discussed in the scientific community, almost always involves the fine structure constant alpha.
Your proposed change of c has nothing to do with alpha. This is one of the reasons why such a change would be unobservable. Again, I recommend that you reread the other c-decay thread and try to understand why unitless constants are so important.

tuppence wrote:Is Barry's work starting to gain acceptance? Yes. It is. He has been invited to submit work to a peer reviewed journal. This was a shock to us, albeit a very pleasant one. His paper is in the works.


Which journal? Proceedings of the Biological Society of Washington? :-)

I already recommended that you try to publish the work without the untenable conclusions, just as an interesting trend in historical speed-of-light measurements.

Andreas
Sunday School Teacher
Posts: 48
Joined: Tue Dec 07, 2004 06:28 am

Postby Andreas » Mon Jan 17, 2005 11:10 am

Mister Emu wrote:I realize that I probably have no right to be discussing such things (I'm in high school and know little about physics, having yet to even take the HS course on it), but why doesn't someone who wishes to disprove the c-decay work just use one of the older lightspeed tests? If it consistently shows up within the ranges that were previously recorded (with the same instruments), then you could say no decay has occurred. If, on the other hand, the reading is consistently lower (with the same instrument) than the previous one (however long ago), then could you not say with at least a modicum of certainty that c has decayed?



Several points.
  • No single modern observation supports his hypothesis. For most people there is no reason to wish to disprove him; they are simply not interested.
  • There is too much data, and the way it is presented does not make it easy to review.
  • It's not as easy as it seems. To interpret older measurements you have to know a lot about the older standards. They didn't have atomic clocks, for example.
  • Some of the data has already been reviewed. The reviews can be found on talkorigins and on the webpage of the Institute for Creation Research.
  • Nobody trusts his data, because it can easily be seen that his conclusions are absurd. There is no reason to assume his data is correct if his conclusions are flawed. For example, there is no way to determine the explicit curve of the decay or how c behaved over the last 6000 years. The error bars are too big.
  • There are other publications which affirm a constant speed of light.

tuppence
Moderators
Posts: 1017
Joined: Thu Aug 19, 2004 03:12 pm

Postby tuppence » Mon Jan 17, 2005 06:01 pm

Andreas, the simplest way to respond to you is that you are speaking from ignorance.

The data and the use of it were statistically defended by Lambert Dolphin, a physicist, and Alan Montgomery, a professional statistician for the Canadian government, in Galilean Electrodynamics. They have never been answered in any professional publication, and they are still waiting! Here is their article:

http://www.ldolphin.org/cdkgal.html

Further material on the data and further work with it can be found here:

http://www.setterfield.org/data.htm

In addition, above you can see that the measurements of other 'constants' in the past few years show changes that support what Barry is saying. I posted some of those above.

Your comment about the error bars being too big shows exactly how ignorant of this you are. Check the papers and material linked a few lines above, please, if you are wanting to know instead of to rant.

An atomic clock CANNOT measure the speed of light, because the speed of light is an atomic process which varies in lockstep with all other atomic processes; therefore using the atomic clock as a measuring tool is guaranteed to show all atomic 'constants' as truly constant. However, when the speed of light is determined by any other method, its changes become obvious. It is this very change, in our own time, which has produced the Pioneer anomaly. Barry has just finished an article on this, but the basics can be seen in his previous short essay here:

http://www.setterfield.org/accelanom.htm

The review on Talk Origins is absurd and Barry answered it here:
http://www.trueorigin.org/ca_bs_02.asp

The ICR article is an embarrassment to ICR: to 'show' that Barry's work was wrong, Aardsma used a graph whose scale was so out of proportion to the changes in the measurements listed that it did not appear there were any changes at all. It was like measuring daisy petals in terms of miles. If you look at the Aardsma graph and then at the measurements in the original Setterfield and Norman paper, you will see what I am talking about. The Aardsma article is nonsense.

In the meantime, going back two posts to yours, please research what the vacuum of space is. It is not nothingness. That is why I posted the material on the 'seething vacuum', which you evidently ignored. The Zero Point Energy is real, and it is measured by means of Planck's Constant, h. Please learn what you are talking about before you start talking.

And finally, I will let you know, if you like, when the article is published.

Are people interested in Barry's work? We have been invited to speak quite literally all over the world; we get daily emails regarding his work; physicists come to see Barry from many different places and are often our houseguests for as long as a week as they spend time discussing material with Barry.

Yeah, people, and professionals in the field, are quite interested. It is just those who are so wedded to their own religious evolutionary beliefs who cannot accept that he might not be a crackpot. But then, right now he is at least ahead of the acceptance Wegener got from professionals for continental drift. Wegener was mocked until he died and never had the privilege of knowing that he is now considered a hero in the field. That happened in my lifetime.

I think it is fear that begets the rabid evolution defenders. Fear that God might be real, or might be right. Fear that they might be accountable to him. This was Huxley's driving concern when defending Darwin. He was absolutely against his life being accountable to anyone, and this is why Darwin had to be right. It had nothing to do with actual science. Most of evolution doesn't have anything to do with actual science to this day. That is why they are so hysterical about protecting it from any challenges. Real science can and does thrive on such challenges.
born again Christian, non-denominational. Young universe creationist.

Andreas
Sunday School Teacher
Posts: 48
Joined: Tue Dec 07, 2004 06:28 am

Postby Andreas » Mon Jan 17, 2005 08:42 pm

tuppence wrote:The data and the use of it were statistically defended by Lambert Dolphin, a physicist, and Alan Montgomery, a professional statistician for the Canadian government, in Galilean Electrodynamics. They have never been answered in any professional publication, and they are still waiting! Here is their article:

http://www.ldolphin.org/cdkgal.html


Ok, I stand corrected. It's not true that the data is presented in a way that it cannot be reviewed easily. On Lambert Dolphin's web page there is a link to the c-decay data:
cdata.txt
So it's easy to import the data into any program and analyze it. Let's guess the result - it almost looks like the graph in Aardsma's article. No obvious decrease can be seen. Next let's run some fitting algorithms on the data:

Exponential decay: No dependence on the fitting parameters
Double exponential decay: No convergence in the fitting iterations
Line: Finally it works, the result is:

c=c0+dc*time

c0=2.9979e+05 +-0.118 km/s
dc=-6.8e-06 +-6e-5

The reduced Chi Square is 23.

constant: It works again, the result is

c=2.9979e+05 +-0.00016 km/s
with a reduced Chi Square of 23



The data can be perfectly well explained by a constant c. There might be a slight decrease in c; this decrease would mean that c 12000 years ago was higher than today's value by less than 0.1 km/s, and 13.7 billion years ago it would have had the value 3.9e+5 km/s. Of course the error of dc is much too high to justify such a calculation.
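For anyone without Igor Pro who wants to run the same kind of comparison, here is a minimal weighted-least-squares sketch in Python (NumPy). The three data rows are placeholders only and should be replaced by the (year, value, error) triples from cdata.txt; it is not a reproduction of the fit above.

Code: Select all

# Minimal weighted-least-squares sketch of the constant-versus-line comparison
# described above.  The three data rows are placeholders only; substitute the
# (year, c, error) values from the cdata.txt file linked on Dolphin's page.
import numpy as np

data = np.array([
    # year,    c (km/s),  1-sigma error (km/s)   <-- placeholder rows
    [1875.0, 299990.0, 200.0],
    [1925.0, 299798.0,  30.0],
    [1975.0, 299792.5,   0.1],
])
year, c, err = data.T

# Weighted fit to a constant, c = c0
w = 1.0 / err**2
c0_const = np.sum(w * c) / np.sum(w)
chi2_const = np.sum(((c - c0_const) / err) ** 2) / (len(c) - 1)

# Weighted fit to a line, c = c0 + dc * (year - mean year)
coeffs = np.polyfit(year - year.mean(), c, 1, w=1.0 / err)
dc, c0_line = coeffs
resid = c - np.polyval(coeffs, year - year.mean())
chi2_line = np.sum((resid / err) ** 2) / (len(c) - 2)

print(f"constant: c0 = {c0_const:.1f} km/s, reduced chi^2 = {chi2_const:.2f}")
print(f"line:     dc = {dc:+.3e} km/s per year, reduced chi^2 = {chi2_line:.2f}")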


tuppence wrote:An atomic clock CANNOT measure the speed of light, because the speed of light is an atomic process which varies in lockstep with all other atomic processes; therefore using the atomic clock as a measuring tool is guaranteed to show all atomic 'constants' as truly constant. However, when the speed of light is determined by any other method, its changes become obvious. It is this very change, in our own time, which has produced the Pioneer anomaly. Barry has just finished an article on this, but the basics can be seen in his previous short essay here:


Tuppence, not everything I write is meant as negative criticism. The atomic clock was an example of why the analysis of old data can be very hard. The wrong assumptions in the above paragraph were already addressed in the old thread.


tuppence wrote:In the meantime, going back two posts to yours, please research what the vacuum of space is. It is not nothingness. That is why I posted the material on the 'seething vacuum', which you evidently ignored. The Zero Point Energy is real, and it is measured by means of Planck's Constant, h. Please learn what you are talking about before you start talking.


Did I talk about vacuum fluctuations? I didn't notice anything wrong with the quote from your husband. I addressed your incorrect understanding of c in matter versus c in vacuum.

tuppence wrote:Yeah, people, and professionals in the field, are quite interested. It is just those who are so wedded to their own religious evolutionary beliefs who cannot accept that he might not be a crackpot. But then, right now he is at least ahead of the acceptance Wegener got from professionals for continental drift. Wegener was mocked until he died and never had the privilege of knowing that he is now considered a hero in the field. That happened in my lifetime.


Wegener used fossils to support his theory. :-)

tuppence
Moderators
Posts: 1017
Joined: Thu Aug 19, 2004 03:12 pm

Postby tuppence » Mon Jan 17, 2005 11:14 pm

Hi Andreas!

It's Barry Setterfield here. It seems that your curve-fitting routine is missing something. Did you use all 163 data points, or the 121 points that Montgomery preferred? Let me run through a couple of items. First of all, we spent considerable time on curve-fitting and data analysis at the Flinders University Maths Department during 1986-1987. We had the Professor of Statistics helping us, and he was so impressed with the data trend that he requested that Trevor Norman and I give a seminar on the matter to the whole Maths Department.

However, it was not just us. Dr. David Malcolm, a researcher from Newcastle University, did his own analysis. He commented that any linear decay gave a better fit than the assumption of a constant speed of light: the data residuals dropped from over 22,000 with a constant light speed to under 2000 with a linear decay scenario.

I suggest that you also seriously examine the two analyses by Montgomery and Dolphin linked on our website. There they show conclusively that there is a statistically significant decay. Unfortunately, their graphs are not displayed at the links on our website, or you would see the force of their argument.

Let me come at this another way. Let us take the aberration method of measurement. In this case, we have a series of 63 determinations made up of hundreds of individual measurements taken from Pulkova Observatory using the same instruments over the period from 1740 to 1940. The period 1765 +/- 25 years gave an average value for lightspeed of about 300,555 km/s or 763 km/s above its current value. In 1865 +/- 25 years, the mean value obtained was 299,942 km/s, which was 150 km/s above its current value. In 1915 +/- 25 years the results were averaging 299,812 km/s or 20 km/s above its current value. Here are the same instruments being used over a 200 year period. Therefore, this is not instrumentation improvement or changes in technique. A very real decay has been measured.

A simple, brief analysis of this data set gives a t-statistic which indicates that the hypothesis that c has been constant can be rejected at the 93.9% confidence level. A least-squares linear fit to the data gives a decay of 4.83 km/s per year, with the decay correlation r = -0.947 accepted at the 99% confidence level. Furthermore, these data reveal (as do others) that the decay is non-linear and flattening out. In addition, the method has a built-in systematic error which causes all values obtained to be systematically low compared with the actual value pertaining at the time.
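For readers who want to see how such a trend-significance figure is computed, here is a minimal sketch using SciPy. It uses only the three period means quoted above, not the 63 individual Pulkova determinations, so it is illustrative rather than a reproduction of the analysis described here.

Code: Select all

# Sketch of a slope/correlation significance test for a c-versus-year trend.
# Only the three period means quoted in the post are used here; a full
# analysis would use all 63 individual Pulkova determinations.
import numpy as np
from scipy import stats

year = np.array([1765.0, 1865.0, 1915.0])          # centres of the quoted periods
c_km_s = np.array([300555.0, 299942.0, 299812.0])  # quoted mean values, km/s

res = stats.linregress(year, c_km_s)
print(f"slope = {res.slope:.2f} km/s per year")   # about -5 for these three means
print(f"r     = {res.rvalue:.3f}")                # strongly negative correlation
print(f"p     = {res.pvalue:.3f}")                # two-sided p-value for zero slope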

In all, there have been 16 methods used to measure the speed of light, c. Each method, as a class of observations, has shown the decay in c. Furthermore, on every occasion that c has been measured by the same equipment at a later date, a lower value for c has been obtained. Each of Michelson's determinations revealed a decline from the previous values. The first two were done with the same equipment, and the last two were done with the same equipment, and a significant decay was registered each time the same equipment was used.

The decline in the measured value of c was the source of comment in the scientific press. Writing in the scientific journal Nature on 4th April 1931, p. 522, Gheury de Bray stated: "If the velocity of light is constant, how is it that, INVARIABLY, new determinations give values which are lower than the last one obtained... There are twenty-two coincidences in favour of a decrease in the velocity of light, while there is not a single one against it" (his emphasis).

I would like to suggest, Andreas, that it would be profitable to examine all the evidence in detail with an open mind before coming to some final conclusion. After all, the most rapid way to progress towards the truth of any situation in science is to examine the areas where there is a discrepancy or an anomaly between data and theory. This is probably not the method that will make you the most popular, but it is the method whereby science is assured of progress rather than stagnation.
born again Christian, non-denominational. Young universe creationist.

Andreas
Sunday School Teacher
Posts: 48
Joined: Tue Dec 07, 2004 06:28 am

Postby Andreas » Tue Jan 18, 2005 07:23 pm

Hi Barry

tuppence wrote:It's Barry Setterfield here. It seems that your curve-fitting routine is missing something. Did you use all 163 data points, or the 121 points that Montgomery preferred? Let me run through a couple of items. First of all, we spent considerable time on curve-fitting and data analysis at the Flinders University Maths Department during 1986-1987. We had the Professor of Statistics helping us, and he was so impressed with the data trend that he requested that Trevor Norman and I give a seminar on the matter to the whole Maths Department.


I fitted both data sets and there were only small differences between the results. The values in my last post were derived from the larger data set. I neglected the value from 1675, because its error is missing, as well as the following values:
Auwers (Wanstead) 1727-47
Busch 1727-47
Bessel (GO) 1750-54
Peters 1750-54

since in those cases the table wasn't clear about the correct date.

For the fitting I used a commercial scientific data analysis software (Igor Pro). There is a demo version available, if anybody wants to redo the fit.

For data with such a high error the result is as clear as it can be. There is no way to deduce from these data a variable c of the magnitude you propose.

tuppence wrote:Let me come at this another way. Let us take the aberration method of measurement. In this case, we have a series of 63 determinations made up of hundreds of individual measurements taken from Pulkova Observatory using the same instruments over the period from 1740 to 1940. The period 1765 +/- 25 years gave an average value for lightspeed of about 300,555 km/s or 763 km/s above its current value. In 1865 +/- 25 years, the mean value obtained was 299,942 km/s, which was 150 km/s above its current value. In 1915 +/- 25 years the results were averaging 299,812 km/s or 20 km/s above its current value. Here are the same instruments being used over a 200 year period. Therefore, this is not instrumentation improvement or changes in technique. A very real decay has been measured.


There is no reason to divide the data arbitrarily into different time periods. Looking at the graph of the Pulkova subset, there is no obvious decay; it looks like random noise around a mean value. The best linear fit gives a 6 km/s per year decay with a reduced Chi Square of around 3.5 and would be in agreement with your hypothesis. If you combine the Pulkova data with the rest, however, this linear decay becomes much too high.
Fitting with a constant gives a Chi Square of 5, which is slightly worse but would also be in agreement with the data of the subset.
It's worth mentioning that, in general, small subsets with a high error are very sensitive to which single points you include or exclude in the fit. For example, you can get more or less any value for a c decay between 0 and +/-20 km/s per year just by removing a few points.

DRPHYSICS
New Convert
Posts: 4
Joined: Sun Jan 30, 2005 10:54 pm

Postby DRPHYSICS » Mon Jan 31, 2005 12:53 am

Tuppence wrote:
Here is a little raw data for you: The three measurements of the atomic constants listed are from 1969, 1980, and 2002, in that order. The trend is then listed as up or down. I think you will find that there have been changes!


COMMITTEE ON DATA FOR SCIENCE AND TECHNOLOGY (CODATA) RECOMMENDED VALUES
The figures in parentheses are the uncertainties in the last digit(s) of each value.

Planck's Constant (x 10^-34 Joule-seconds)
1969: 6.626196 (50)
1980: 6.6260755 (40)
2002: 6.6260693 (11)


Two problems: First, those are NOT raw data. They are recommended values at different times, encompassing experimental results up to those times. That is, the later values take into account the data used in earlier recommendations as well as data obtained after the earlier recommendations. The analysis assumes constancy of Planck's Constant over time. (If that assumption of constancy is in error, then the recommended values themselves are without basis.) If the trend in recommended values is real (it's pretty small, if it is real), it might at most suggest a time-variation of Planck's Constant.

More amusing, though, is combining this apparent "trend" in Planck's constant with any trend in the speed of light over the same period. Setterfield claims that although h and c vary with time, their product is strictly constant over time.

Now if we take these data for Planck's constant we find a relative decrease of about 2 parts in 100,000. If the product hc has been constant over this time interval, then we would have expected to observe (since the precision of c-measurements is much greater than this) a relative increase in c of approximately similar magnitude. That is, we should have seen a change in c of about 6 km/sec. We haven't.
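For what it's worth, the arithmetic behind that 6 km/sec figure is easy to check. Here is a quick sketch using the 1969 and 2002 recommended h values quoted earlier in the thread:

Code: Select all

# Check of the ~6 km/s figure: relative drop in the recommended h values,
# applied to c under Setterfield's stated assumption that h*c stays constant.
h_1969 = 6.626196e-34     # 1969 recommended value quoted above, in J s
h_2002 = 6.6260693e-34    # 2002 recommended value quoted above
c_now = 299792.458        # km/s

rel_drop_h = (h_1969 - h_2002) / h_1969          # ~1.9e-5, i.e. ~2 parts in 100,000
implied_rise_c = rel_drop_h * c_now              # rise in c needed to keep h*c constant

print(f"relative drop in h: {rel_drop_h:.2e}")
print(f"implied rise in c : {implied_rise_c:.1f} km/s")   # roughly 6 km/s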

