> On the Sunday Feynman was up at his usual hour (nine a.m.), and we went down to the physics building, where he gave me another two-hour lecture of miscellaneous discoveries of his. One of these was a deduction of Maxwell’s equations of the electromagnetic field from the basic principles of quantum theory, a thing which baffles everybody including Feynman, because it ought not to be possible.
Physics stackexchange discussion of this [1].
That includes a link to a 2001 paper, "Feynman's derivation of Maxwell equations and extra dimensions" [2].
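As a quick sketch of what the derivation assumes (following Dyson's 1990 Am. J. Phys. write-up of Feynman's argument; units chosen so factors of charge and c are absorbed into the fields):

```latex
% Feynman's starting point: one nonrelativistic particle obeying
% Newton's law, plus two quantum commutation relations.
m \ddot{x}_j = F_j(x, \dot{x}, t), \qquad
[x_j, x_k] = 0, \qquad
m\,[x_j, \dot{x}_k] = i\hbar\,\delta_{jk}
% From these alone it follows that the force must take the Lorentz form
F_j = E_j + \varepsilon_{jkl}\,\dot{x}_k B_l
% and that the fields obey the two homogeneous Maxwell equations
\nabla \cdot \mathbf{B} = 0, \qquad
\nabla \times \mathbf{E} + \frac{\partial \mathbf{B}}{\partial t} = 0
```

The surprise Dyson points to is that the Lorentz force law and half of Maxwell's equations fall out of assumptions that never mention fields at all.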
Murray Gell-Mann described Feynman's method completely: first, he wrote down the question; then he thought very hard; then he wrote down the answer. Feynman was acutely aware of the very different ways in which people internally model things - something that was, no doubt, heavily reinforced by the initial reception of Feynman diagrams at the Shelter Island Conference - so he didn't place any special importance on his own internal "sketching" models when describing things to others. (If anything, he probably viewed them about the same way he viewed Maxwell's mechanical model of electromagnetism - a useful thinking tool for someone who happened to think best in a particular mode, but ultimately nonsensical.) If he couldn't think of at least a couple of alternative ways of coming to the same answer, he usually left it as a sort of exercise for the student.
"first, he wrote down the question, then he thought very hard, then he wrote down the answer."
this is like:
Step 1: Start
Step 2: ???
Step 3: PROFIT!
" he didn't place any special importance on his own internal "sketching" models when describing things to others."
So the answer was more like: Feynman's priority was to show the most concise derivation, not to show the path he stumbled around to arrive at a result.
Indeed, I believe most people, once they have a rough path to a result, immediately try to reduce it to the crux of the argument before presentation; all the unnecessary assumptions are trimmed off, just as a mathematician tries to find a proof that relies on the fewest axioms.
Sometimes I see people explain how they came up with their result, but I suspect even these detours are often substantially cleaned up before presentation. Others simply no longer remember the original thought process, either because time has passed or because, while working out the long-winded original path, the simpler shortcut simply hit them. That instant high of a realization also tends to pollute your memory of stumbling towards it.
Another problem is that big problems often sit in the back of a person's mind for a long time. Before a solution materializes, small insights or even vague associations (which may be separated by years) ultimately combine, so a sudden insight is often just the last piece, with some of the supporting lemmas never disclosed before (as they seemed of no value when those smaller insights were conceived).
Perhaps there is a text somewhere, where Feynman explains how he originally arrived at it, but if so I haven't seen it. I too would like to see it if it is known or ever turns up...
The model you're talking about here is a little different from what I had in mind. I'm more interested in the succession of ideas, motivations, new data, insights, etc. that caused him to 1) decide that the problem might be worth looking into in the first place and 2) choose various approaches in the course of solving the problem. So any particular 'model' might just be one step in this larger activity.
> In the evening I mentioned that there were just two problems for which the finiteness of the theory remained to be established; both problems are well-known and feared by physicists, since many long and difficult papers running to fifty pages and more have been written about them, trying unsuccessfully to make the older theories give sensible answers to them. When I mentioned this fact, Feynman said, “We’ll see about this,” and proceeded to sit down and in two hours, before our eyes, obtain finite and sensible answers to both problems. It was the most amazing piece of lightning calculation I have ever witnessed, and the results prove, apart from some unforeseen complication, the consistency of the whole theory. The two problems were the scattering of light by an electric field, and the scattering of light by light.
I'm familiar with Compton scattering (I did the experiment), but that is scattering from a charge(d particle). I would have expected Dyson to write "light scattering from an electric charge" or perhaps "from an electron" if he meant Compton scattering.
I still read Dyson's sentence as light scattering from an electrostatic field (it's not clear whether it would scatter from a homogeneous field, or only from gradients in the electric field).
Could others please chime in, and either confirm SiempreViernes's response or enlighten us both?
EDIT:
Could this refer to pair production?
The electric field of the highly positive nucleus affecting the virtual electron-positron pair before it collapses and re-emits the scattered photon?
I don't think pair production makes more sense, it's probably Compton scattering because all QFT interactions using Feynman diagrams are expressed as particle interactions.
Why would such interactions not be expressible as particle interactions?
First, draw a Feynman diagram of an incoming photon that splits into an electron and a positron, which recombine into an outgoing photon (an electron loop in the middle of the diagram).
Then, at the bottom, add an incoming proton that exchanges a photon with either the electron or the positron of the electron loop...
I'm not saying pair production can't be expressed as a particle interaction; I'm saying all interactions with fields in QFT are expressed as particle interactions, so photon electron scattering is a perfectly valid interpretation of
"light scattering from an electric charge"
Thinking about it more, I have a hard time seeing the conversion of a photon into two "electrons" fairly being termed "scattering".
> so photon electron scattering is a perfectly valid interpretation of "light scattering from an electric charge"
But look up: you misquote Dyson. Dyson didn't write "light scattering from an electric charge"; he wrote "the scattering of light by an electric field".
You also misread me:
1) A photon cannot be converted into two "electrons", and I did not write that (though in the past positrons were called positive electrons).
2) The electron-positron pair (produced from a single photon, internal to the Feynman diagram) exists only as an intermediate state, but its members are charges and so are influenced by a strong electric field (say, that of a nucleus) before annihilating back into a single photon.
3) The incoming particles are a photon and a nucleus, and the outgoing particles are again a photon and a nucleus; this is obviously scattering.
This is different from Compton scattering (scattering of light from a practically free electron), or its low-energy limit, Thomson scattering; in the case of scattering from atoms, the energy is still high enough to eject an electron.
It is also different from coherent Compton scattering, which is scattering of light from an atom, or rather from an electron that remains bound to the atom, such that the Compton wavelength of the atom applies instead of the Compton wavelength of the electron:
"Compton found that some X-rays experienced no wavelength shift despite being scattered through large angles; in each of these cases the photon failed to eject an electron.[5] Thus the magnitude of the shift is related not to the Compton wavelength of the electron, but to the Compton wavelength of the entire atom, which can be upwards of 10000 times smaller. This is known as "coherent" scattering off the entire atom since the atom remains intact, gaining no internal excitation." (From Wikipedia page on Compton scattering)
I still interpret Dyson's "the scattering of light by an electric field" as probably referring to the Meitner-Hupfeld effect.
> On the third day of the journey a remarkable thing happened; going into a sort of semistupor as one does after forty-eight hours of bus riding, I began to think very hard about physics, and particularly about the rival radiation theories of Schwinger and Feynman. Gradually my thoughts grew more coherent, and before I knew where I was, I had solved the problem that had been in the back of my mind all this year, which was to prove the equivalence of the two theories. Moreover, since each of the two theories is superior in certain features, the proof of equivalence furnished a new form of the Schwinger theory which combines the advantages of both. This piece of work is neither difficult nor particularly clever, but it is undeniably important if nobody else has done it in the meantime. I became quite excited over it when I reached Chicago and sent off a letter to Bethe announcing the triumph. I have not had time yet to write it down properly, but I am intending as soon as possible to write a formal paper and get it published. This is a tremendous piece of luck for me, coming at the time it does. I shall now encounter Oppenheimer with something to say which will interest him, and so I shall hope to gain at once some share of his attention. It is strange the way ideas come when they are needed. I remember it was the same with the idea for my Trinity Fellowship thesis.
> My tremendous luck was to be the only person who had spent six months listening to Feynman expounding his new ideas at Cornell and then spent six weeks listening to Schwinger expounding his new ideas in Ann Arbor. They were both explaining the same experiments, which measure radiation interacting with atoms and electrons. But the two ways of explaining the experiments looked totally different, Feynman drawing little pictures and Schwinger writing down complicated equations. The flash of illumination on the Greyhound bus gave me the connection between the two explanations, allowing me to translate one into the other.
Dyson is obviously brilliant, and I'm sure he's being humble to a fault here, but it's amazing how valuable being in the right place at the right time is. This piece of work brought Dyson to fame:
> Oppenheimer rewarded Dyson with a lifetime appointment at the Institute for Advanced Study, "for proving me wrong", in Oppenheimer's words.[0]
Actually, this conference largely represented a previous generation of scientific thinkers. According to their Wikipedia bios, Niels Bohr was the only one of those 10 names involved in the Manhattan Project. Einstein wrote a letter alerting President Roosevelt to the possibility of a German atomic bomb, which may have helped inspire the project. Heisenberg was also involved in the war effort but "knew little of the Manhattan Project, so, if he were captured, he would have little intelligence value to the Germans".
You quote -- without attribution -- Wikipedia, and you get the antecedent wrong.
I'll italicize the exact words you quote, after their preceding sentence, in the Werner Heisenberg article on Wikipedia:
"Goudsmit was selected for this task because he had physics knowledge, he spoke German, and he personally knew a number of the German scientists working on the German nuclear energy project. He also knew little of the Manhattan Project, so, if he were captured, he would have little intelligence value to the Germans."
Note that the "he" in the second sentence is Goudsmit, not Heisenberg.
Heisenberg was, according to Meitner, straightforwardly a supporter of the Nazi regime. Whether or not he was ideologically committed to the party, Heisenberg was most definitely involved in the war effort -- on the German side! [ https://www.nytimes.com/2002/01/07/us/letter-may-solve-nazi-... -- first sentence, "The leader of the Nazi atomic bomb program, Werner Heisenberg, revealed its existence ..." a paragraph later: "Heisenberg never expressed moral qualms about building a bomb for Hitler or hinted that he might be willing to sabotage the project, the documents reveal" with a follow-up here https://www.nytimes.com/2002/02/07/world/new-twist-on-physic... ].
Thanks for this correction! I had hurriedly checked Wikipedia bios for each participant and completely missed the main point of Heisenberg's wartime activities.
I think the Internet has two competing effects here:
1. Ideas and inventions disseminate quickly and become part of open source knowledge almost immediately if useful, and when people do learn of a new idea they are unlikely to know who invented it;
and
2. Because of this openness of information and 'democratization of invention,' there are fewer recognized centers of invention (mostly being universities and special projects teams working on mostly unpublished work at big tech companies), and so great minds are less likely to gravitate to the same spot.
Another consideration here is that the hottest field is software, where innovation suffers from the open-sourcing/attribution problem to a greater degree than any other field.
>>software, where innovation suffers from the open-sourcing/attribution problem to a greater degree than any other field.
Suffers? What a curious choice of word! I would claim quite the contrary, where innovation flourishes due to the ease and speed of dissemination of ideas.
Can you explain why you think innovation is dampened?
Sentence structure could have been better -- I think the Internet is a net positive for innovation, but in this particular regard (aggregating great minds to get concentrated breakthroughs, and being recognized for those breakthroughs), it is dampened.
For theoretical computer science, Berkeley in the 80s and 90s was a fantastic place: the foundations of cryptography and complexity theory were developed there. A sample: Goldwasser, Micali, Blum, Impagliazzo, Rudich, Arora, Sudan, Karp, the Vaziranis, Rubinfeld, Naor, Sipser, and more. These are the who's who of cryptography and complexity theory, and they all overlapped in time over 1982-92.
I think there are plenty of teams of this caliber out there. They're just working on less impactful or more stubborn projects. If Bell Labs hadn't existed, I believe it wouldn't have been long before most of the breakthroughs that came from there came from other places instead.
Really? Just off the top of my head, here are some of the people that worked at Bell Labs:
Claude Shannon, Richard Hamming, William Shockley, Ken Thompson, Dennis Ritchie, John Hopcroft & Brian Kernighan. (I am probably forgetting several Nobel prize and Turing award recipients)
Some people at the time felt that C and Unix were a setback to computing compared with Lisp, in the same way that Microsoft Windows was a setback compared to Unix. Bell Labs is the canonical example of "Worse is Better", also called the "New Jersey Philosophy".
One could also argue that both Windows and Unix were necessary phases for general computing given limited hardware resources, and only now is the stateless functional paradise envisioned at MIT in the 1960's actually possible.
Ken Thompson, Dennis Ritchie, and Brian Kernighan, on the other hand, aren't of the same caliber as people like Shannon and Feynman (and the Manhattan project all-stars).
They've had a huge influence on programming, and on IT, but not always for the better from an academic and quality perspective (e.g. Worse is Better), and more as pragmatic hackers and tinkerers than deep thinkers.
While their practical achievements have had a tremendous effect on a specific level of a revolutionary industry, von Neumann alone was smarter and more influential than any three of them combined (maybe excluding Shannon).
To be fair that's also true of almost any three people involved in the Manhattan Project too.
Not a maybe for Shannon: he is still having a tremendous impact. Information theory, as developed by Shannon, is so fundamental that we forget about it these days. But it is big.
I could argue for Shockley and Hamming, but they were more on the engineering side.
Did they work there at the same time? What were the respective team sizes?
I get the feeling the Manhattan Project is pretty high up with regard to Nobel Prizes per scientist employed, at least if you count only the research team at the Los Alamos site (the total workforce was in the six digits, IIRC, because of the "computers", military staff, and all the physical work of enrichment and metalworking).
I think there's one school in Budapest that might have more laureates among an even smaller student body. Coincidentally (or not), there's a rather large overlap between that group and Los Alamos.
This is almost certainly true. Breakthroughs in science, mathematics, technology, and engineering occur when the conditions are ripe; if you look into every major famous breakthrough, you find that it was being duplicated or research along similar lines being pursued elsewhere in a very close timeframe. Geniuses are fungible.
The events around the founding of America brought together a group of people of staggering intellect and morality, definitely unrivaled. (Although their efforts ultimately failed, as predicted by one of them.)
I don't think that people in the past were smarter, so I would think that at places like CERN you have the same level of talent, but information gets out much more easily these days, so it's hard to stay ahead. The Internet levels the playing field a lot.
Here's a paper Dyson wrote about it in 1989 [3].
That was discussed here on HN [4].
[1] https://physics.stackexchange.com/questions/391744/does-feyn...
[2] https://arxiv.org/abs/hep-ph/0106235
[3] http://fermatslibrary.com/s/feynmans-proof-of-the-maxwell-eq...
[4] https://news.ycombinator.com/item?id=11067435