Second Great Depression?


  • #46
    Re: Second Great Depression?

    Originally posted by ASH View Post
    Perhaps it was the Fed, under Greenspan, blunting the economic cycle with interest rate policy. Perhaps it had something to do with the changes in reserve fraction requirements that happened in the early 90's.

    Still, to return to an earlier point, the factors which affect the timing of the cycle change on a time scale much, much shorter than 250 years. It's frankly implausible that any economic pattern you might identify on that sort of time scale will be useful for timing (except the trivial case of things with a 1-year period).

    So much to study - so little time. I've seen various video demos on fractional reserve lending. Is there anything good on the who-why-where as to its origin/motive?



    • #47
      Re: Second Great Depression?

      Originally posted by strittmatter View Post
      So much to study - so little time. I've seen various video demos on fractional reserve lending. Is there anything good on the who-why-where as to its origin/motive?
      I think the origin/motive is pretty straight-forward: fractional reserve lending allows a bank to collect interest on a larger pool of loans than could be generated from its capital using one-to-one lending. It also allows much faster credit creation, which can be helpful for increasing the pace of economic activity. So, from the public standpoint, it amounts to cheaper credit; from the bank's standpoint, it's simple profit motive. If you add a layer of regulation and central banking, you can even make it fairly stable for long spans of time.
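      The arithmetic behind that profit motive is the textbook money multiplier: each loan gets spent, re-deposited, and re-lent, minus the required reserve. A minimal sketch (the 10% reserve ratio and the function name are illustrative, not from the thread):

```python
# Textbook money-multiplier sketch: each deposit is re-lent minus the
# required reserve, so total credit created far exceeds the original
# deposit. The 10% reserve ratio below is illustrative.

def total_credit(initial_deposit, reserve_ratio, rounds=100):
    """Sum the chain of loans created by repeated re-depositing."""
    total, deposit = 0.0, initial_deposit
    for _ in range(rounds):
        loan = deposit * (1 - reserve_ratio)  # lend out all but the reserve
        total += loan
        deposit = loan  # the loan is spent and re-deposited elsewhere
    return total

# Converges toward initial_deposit * (1 - r) / r = 9000 as rounds grow.
print(round(total_credit(1000, 0.10), 2))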

      Have you read Aaron Krowne's guest article What (Really) Happened in 1995? It doesn't talk about the ancient history of fractional reserve lending, but it does talk about the changes to fractional reserve requirements in the early 90's.



      • #48
        Re: Second Great Depression?

        Originally posted by ASH View Post
        I think a lot of the useful reading is probably in the free sections of the site, although I wouldn't discourage you from subscribing. Try reading the articles which are linked in the lower-right corner of the main iTulip web page (in sections such as "Economics" and "Financial Markets", etc.). I think those are free, and they span a lot of the analysis behind EJ's interpretation of the present economic events.
        Thanks again. I went to the main page and looked. There is plenty to check out. I'll see what I can find/learn.

        Originally posted by Ash
        Perhaps it was the Fed, under Greenspan, blunting the economic cycle with interest rate policy. Perhaps it had something to do with the changes in reserve fraction requirements that happened in the early 90's.

        Still, to return to an earlier point, the factors which affect the timing of the cycle change on a time scale much, much shorter than 250 years. It's frankly implausible that any economic pattern you might identify on that sort of time scale will be useful for timing (except the trivial case of things with a 1-year period).
        Yes, it seems that intervention would have an effect on the pattern. So the current efforts are sure to make where we are headed even less knowable. I guess that is why there is so much uncertainty.

        I hear what you are saying regarding timing, and it makes sense to me. Kondratieff himself admitted that his cycle could vary between 45 and 60 years.



        • #49
          Re: Second Great Depression?

          Very interesting ASH.

          What about quasiperiodic crystals?

          (Somebody please stop us!)
          It's Economics vs Thermodynamics. Thermodynamics wins.



          • #50
            Re: Second Great Depression?

            I just saw your comment in another forum about "cross me on the physical sciences and...", which made me curious how you had responded. Looks like you were true to your word.

            Well, since you start your tome with your "creds," I will do the same. BS cum laude in physics, MS in comp sci, PhD in computational biology. I also happen to have a brother who is a physicist that hangs out with people like Stephen Hawking, and does string theory for a living at a prestigious theoretical physics institute. I've done a lot of reading about the nature of quantum reality and quantum computing, and in fact am now working on a book on this subject.

            Your explanation of quantum mechanics is from a purely engineering point of view, which uses math to claim something about reality. While the math is a tool that can reflect the underlying reality (sometimes correctly, sometimes only as an approximation), it does not explain it. It is merely a tool that can allow one to compute some useful things. Hence, when you use that math to make the claim that a particle is truly localized, you are making an incorrect claim. The FFT is just a mathematical tool that has the property that a localized point in space is distributed in frequency, and vice-versa. And an eigenstate isn't reality, it is a mathematical tool that reflects some small portion of reality. So it has a nice property that mirrors some underlying principle. It does not explain it. And, in fact, a particle is not localized, unless it is measured or has an interaction with another particle.

            What I said about quantum coherence and collapse is correct. It is directly linked to the mass of a particle. An electron can "interfere" with itself, in other words, travel two paths as a wave, then interact with a surface as a particle (e.g. a two-slit interference experiment). The more massive the particle, the less of this tendency it exhibits, the more it acts like a classical particle. But, the key point is, a single particle will interfere with itself - there is no explanation for this that fits with your claim that it exists at a specific location even before measurement. It does not; it exists as a wave, which is inherently non-local. And then, somehow, it makes the transition from wave-nature to particle-nature, which is what I was referring to as the coherence collapse.
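            The single-particle interference described above can be sketched numerically: add the complex amplitudes for the two paths, and the detection probability oscillates in a way that no "it took one definite path" story can reproduce. A toy calculation (all the geometry numbers are made up for illustration):

```python
import cmath, math

# Two-path interference: a single particle's amplitude is the SUM over
# both paths, so the detection probability carries a cross term that a
# classical "it went one way or the other" mixture cannot produce.
# Slit separation, wavelength, and screen distance are illustrative.

def prob_at(x, slit_sep=1.0, wavelength=0.1, screen_dist=10.0):
    """Unnormalized detection probability at screen position x."""
    r1 = math.hypot(screen_dist, x - slit_sep / 2)  # path via slit 1
    r2 = math.hypot(screen_dist, x + slit_sep / 2)  # path via slit 2
    amp = (cmath.exp(2j * math.pi * r1 / wavelength)
           + cmath.exp(2j * math.pi * r2 / wavelength))
    return abs(amp) ** 2

samples = [prob_at(x / 100) for x in range(200)]
# Quantum: oscillates between ~0 (dark fringe) and ~4 (bright fringe).
# A classical "slit 1 or slit 2" mixture would give a flat ~2.
print(round(min(samples), 3), round(max(samples), 3))
```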

            Significantly, there is still the unresolved issue of when and how the choice between the wave and particle state occurs. Schroedinger's cat has been debated for many years, because there is a fundamental problem: one must make a measurement before a particle (photon, electron, or whatever...) will be found in a particular state (and hence, particular position/momentum).

            This is not solved by some simple little math trick. There is truly (apparent) randomness that emanates from the quantum world in this decision-making process. This (apparent) randomness is not computable (or, at least, nobody has figured out how to compute it).

            That's why effects like quantum tunneling can occur. But more importantly, it is why one can theoretically build a quantum computer that will break any cryptographic code. The system is maintained in a superposition of states (coherence) such that multiple solutions to the problem are examined simultaneously. The particles of the system exist not in any single state, but in multiple states. If the system is set up properly, then the collapse of coherence is biased towards the solution to the problem. By its nature, such a system is non-local - it is coherent.

            If one could just model the above process with simple linear algebra (or FFT's, or whatever...), then one could use it to build a computer that would model that world, and hence break any code. But one cannot. That is because the quantum world has properties that are unique and non-local, and it is only once coherence has collapsed that its properties become localized.

            Last but not least, you say that to localize something, "just measure it." What is a measurement? Please define that for me.

            For example, you mention a phosphor screen to measure electrons. But, an electron that travels towards a phosphor screen can tunnel through it and never interact with it. That is something a classical particle cannot do. Instead, there is some probability, described by Schroedinger's eqn, that the electron will interact with the screen at particular points. The thicker the screen, the more likely the electron will be stopped, and hence "measured".

            But, the problem of measurement is not a trivial one, and if it were as simple as you make it sound, then why have very smart physicists been debating about it since wave/particle duality was first discovered?

            Originally posted by ASH View Post
            I'm afraid I have to disagree with you, since this is my area of professional expertise.

            My point regarding the Heisenberg uncertainty principle is unrelated to my point about x-ray diffraction. A particle with an exact location is in an eigenstate of position: a point in real space. A harmonic plane wave (in space) with an exact wavelength is in an eigenstate of momentum: a point in Fourier space. The incompatibility between position and momentum, and the Heisenberg uncertainty principle, simply fall out of Fourier analysis. If you apply Fourier analysis to find the spatial frequency components of a delta function in position (the point-like particle) you get a white spectrum (completely undefined momentum); conversely, the real-space image of a point in Fourier space is a wave that extends through all space (completely undefined position). The same relationship applies when the domain is time rather than space, leading to the energy-time Heisenberg principle. Fourier analysis can also be used to prove that a Gaussian wave packet is the minimum-uncertainty wave function, for which:
            delta-x * delta-p >= h-bar/2

            which is the quantitative form of the uncertainty principle (for space and momentum).

            My comment about x-ray diffraction stands. X-rays diffract from a regular lattice of atoms if the incoming x-ray's wave vector matches the periodicity of the crystal. One is essentially testing for periodicity (taking a Fourier transform) by throwing waves at the crystal. This is why solid state physicists spend all their time working with the "reciprocal lattice", which is the Fourier transform of the crystal. If you want to compare wave phenomena to the spatial periodicity of the crystal, you move everything to Fourier space, where both the waves and the periodicity of the crystal are points.

            What you wrote about quantum coherence, the Schroedinger equation, and "collapse into a classical state" isn't correct. In fact, the statement "the lower the mass of the particle, the more non-local it becomes" is not technically correct and misses the point. The de Broglie wavelength may be shorter for higher-mass particles, but with regard to nonlocality, it's really all about eigenstates versus superpositions, and measurement. If you have two conjugate variables which don't commute, you can't have simultaneous eigenstates of both. You can put the system into an eigenstate of either with the appropriate measurement, and you can have a superposition state in which both variables are indeterminate, but there's never anything "classical" about it. As far as delocalization goes, that has nothing to do with mass. That only has to do with whether you are forcing the system into an eigenstate of position or not by measurement. Photons, which have no rest mass, can be localized just as easily as electrons. All you have to do is attempt to measure their position -- for instance with a phosphor screen or a PMT. If you want to delocalize something, then measure its momentum.
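            The Fourier argument in the quoted passage is easy to check numerically: the discrete transform of a delta function (the "eigenstate of position") has a perfectly flat magnitude spectrum, i.e. completely undefined momentum. A small sketch using a naive DFT (the grid size N is arbitrary):

```python
import cmath, math

# Discrete analogue of the Fourier point above: a delta function in
# "position" (all amplitude at one grid point) transforms to a flat
# ("white") spectrum, i.e. completely undefined momentum.

def dft(x):
    """Naive discrete Fourier transform of a complex sequence."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * math.pi * k * n / N)
                for n in range(N))
            for k in range(N)]

N = 64
delta = [0.0] * N
delta[0] = 1.0  # the "point particle": an eigenstate of position

mags = [abs(c) for c in dft(delta)]
print(max(mags) - min(mags))  # prints 0.0: every momentum component equal
```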



            • #51
              Re: Second Great Depression?

              Where else but iTulip can you find people working on a quantum theory of economic Depressions?

              That's why I like it here.



              • #52
                Re: Second Great Depression?

                Originally posted by mcgurme View Post
                I just saw your comment in another forum about "cross me on the physical sciences and...", which made me curious how you had responded. Looks like you were true to your word.

                Well, since you start your tome with your "creds," I will do the same. BS cum laude in physics, MS in comp sci, PhD in computational biology. I also happen to have a brother who is a physicist that hangs out with people like Stephen Hawking, and does string theory for a living at a prestigious theoretical physics institute. I've done a lot of reading about the nature of quantum reality and quantum computing, and in fact am now working on a book on this subject.

                Your explanation of quantum mechanics is from a purely engineering point of view, which uses math to claim something about reality. While the math is a tool that can reflect the underlying reality (sometimes correctly, sometimes only as an approximation), it does not explain it. It is merely a tool that can allow one to compute some useful things. Hence, when you use that math to make the claim that a particle is truly localized, you are making an incorrect claim. The FFT is just a mathematical tool that has the property that a localized point in space is distributed in frequency, and vice-versa. And an eigenstate isn't reality, it is a mathematical tool that reflects some small portion of reality. So it has a nice property that mirrors some underlying principle. It does not explain it. And, in fact, a particle is not localized, unless it is measured or has an interaction with another particle.

                What I said about quantum coherence and collapse is correct. It is directly linked to the mass of a particle. An electron can "interfere" with itself, in other words, travel two paths as a wave, then interact with a surface as a particle (e.g. a two-slit interference experiment). The more massive the particle, the less of this tendency it exhibits, the more it acts like a classical particle. But, the key point is, a single particle will interfere with itself - there is no explanation for this that fits with your claim that it exists at a specific location even before measurement. It does not; it exists as a wave, which is inherently non-local. And then, somehow, it makes the transition from wave-nature to particle-nature, which is what I was referring to as the coherence collapse.

                Significantly, there is still the unresolved issue of when and how the choice between the wave and particle state occurs. Schroedinger's cat has been debated for many years, because there is a fundamental problem: one must make a measurement before a particle (photon, electron, or whatever...) will be found in a particular state (and hence, particular position/momentum).

                This is not solved by some simple little math trick. There is truly (apparent) randomness that emanates from the quantum world in this decision-making process. This (apparent) randomness is not computable (or, at least, nobody has figured out how to compute it).

                That's why effects like quantum tunneling can occur. But more importantly, it is why one can theoretically build a quantum computer that will break any cryptographic code. The system is maintained in a superposition of states (coherence) such that multiple solutions to the problem are examined simultaneously. The particles of the system exist not in any single state, but in multiple states. If the system is set up properly, then the collapse of coherence is biased towards the solution to the problem. By its nature, such a system is non-local - it is coherent.

                If one could just model the above process with simple linear algebra (or FFT's, or whatever...), then one could use it to build a computer that would model that world, and hence break any code. But one cannot. That is because the quantum world has properties that are unique and non-local, and it is only once coherence has collapsed that its properties become localized.

                Last but not least, you say that to localize something, "just measure it." What is a measurement? Please define that for me.

                For example, you mention a phosphor screen to measure electrons. But, an electron that travels towards a phosphor screen can tunnel through it and never interact with it. That is something a classical particle cannot do. Instead, there is some probability, described by Schroedinger's eqn, that the electron will interact with the screen at particular points. The thicker the screen, the more likely the electron will be stopped, and hence "measured".

                But, the problem of measurement is not a trivial one, and if it were as simple as you make it sound, then why have very smart physicists been debating about it since wave/particle duality was first discovered?
                Hi mcgurme. I realized I had been kind of an a**hole, so that remark in the earlier thread was self-conscious. Sorry.

                Please assume that I am familiar with the two-slit experiment (with and without detectors at the slits), tunneling, quantum nonlocality, and objective indeterminism. I think I see where you got the impression that I imagine an electron "exists at a specific location even before measurement". I do not. What I actually said was:
                A particle with an exact location is in an eigenstate of position: a point in real space

                The operative word here is "in". In my view, a "particle" is a classical idea -- a point object with an exact location. In this sense, electrons are not "particles" unless they happen to be in an eigenstate of position. To really split hairs, there is likely no such thing as a true point particle state, both because we lack a physical means to measure position with absolute precision, and because the idea of infinitely divisible space probably doesn't survive quantum mechanics. However, I realize that common usage is to say an electron is a variety of particle (and I often use the term that way myself), so I should have clarified my usage in that passage. What I meant to say is that a point particle is a classical abstraction, and that the closest one comes in quantum mechanics is if something is in an eigenstate of position.

                I know I'm going to get in trouble with you on the whole math vs. reality thing, but it would be most accurate to say that an electron is "something" whose physical state of existence acts exactly like a vector in a complex linear vector space. Its physical attributes correspond to the representation of the state vector in different basis sets -- its projection onto a particular set of axes. The complex linear vector space in question bears no relationship to real space -- the subspace for a particular measurement may have an infinite number of orthogonal dimensions, as is required to represent position, or a finite number of dimensions, as in the case of spin. However, some representations span the same subspace with different basis vectors, and are simultaneously incompatible, such as position and momentum. The Fourier transform is a less abstract and more accessible way to think about this particular incompatibility, and I brought it up for that reason. It is a way of demonstrating what the incompatibility actually means. In the formal mathematical structure of QM, one constructs operators that correspond to particular measurements, and uses those operators to project the state vector onto the basis vectors (representation) of choice -- and if two operators do not commute, then we know their corresponding physical attributes are incompatible, and they will be related by a Heisenberg-like uncertainty principle.

                In that picture, the act of measurement changes the orientation of the state vector to align with one of the basis vectors of the representation for the physical attribute being measured -- it puts the system into an eigenstate of that attribute. The likelihood that a particular outcome for the measurement will be found is proportional to the projection of the original state vector onto the basis vector corresponding to that outcome (the norm of that projection), but the outcome is subject to objective uncertainty. This is the "collapse of the wavefunction" that can take the system out of a superposition state of the attribute being measured and put it in an eigenstate in which that attribute has a definite value. It does not, however, put it into a "classical" state, because in classical physics all attributes of the system can be simultaneously definite, and the system will still be in a superposition with respect to variables that are incompatible with the one being measured.
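                In miniature, this picture can be played out with spin-1/2, the smallest case: two operators that don't commute, and Born-rule probabilities from projecting the state vector onto the measurement basis. A sketch in plain arithmetic (the matrices are the standard Pauli sigma-x and sigma-z):

```python
import math

# Spin-1/2, the smallest instance of the state-vector picture: two
# 2x2 operators that do not commute, and Born-rule probabilities from
# projecting a state onto the measurement basis.

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

sigma_z = [[1, 0], [0, -1]]   # one measurable attribute
sigma_x = [[0, 1], [1, 0]]    # an incompatible attribute

# Non-zero commutator => no simultaneous eigenstates, and a
# Heisenberg-like uncertainty relation between the two attributes:
comm = [[matmul(sigma_x, sigma_z)[i][j] - matmul(sigma_z, sigma_x)[i][j]
         for j in range(2)] for i in range(2)]
print(comm)  # [[0, -2], [2, 0]] -- not the zero matrix

# An eigenstate of sigma_x, written in the sigma_z (up/down) basis:
psi = [1 / math.sqrt(2), 1 / math.sqrt(2)]

# Born rule: P(outcome) = |projection onto that basis vector|^2.
p_up, p_down = abs(psi[0]) ** 2, abs(psi[1]) ** 2
print(round(p_up, 3), round(p_down, 3))  # 0.5 0.5
```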

                The Schroedinger equation to which you make reference is used to find the projection of the state vector onto the representation for position (the wavefunction). It is a common exercise to rewrite the equation in the momentum representation and find solutions using waves as the domain rather than points, in order to teach the idea of representations. As I recall, the Fourier transform connects the two representations.

                I agree with you wholeheartedly that the meaning and underlying mechanics of the collapse of the wavefunction is a matter of ongoing study. I must leave those matters to folks like your brother. However, in the formal structure of QM, collapse of the wavefunction is tied to measurement. And, to be clear, my reference to the Fourier transform was in regard to the incompatibility of position and momentum -- not the mechanics of the measurement itself. I have to treat that as a black box, unless I grow a bigger brain.

                Going back to math as a description of reality vs. math as reality... I am compelled by the historical observation that most of the math used to describe quantum mechanics was developed much earlier in history than the discovery of the quantum phenomena to which they were applied. Also, there are a number of things in physics which seem rather significant to me, which only seem to admit to a mathematical explanation. Take, for instance, the exchange operator and the properties of symmetry and anti-symmetry with respect to its application. The seemingly mathematical constraint that multi-particle wave functions be either symmetric or anti-symmetric with respect to exchange (which is necessary to "get the same answer" when you swap labels) leads to things like covalent bonding and the band structure of solids... hell, the Pauli exclusion principle too, and collective behavior of bosons while I'm at it. I guess my point is that many fundamental things in physics behave like mathematical objects in detail, to the point that the line between math and physics blurs a bit.
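                The exchange-symmetry point can be made concrete with a two-particle toy model: antisymmetrizing a product wavefunction makes it flip sign under particle exchange and vanish identically when both particles occupy the same state -- Pauli exclusion falling out of the symmetry constraint alone. (The sine-mode single-particle states here are illustrative.)

```python
import math

# Antisymmetrized two-particle wavefunction (a 2x2 Slater determinant):
# swapping the particle labels flips the sign, and putting both
# particles in the same single-particle state makes it vanish -- the
# Pauli exclusion principle from a symmetry constraint.

def phi(n, x):
    """Toy single-particle states: sine modes on the unit interval."""
    return math.sin(n * math.pi * x)

def psi_anti(n1, n2, x1, x2):
    return phi(n1, x1) * phi(n2, x2) - phi(n1, x2) * phi(n2, x1)

print(psi_anti(1, 2, 0.3, 0.7))   # some non-zero amplitude
print(psi_anti(1, 2, 0.7, 0.3))   # same magnitude, opposite sign
print(psi_anti(1, 1, 0.3, 0.7))   # exactly 0.0: same state twice
```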
                Last edited by ASH; January 02, 2009, 06:49 PM.



                • #53
                  Re: Second Great Depression?

                  Originally posted by we_are_toast View Post
                  Where else but iTulip can you find people working on a quantum theory of economic Depressions?

                  That's why I like it here.
                  Next time I have a math problem, I am calling ASH...



                  • #54
                    Re: Second Great Depression?

                    Originally posted by LargoWinch View Post
                    Next time I have a math problem, I am calling ASH...
                    Ehrm. I expect there are several others who post on iTulip who would be better bets. Although I'm in mid-disagreement with mcgurme, I expect she is in fact better at math than I, if she graduated with a degree in physics from a good school. I'm okay with the math that I use in my field, but my grasp of what math "means" has always been better than my calculational skills.



                    • #55
                      Re: Second Great Depression?

                      Originally posted by ASH View Post
                      Ehrm. I expect there are several others who post on iTulip who would be better bets. Although I'm in mid-disagreement with mcgurme, I expect she is in fact better at math than I, if she graduated with a degree in physics from a good school. I'm okay with the math that I use in my field, but my grasp of what math "means" has always been better than my calculational skills.
                      ah ASH, stop being so modest! (Mr. Phd Scientist)



                      • #56
                        Re: Second Great Depression?
                        Diagnosing depression

                        Dec 30th 2008
                        From The Economist print edition
                        What is the difference between a recession and a depression?


                        THE word “depression” is popping up more often than at any time in the past 60 years, but what exactly does it mean? The popular rule of thumb for a recession is two consecutive quarters of falling GDP. America’s National Bureau of Economic Research has officially declared a recession based on a more rigorous analysis of a range of economic indicators. But there is no widely accepted definition of depression. So how severe does this current slump have to get before it warrants the “D” word?

                        A search on the internet suggests two principal criteria for distinguishing a depression from a recession: a decline in real GDP that exceeds 10%, or one that lasts more than three years. America’s Great Depression qualifies on both counts, with GDP falling by around 30% between 1929 and 1933. Output also fell by 13% during 1937 and 1938. The Great Depression was America’s deepest economic slump (excluding those related to wars), but at 43 months it was not the longest: that dubious honour goes to the one in 1873-79, which lasted 65 months.
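                        The two rules of thumb can be written down directly; by either criterion, the 1929-33 and 1937-38 slumps both qualify. (A toy classifier only -- as the article notes, there is no official definition.)

```python
# The article's two rules of thumb, written down directly. Purely
# illustrative: there is no widely accepted definition of depression.

def classify_slump(gdp_decline_pct, duration_months):
    """Depression if real GDP falls more than 10% or the slump lasts
    more than three years; otherwise call it a recession."""
    if gdp_decline_pct > 10 or duration_months > 36:
        return "depression"
    return "recession"

print(classify_slump(30, 43))  # 1929-33: depression on both counts
print(classify_slump(13, 13))  # 1937-38: depression (decline > 10%)
print(classify_slump(5, 12))   # a milder, shorter slump: recession
```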


                        The Economist talks about the D word ...

