Automation, capital, labor, deflation, QE

September 9th, 2014

Automation can be viewed as an increase in productivity; fewer people do more by leveraging machines and software. In 1776 Adam Smith addressed the issue of increased productivity by explaining the natural market forces that result. In an environment of increasing productivity, capitalists (those who invest capital in business, in Smith’s terminology) see increases in profit while the demand for labor is simultaneously reduced. In immature (unsaturated) markets, labor demand can remain constant because the increased productivity instead leads to quicker saturation, but the capitalist still profits. As long as the majority of capital is invested in mature markets, there is an overall reduction in labor demand. Only if the majority of capital is spent on finding and exploiting new markets (meeting unmet needs or wants) can labor demand be sustained. If labor demand is not sustained or grown, the value of labor falls because the supply outstrips the demand (the exception, of course, being major violent events that significantly reduce world population).

When the value of labor falls, wages stagnate or fall. This tightens the budget of the families that make up the labor force, who are the vast majority of the citizens of a country. This, in turn, reduces the demand for nonessential goods and services, causing a “contraction” in the economy. But keep in mind that the capitalists are still seeing increased profit per worker. The reduced demand does do some harm to the capitalists invested in nonessentials.

One thing that is very different now is that Smith’s economics was based on metal money. While some metal money enthusiasts will refer to such currency as “sound money”, Smith saw it differently. He studied history and saw that throughout time the weight and composition of various coinage was changed without changing the denomination. The metal content was always decreasing, since reducing it meant the issuer could issue more currency with less metal or cheaper metal. Why would the issuer want to issue more currency? Because innovation has increased productivity, which has contracted the labor market and thus the economy as a whole. Recall what I said above: if the majority of capital is spent on finding and exploiting new markets, labor demand can be sustained. By issuing new currency and choosing the right investments, the issuers hope to get things moving again. These devaluations of the currency were the analog of what we today call QE. But QE is a different beast and we got here in stages, so let me start with metal coins.

Metal coin based money was “semi-sound” in that there was a lower limit on how flimsy you could make coins. The more you dilute the gold, silver, or copper coin, the more obvious it is. If you take 50 pennies minted before 1982 and 50 minted after, the ones made after will be far more likely to be heavily worn, even if they are only a few years old (95% copper, 5% zinc versus 97.5% zinc, 2.5% copper). In the era of metal coins, gold, silver, and copper were the currencies of international trade. So while increasing the money supply got the local economy moving, it simultaneously hurt the local economy’s purchasing power on the world stage due to the reduced base metal content in labor’s wages and capitalists’ profits. This basic process is still occurring, but money itself is now quite different.

First, metal coins were replaced by paper bills. The paper was a certificate of deposit for metal coins and bars, meaning you could take it to the bank or your country’s treasury and exchange it for gold, silver, or copper. Paper certificates also came to be used for international trade. There have been several currencies that became the standard for international trade, and each started off as a certificate of deposit for actual metal (usually gold). I won’t dwell on the earlier ones because I honestly haven’t studied them thoroughly, but I do know that each one prior to the US dollar has at some point been unable to deliver on all the promises made. So, again, we have dilution of the currency, but there is still the ability for foreign nations to request their metal, which acts as a lower limit on how flimsy the promises can be.

That’s it for now. Next episode, I’ll dig into the establishment of the USD on the world stage via the Federal Reserve Bank, the Great Depression, and the Bretton Woods system. Then we’ll go over the electronification of money, the 1971 crisis, and the paradigm shift, perpetuated so far, of allowing electronic balances to substitute for metals in international trade even without the ability to redeem them for gold. This separates the money system of the world completely from the base metals that have traditionally served this purpose. Will this shift stick, or will the world revert to metal at some point? Or will we find a middle ground like bitcoin, which is truly “sound money” in that it has a hard limit that would be extremely difficult to change, yet can be moved around the world dramatically faster and cheaper than the existing electronic money system can manage?

Where’s my food from?

June 15th, 2011

I want to know. For unprocessed foods (raw meat and produce), the label or sticker should have the name and address of the farm it was produced on. I would pay a little more for a product if I knew I could get in the car and go see the farm it came from. Would you?

Calculus Fallibility Theorem

May 8th, 2011

Calculus fails on discontinuous functions. Discontinuities exist in our measurements of reality. Therefore calculus is fallible when applied to reality. QED.

Thankfully, discontinuities aren’t all that common when you consider all the shit that does work according to our calculations.
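If you want to see the failure in action, here is a minimal sketch; the unit step and the step size h are purely illustrative:

# A centered-difference estimate of the derivative behaves as expected on
# the smooth parts of a function, but diverges at a jump discontinuity.
def f(x):
    return 0.0 if x < 1.0 else 1.0  # unit step at x = 1

def deriv(f, x, h=1e-6):
    return (f(x + h) - f(x - h)) / (2 * h)

print(deriv(f, 0.5))  # 0.0, matching the true derivative of the flat part
print(deriv(f, 1.0))  # ~500000.0, and it only grows as h shrinks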

CPU Frequency Scaling

November 19th, 2010

I compiled my own kernel for the first time since high school last night. And it felt good. And now, I have CPU Frequency Scaling working. I probably could have made it work before, because it looked like everything was built as modules, but I never got around to figuring out what I needed to load. So, I just built in what I knew I needed. The kernel compile took 3:18 (three minutes, eighteen seconds) on the stock kernel, and running the build again using the new kernel was slightly slower at 3:48. If I get time, I’d like to see if I can get that down… although I would need to save the specific configuration to have a meaningful comparison. However, as long as the build time is going down I should be headed the right way, because if the speedup comes from fewer things being compiled in, the kernel will be leaner. A leaner kernel leaves more precious CPU cache for the processes. Of course, modules break this theory in that removing the compilation of unneeded modules speeds up the compile without making the running kernel any leaner at all.

Anyway, back to frequency scaling. My CPU supports 4 frequencies: 800 MHz, 2.2 GHz, 2.7 GHz, and 3.4 GHz. And each core can be set independently. Sweet. It turns out that when all 4 cores are at 800 MHz, I get an audible buzz that is rather annoying. Shame on you, AMD. Setting any one of the cores to the next higher frequency eliminates the problem, but I want to use the ondemand governor. What ondemand does is increase the frequency when the CPU load is high. I think it can even jump straight from 800 MHz to 3.4 GHz if it detects the load rising quickly enough. Well, it turns out that I can set a minimum frequency per core. So, by running

sudo su -c 'echo 2200000 > /sys/devices/system/cpu/cpu3/cpufreq/scaling_min_freq'

I set the minimum frequency to 2.2 GHz on the fourth core. So now I’ve got 3 at 800 MHz and one at 2.2 GHz. No whiny processors, and much power savings over 4 cores blasting noops at 3.4*10^9 times per second. Not the ideal of replacing 4.25 noops with 1 (every core dropping from 3.4 GHz to 800 MHz), thanks to AMD’s failure to thoroughly test the acoustic consequences of their design, but we are still reducing 2.96 noops to 1 at idle.
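In case it saves someone the typing, here is a rough Python sketch of the same setup. It assumes the sysfs layout shown above and a 4-core machine, has to run as root, and the core number and 2200000 kHz value are just the ones from my box:

# Put every core on the ondemand governor, then pin cpu3's minimum
# frequency to 2.2 GHz (sysfs values are in kHz) so the cores never all
# idle together at the whiny 800 MHz floor.
CPUFREQ = "/sys/devices/system/cpu/cpu%d/cpufreq/%s"

def write_attr(core, attr, value):
    with open(CPUFREQ % (core, attr), "w") as f:
        f.write(value + "\n")

for core in range(4):
    write_attr(core, "scaling_governor", "ondemand")

write_attr(3, "scaling_min_freq", "2200000")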

Circular Logic

November 16th, 2010

Ask a logician about circular logic and they will tell you it is a fallacy. But, to me, this is an oversimplification. It holds as a fallacy when used in isolation, but when there are feedback loops, circular logic (or circular reasoning, as Wikipedia calls it) can indeed be used to produce matter from the illusions it creates. Of course, the logicians are right in that circular logic is illusory, but that does not make it useless. Math already has imaginary numbers; why not formalize imaginary logical structures?

Society has been running off half-baked theories that are consistently revised ever since we started chasing our crazy half-baked ideas, and it has clearly led to great advances. But, knowing that today’s brilliant idea will be tomorrow’s useful, but discarded, broken theory, shouldn’t we attempt to understand the phenomenon? There are small errors in everything, and engineers design by simply allowing enough tolerance. But knowing how much tolerance is needed is considered an art, and many would argue it can’t really be a hard science.

If you look at Gödel’s work through the glasses of illusions-as-false, they have a point. Any theory consistent with existing theories will not “complete” (finish, as in “we’ve solved it, there are no more theories”) all the theories, as long as the set of theories is past a certain level of complexity. And that level of complexity exists everywhere. Our best hope is to predict as much as we can, and if a hurricane hits or lightning strikes, all bets are off. Butterflies are flapping their wings all the time.

Sure, what we know works 99.999% of the time, but that still leaves over five minutes a year of failure. Luckily, there are a lot of things with even higher reliability, but machines breaking and needing repair, or a reboot, is common. Even 99.99999…% wears out pretty quickly when a processor is operating at 3 GHz (3,000,000,000 operations per second), or 94,670,208,000,000,000 operations in a year. That’s right, 94 quadrillion times. To get that down to about 0.26 failures per day you need 99.9999999999999% reliability. I think you get the point.

However, when Gödel’s work is looked at with consideration of the power of shared illusions, and his proof that there will always be a true statement that cannot be proved, one can see a justification for accepting the premise of circular logic in a limited capacity. Much theory is fairly secure in predicting results, but theories are just that: theories. Sure, they all break down at some point, but when the illusion fails, the products of the illusion still remain.
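For anyone who wants to check the arithmetic, a quick sketch using the same 3 GHz clock and a 365.24-day year:

# Back-of-the-envelope reliability numbers for a 3 GHz processor.
CLOCK_HZ = 3e9
SECONDS_PER_YEAR = 365.24 * 24 * 3600

ops_per_year = CLOCK_HZ * SECONDS_PER_YEAR
print("operations per year: %d" % ops_per_year)  # 94,670,208,000,000,000

# Something that works 99.999% of the time still fails for this long each year:
print("downtime: %.1f minutes per year" % ((1 - 0.99999) * SECONDS_PER_YEAR / 60))

# Per-operation reliability of 99.9999999999999% is a failure rate of 1e-15:
print("failures: %.2f per day" % (ops_per_year * 1e-15 / 365.24))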