Monday, September 2, 2013

Core/Operator Design Pattern

define Core:
     A collection of libraries constituting the [modular] components that compose the software/hardware abstraction layer. The number N of such libraries is "very large".

define Operator:
     A user-agent, real or simulated, that maximizes symbolic links and symmetry at the operating-system level through the use of software development kits and formal logic.

0. (Core ^ Operator) is a unity
1. A unity defines new work closures in software life-cycles
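
As a thought experiment, here is a minimal Python sketch of the pattern. Everything in it is an assumption layered on the definitions above: the Core is modeled as a registry of library paths, and the Operator "maximizes symbolic links and symmetry" by linking each library into one flat namespace (the directory name is hypothetical).

import os

class Core:
    """A [modular] collection of libraries; N may be very large."""
    def __init__(self, library_paths):
        self.libraries = list(library_paths)

class Operator:
    """A user-agent that maximizes symbolic links at the OS level."""
    def __init__(self, core, namespace="/tmp/unity"):
        self.core = core
        self.namespace = namespace  # hypothetical target directory

    def unify(self):
        # One symlink per library: the (Core ^ Operator) unity.
        os.makedirs(self.namespace, exist_ok=True)
        for lib in self.core.libraries:
            link = os.path.join(self.namespace, os.path.basename(lib))
            if not os.path.lexists(link):
                os.symlink(lib, link)

# Usage: Operator(Core(["/usr/lib/libm.so", "/usr/lib/libc.so"])).unify()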
   

Information and No Original Research

The nature of information, whether on the Internet or in the "real world", has been well characterized as "increasing without limit", denoted by our favourite function: the exponential. With the saturation of information versus the lack of interested academics, the only way out of this slump of having no original research is to generate new questions from preexisting data sets. There is a running joke in the programmers' community: "grids, grids and more grids". Indeed, the mathematical structures of the grid and the matrix are timeless. Euclid and the pre-Socratics worked out very interesting properties in maths using these kinds of structures.

Rather than coming up with new programming languages and frameworks, give your Data Structures and Algorithms textbook some love and read the first three chapters or so. You'll find that everything you need to solve interesting statistical problems comes pre-installed with your GNU/Linux minimal build, or else with your Windows/Apple machine.
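
For instance, here is a sketch that uses nothing beyond the Python standard library, which ships with most GNU/Linux builds: a grid as a plain list of lists, with column means worked out by hand. The data set is made up purely for illustration.

# A grid/matrix as a plain list of lists -- chapter-one material.
grid = [
    [2.0, 4.0, 6.0],
    [1.0, 3.0, 5.0],
    [4.0, 8.0, 12.0],
]

def column_means(rows):
    """Mean of each column, computed with no third-party libraries."""
    cols = list(zip(*rows))  # transpose: columns become rows
    return [sum(col) / len(col) for col in cols]

print(column_means(grid))  # [2.333..., 5.0, 7.666...]

From there, generating "new questions from preexisting data sets" is a matter of swapping in your own rows.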

Time Distortion and Learning in Virtual Realities

In the film The Matrix, Morpheus and the other red-pillers tell us how being plugged in longer than others seems to change our sense of reality, our personalities and our perceptions. Indeed, if you were to ask the regular net user from the inception of IRC, and those who have successfully run a business on the Internet, they might tell you "it's all about the timing". In a place where mathematical logic rules and propositions inherit their futures from the study of vector fields, we have to ask ourselves at what point imagination and reality begin to break down. At the risk of sounding like a hipster, the plugging-in process seems to change the time scale at which our minds process and learn new information, and the rate at which they acquire knowledge.

Just as the "slumdogs" in India managed to learn biotechnology in two weeks with one computer connected to the Internet, we now see a boundary re-emerging where humans meld with the neo-agora and decentralized economics: the boundary of choice. Perhaps this is what Captain Picard meant when discussing how economies worked in his future:

We... handle our money systems differently here.

Hashes per second versus Cryptocoin Output

If you look at a typical mining pool for cryptocurrencies, you will notice that most of them contain a "Hall of Fame" section. Upon closer inspection, you may notice that one user clocks in at some absurd figure like 140954085 mhash/second and earns x of some cryptocoin (let's say 100 BTC). One row down, another user who clocks in at a mere 43948 mhash/second is earning 10 or 20 BTC more than the first. Statistically, this suggests there is little to no correlation between raw hash rate and payout.
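
To make that claim testable rather than anecdotal, here is a sketch that computes the Pearson correlation between hash rate and payout across a hall-of-fame table. The rows below are hypothetical stand-ins, not real pool data.

import math

# Hypothetical (mhash/s, BTC payout) pairs from a pool's hall of fame.
rows = [(140954085, 100), (43948, 120), (9000000, 40), (750000, 95), (62000000, 30)]

def pearson(pairs):
    """Pearson correlation coefficient, standard library only."""
    n = len(pairs)
    xs, ys = [p[0] for p in pairs], [p[1] for p in pairs]
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in pairs)
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

print(pearson(rows))  # a value near zero would back the "no correlation" claim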

The race to build the "best miners" seems to have been a rat race rather than a race to the finish line. Save your cash and read the documentation. The transmission protocols used in cryptocurrencies are based on the same peer-to-peer ideas as your favourite BitTorrent client and rely on open connectivity to the network. Hash computations aren't going to solve blocks any quicker when the original algorithm by Satoshi was meant to be a self-evolving blockchain: the difficulty retargets itself as the network's total hash power grows, so a faster rig only buys a temporary edge.
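
As a concrete illustration of that self-adjustment, here is a simplified sketch of Bitcoin's difficulty retarget arithmetic: every 2016 blocks, the difficulty is rescaled by the ratio of expected to observed time, with the observed timespan clamped to a factor of four. The sample timespan in the usage line is made up.

TARGET_TIMESPAN = 14 * 24 * 60 * 60  # two weeks, in seconds
RETARGET_INTERVAL = 2016             # blocks between adjustments

def retarget(old_difficulty, actual_timespan):
    """New difficulty after 2016 blocks, per Bitcoin's retarget rule (simplified)."""
    # Clamp the observed timespan to [1/4, 4] of the expected two weeks.
    actual_timespan = max(TARGET_TIMESPAN // 4, min(actual_timespan, TARGET_TIMESPAN * 4))
    # Blocks found too fast -> difficulty rises; found too slow -> it falls.
    return old_difficulty * TARGET_TIMESPAN / actual_timespan

# Hypothetical: the last 2016 blocks took only one week, so difficulty doubles.
print(retarget(1000.0, 7 * 24 * 60 * 60))  # 2000.0

However many monster rigs join the network, the rule above pulls block times back toward ten minutes, which is why buying ever-faster hardware is a rat race.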