ENTRUST YOUTH MENTORSHIP CAMP BY STEPHJOY AND UBRICA
The Third Science, Technology, Engineering, Arts and Mathematics Camp – Thursday, December 5, 2019 – Sunday, December 8, 2019
THEME – ENTROPY: MONEY, SCIENCE AND TECHNOLOGY
Background of the Problem
Knowledge is verified or falsified information, processed and interpreted into meaning and significance, wisdom and leadership (Gilder, 2013). In Africa, knowledge is not processed into information that can be converted into money. Knowledge is confined to microstates in isolated systems such as universities and learning institutions (i.e., the ivory tower syndrome). Isolated systems create a state of very low entropy. Low entropy is caused by the lack of a clear channel to transport knowledge from the university to the community (see, e.g., Clausius, 1899; Gödel, 1931; Shannon, 1948). Knowledge in universities and learning institutions should be turned into information that adds value to commercial products, which create wealth.
Power is the ability to endow knowledge with effectiveness, providing reliable low-entropy channels of law, finance and governance, or to suppress or overrule knowledge by coercion (Gilder, 2013). Power in Africa is mindless: it is driven by corruption and political positions. Prosperity in an economy is brought about by the dispersion of knowledge from the microstate and from isolated systems, such as the learning institutions, to the community. Dispersion creates conditions for high entropy (Clausius, 1899). The dispersal of knowledge is then complemented by an equal dispersal of wealth creation and power. Creating a channel for dispersing knowledge achieves the goal of dispersing power to the communities.
In Africa, knowledge is divorced from power. When knowledge is separated from power, knowledge does not turn into information, and therefore it does not turn into money. Thus, the available knowledge and the people in positions of power do not have economic value in the society. To create economic value and wealth, we have to unify knowledge and power through science, technology, engineering, arts and mathematics (STEAM) (Gilder, 2008).
Statement of the Problem
In Africa, people in positions of power do not have knowledge, and people with knowledge do not have power. Knowledge is separated from power, with grave consequences (Gilder, 2013). When knowledge is separated from power, it does not turn into information. Knowledge, therefore, does not turn into money. Knowledge becomes money only when it becomes information. Information brings power when it turns into money.
For knowledge to turn into information we have to create conditions for high entropy. The specific problem is that the process for creating conditions for high entropy of knowledge in Kenya is not clear.
Purpose of the 3rd Entrust STEAM Camp
The purpose of this STEAM Camp 2019 is to gather class 8 and high school students and graduates in Kenya at Stephjoy Girls' High School to teach them how they can use STEAM to unify knowledge and power. The Camp will take place from Thursday, December 5 to Sunday, December 8, 2019.
The camp seeks to answer the following questions:
- How can we use STEAM to turn knowledge into information so that it can become money?
- What is the scientific process for creating high entropy of knowledge that will convert knowledge into information, and information into money?
The camp also has the following objectives:
- Students will learn how to use STEAM to find solutions to vexing problems of daily life.
- The camp will help students select careers that can help them find profound solutions in their professional lives.
- Students and parents will learn the importance of knowledge and information.
- Students and parents will learn how to use science for the betterment of society.
- Students will learn how science, technology and mathematics convert knowledge into information, and information into money, to reconcile knowledge with power.
Once the students experience the practical learning session of the STEAM camp, they will be motivated to grow into creative thinkers. The students will be able to generate powerful ideas for solving real-world problems.
How we do it
We have held two successful camps so far, one in December 2018 and one in April 2019. Parents from all over the country bring their children to the camp for an enlightening experience. We invite professionals from different career fields to come and work with the students. The professionals engage the students in podium presentations during the morning sessions. Afternoons are for practical and interactive sessions, during which we give the students a problem and they sit in groups and work out possible solutions. Evenings include panel discussions between the students and specialists from different professions.
The theme for this camp is Money, Science and Technology. The students will learn how the three work together to create profound value. The camp will use Ubricoin as a technology to illustrate how to unify knowledge and power.
Money as Information: Entropy of Knowledge
All information is surprise; only surprise qualifies as information. Entropy is information in terms of uncertainty: the higher the uncertainty, the higher the entropy in a system, and the higher the amount of information in the system. The system between the university and the village (for example, a university science and technology park and a produce value addition center in a village, the USTP-PVAC system) has very high entropy. Entropy is the amount of uncertainty of knowledge in the system before the USTP sets in. Information is the knowledge that the PVAC gains after the USTP has set in at the PVAC. Information is the change between what we know before the university transmits knowledge to the village and what people in the village know after the knowledge is transmitted to them.
Adam Smith (1776) focused on an economics of order: how a market confronted with change restores a new order, a new equilibrium. Rather than an economics of order, the knowledge economy relies on the mathematics of randomness, a rigorous mandate for freedom of choice. When we strip away all the baggage, we arrive at the concept of probability. All we need is genuine randomness, so that there is uncertainty about the outcome. The key concept of information is probability: a genuine randomness that cannot be explained by anything more fundamental and is not caused by anything. For a given set of largely observable properties of knowledge, every possible configuration of elements of knowledge that could give these properties is equally likely.
Shannon defined information in terms of binary digits (bits) and measured information by the concept of information entropy: unexpected or surprising bits. Information entropy is governed by a logarithmic equation that derives its power from Rudolf Clausius's (1865) second law of thermodynamics, which states that entropy tends to a maximum. Shannon's entropy in a system is maximum when all bits in the system are equally probable and the message cannot be further compressed without loss of information.
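As an illustrative sketch (not part of the camp materials), Shannon's entropy of a set of outcome probabilities can be computed in a few lines of Python. It shows the point above: entropy, and hence information, is maximal when all outcomes are equally probable, and zero when the outcome is certain.

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero probabilities."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin: two equally probable outcomes, maximum uncertainty.
print(shannon_entropy([0.5, 0.5]))   # 1.0 bit
# A certain outcome carries no surprise, hence zero entropy.
print(shannon_entropy([1.0, 0.0]))   # 0.0 bits
# A biased coin falls in between.
print(shannon_entropy([0.9, 0.1]))
```

The fair coin yields exactly one bit of entropy; any bias toward a predictable outcome reduces it.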
Only an economics of disorder, disequilibrium and surprise could explain and measure the contribution of entrepreneurs (Gilder, 2013). For a given macrostate of knowledge, all microstates consistent with its elements-of-knowledge properties are equally likely. For some macrostates of knowledge, there are many different microstates of elements of knowledge that lead to roughly the same properties, while other macrostates can be produced by only a few microstates. If we leave the system alone, it will eventually try all possible microstates. All arrangements of elements of knowledge will eventually happen spontaneously.
Instead of elements of knowledge being distributed by position space, a macrostate of knowledge is really defined by how elements of knowledge are distributed through the phase space. The average distribution of an element of knowledge in a single individual in a phase space defines the knowledge properties of the system. If we leave a system long enough, its elements of knowledge will find their way and combine in all different possible forms. The vast majority of possible distributions of knowledge elements leave the system in a state in which knowledge is maximally spread out.
In knowledge entropy, the only special arrangements of elements of knowledge that change entropy are the ones that change knowledge properties. The macrostate that defines knowledge equilibrium is, by definition, the one with the most microstates of elements of knowledge, the maximum entropy, assuming we do not force the system from outside, say by government interference. In a system with a clear channel for the flow of knowledge, the probability of all elements of knowledge gathering in the same corner, say in a university, is very low. Entropy is thus a measure of the freedom of choice of where knowledge will go. When elements of knowledge do gather in the same corner, the result is an ivory tower, where there is no freedom of choice of what to do with the knowledge.
Entropy is a measure of the number of positions in which a state can occur. In this case, we define the position of a state as an element of knowledge carried by a particular individual. In the example below (Figure 1 a and b), one sphere contains five elements of knowledge and the other contains zero elements of knowledge. Let us consider the spheres on the left side to represent elements of knowledge in the university and the spheres on the right to represent knowledge in the community (say, in the village). In panel (a), one sphere contains 5 units of elements of similar knowledge, while the next sphere contains zero. There is immense knowledge in the university and negligible knowledge in the village. A state where all elements of knowledge are in the same sphere has the lowest entropy, because there is only one way the elements can combine. In this state, knowledge is said to be in the ivory tower.
In panels (c) and (d), there are five different ways that elements of knowledge in the university can combine with elements of knowledge in the community. In panels (e) and (f), there are many different ways the elements can combine. When there are many ways that elements can combine, the system has high entropy. The more evenly the elements are distributed between the spheres, the higher the entropy. The more evenly knowledge is distributed between the university and the community, the greater the entropy. Since entropy always increases, elements of knowledge will spread out evenly between the university and the community if we wait long enough. Once evenly spread out, the elements of knowledge will never gather again in one sphere no matter how long we wait, because entropy always increases.
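The counting argument in the figure can be sketched numerically, under the simplifying assumption that the five elements of knowledge are distinguishable and each sits either in the university or in the village. The number of microstates for each split is then a binomial coefficient, and the log of that count behaves like an entropy: it is zero when everything sits in one sphere and largest for the even split.

```python
import math

def microstates(total, in_university):
    """Number of ways `total` distinguishable elements of knowledge can be
    split so that `in_university` stay in the university and the rest reach
    the village: the binomial coefficient C(total, in_university)."""
    return math.comb(total, in_university)

total = 5
for k in range(total + 1):
    w = microstates(total, k)
    s = math.log2(w)  # Boltzmann-style entropy in bits: S = log2(W)
    print(f"{k} in university, {total - k} in village: {w} microstates, {s:.2f} bits")
```

With all five elements in the university there is exactly one microstate (entropy zero, the ivory tower); the near-even splits of 2 and 3 each allow 10 microstates, the maximum for five elements.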
The behavior of elements of knowledge is inherently probabilistic. Entropy applies to the area occupied by elements of knowledge in a nation. The concept of entropy does not apply to specific combinations of elements of knowledge; rather, entropy applies to a measurement, say, a measurement of the area to which elements of knowledge are confined. If the elements are confined to a small area, say a university, the system has less entropy than if they are confined to a larger area, say an entire county, a nation, or the entire world through the world wide web. This is because more combinations are possible in a larger area than in a smaller area (see Figure 1 (e) and (f)). The internet is a very large area that allows very many combinations of elements of knowledge.
Each individual combination of elements of knowledge in a large area is as likely as each individual combination of elements of knowledge in the small area. However, if we measure the area across which all elements of knowledge are spread out, then the greater the area, the greater the number of possible combination positions of elements of knowledge, because there are more combination positions available. The internet tremendously increases the number of combination positions of elements of knowledge from a particular university. We call these widespread combination positions macrostates of knowledge.
For each possible macrostate, there are many possible combinations of positions and velocities of knowledge (microstates). The number of possible microstates is much larger for a macrostate with large area such as a county, or a nation. The internet is the largest macrostate of elements of knowledge. To maximize the entropy of elements of knowledge in a university all students and teaching staff must learn to work effectively on the internet. They must post their final published works on the world wide web.
If the number of possible microstates is larger, we say that the particular macrostate has high entropy. Since the number of combinations of elements of knowledge is very large on the internet, we can say that the internet has very high entropy. While all microstates are equally likely, not all macrostates are equally likely. All individuals in a university are equally likely to produce innovations, but not all innovations are likely to become commercializable products in the real world.
UBRICOIN CHANNEL TRANSMITTING KNOWLEDGE OVER TIME AND SPACE
Because Shannon was remorselessly rigorous and restrained, his theory can be brought to bear on almost anything transmitted over time and space. Ubricoin (UBN), as a transmitter of knowledge, has a large capacity for works of innovation that include design, engineering, manufacturing, marketing and distribution in a PVAC. Innovation is a complex endeavor, dense with information at every stage. While the managing director of a PVAC will control information at many of these stages, ultimate success will depend on the existence of UBN working as a channel through which the innovation can be consummated. The UBN channel for knowledge is vital to the achievements of PVACs.
PVACs will need a stable channel that allows ideas, conceived by the combination of elements of knowledge at one point in time and space, to arrive at another point in time and space. Essential to UBN as a channel is the existence of a Smithian (1776) order, such that the essential features of the economic system in place at the beginning of the value-adding processes at the PVAC are still there at the end. This is where a stable government policy comes in. Essential Smithian features include free trade, reasonable regulations, a sound currency (cryptocurrency), modest taxation and reliable protection of property rights. Thus the success of PVACs as places for combining elements of knowledge delivered through the UBN channel will need an environment that does not drastically change in these critical elements. Technology developed at PVACs can change radically, but the basic elements of the environment in which the UBN channel operates cannot change drastically if free entrepreneurship is to occur. As such, transmission of high-entropy, surprising elements of knowledge requires a low-entropy, unsurprising channel largely free of interference, such as a cryptocurrency.
CHANNEL CAPACITY OF UBN
In a microstate, we have no control over the position and velocity of elements of knowledge. But we can control certain properties of the channel of knowledge, such as the bandwidth, the quality of the signal, and the noise. Shannon formulated his channel capacity theorem thus:
C = B log2 (1 + S/N),

where B is the bandwidth of the channel, S is the received signal power, and N is the Gaussian noise power. The S/N ratio means that if the received signal has low power, the channel capacity is low. If the Gaussian noise is very high, the S/N ratio is extremely low, approaching zero. Very high noise in the channel drives the capacity of the channel C toward zero.
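The behavior described above can be checked with a short sketch of the Shannon capacity formula. The numbers are illustrative only, not measurements of any real channel.

```python
import math

def channel_capacity(bandwidth_hz, signal_power, noise_power):
    """Shannon channel capacity in bits per second: C = B * log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1 + signal_power / noise_power)

# A clean channel: signal 100 times stronger than the noise.
print(channel_capacity(1000, 100, 1))      # ≈ 6658 bits/s
# The same channel drowned in noise: capacity collapses toward zero.
print(channel_capacity(1000, 100, 10000))  # ≈ 14.4 bits/s
```

Note that raising the noise a hundredfold does not merely halve the capacity; because of the logarithm, it crushes it by orders of magnitude, which is the sense in which heavy interference drives C toward zero.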
Bandwidth is the apparent physical carrying capacity for elements of knowledge carried in each individual. Peer-to-peer, boundaryless cryptocurrencies have a very high, unrestricted bandwidth for passing an unlimited number of elements of knowledge from the university to the community. Cryptocurrency has a bandwidth billions of times greater than what is possible with the paper money of fiat currency.
Noise is unwanted change in the channel. An ideal noiseless channel is perfectly linear. A cryptocurrency such as UBN is an ideal channel: what goes into the channel is what comes out. Such a channel is changeless. The message in the channel communicates the changes, and the message of change can be distinguished from the unchanging parameters of the channel.
Noise that reduces channel capacity comes from many sources, including acts of God (e.g., earthquakes, floods, volcanic eruptions and the like). An overzealous government is the greatest source of noise and interference in a channel. The government interferes with the channel in two ways:
- when the government neglects its role as guardian of the channel by failing to create stable political and economic environments, and
- when the government tries to help by becoming the transmitter, turning up the power of certain favored signals, and fills the channel with unpredictable political interference that depresses the sacrificial long-term investment of capital.
The interest rate is the opportunity cost of investment. When the government manipulates the interest rates of fiat currency, it sends false signals about investment, breeding confusion that undermines entrepreneurial activity. The economic friction of fiat currency is a great source of noise that reduces the channel capacity for transmission of university knowledge to communities toward zero.
The camp will pose the following discussion questions:
- Using the concept of entropy, what can we do to increase the outflow of knowledge from learning institutions (high schools, colleges and universities)?
- Considering Shannon's (1948) channel capacity equation, C = B log2 (1 + S/N), what are the characteristics of a channel that can move large quantities of knowledge from learning institutions to the villages?
- Which would provide the better channel, fiat currency or cryptocurrency (Buterin, 2012; Nakamoto, 2008)? Why? (Give 10 reasons.)
References
Buterin, V. (2012). A next generation smart contract and decentralized application platform. Retrieved from https://whitepaperdatabase.com/ethereum-eth-whitepaper/
Clausius, R. (1899). Entropy: Second law of thermodynamics. New York; Harper & Brothers Publishers.
Gilder, G. (2018). Life after Google: The fall of big data and the rise of the blockchain economy. Washington, DC; Regnery Publishing.
Gilder, G. (2013). Knowledge and power: The information theory of capitalism and how it is revolutionizing our world. Washington, DC; Regnery Publishing.
Gilder, G. (1981). Wealth and poverty. A new edition for the twenty-first century. Washington, DC; Regnery Publishing.
Gödel, K. (1931). On formally undecidable propositions of Principia Mathematica and related systems. New York; Dover Publications.
Nakamoto, S. (2008). Bitcoin: A peer-to-peer electronic cash system. Retrieved from https://bitcoin.org/bitcoin.pdf
Shannon, C. (1948). A mathematical theory of communication. The Bell System Technical Journal, 27(3), 379-423, doi: 10.1002/j.1538-7305.1948.tb01338.x.
Smith, A. (1776). An inquiry into the nature and causes of the wealth of nations. 1 (1 ed.). London: W. Strahan. Retrieved from https://oll.libertyfund.org/titles/smith-an-inquiry-into-the-nature-and-causes-of-the-wealth-of-nations-cannan-ed-vol-1
Turing, A. (1938). Systems of logic based on ordinals (PhD thesis). Princeton University. doi:10.1112/plms/s2-45.1.161.