26 December 2016

Robert Steele: Augmented Intelligence with Human-Machine Integrity – Future-Oriented Hybrid Governance Integrating Holistic Analytics, True Cost Economics, and Open Source Everything Engineering (OSEE)

Robert David Steele

Future-Oriented Hybrid Governance Integrating Holistic Analytics, True Cost Economics, and Open Source Everything Engineering (OSEE)

Synopsis

The gatekeeper companies in the Data and Information Technology (IT) industries, both old and new, have betrayed the public trust at multiple levels – good people, no doubt, but trapped in bad systems. Their “Balkanization” of software, hardware, data, and spectrum has handicapped humanity; despite some stellar offerings in a micro sense, at the macro level they have imposed an opportunity cost of 90% on innovation, integrity, and intelligence. Open Source Everything Engineering (OSEE), when combined with holistic analytics and True Cost Economics (TCE) and placed squarely upon a foundation of human ethics and human thinking, can achieve the United Nations (UN) Sustainable Development Goals (SDG) at 10% of the cost of the old predatory proprietary paradigm, in half the time or less. This is a major augmentation of C. K. Prahalad’s vision of “the fortune at the bottom of the pyramid” (Prahalad), and the only means by which we can achieve “Inclusive Capitalism” (Rothschild, 2014).

Introduction

Stepping back from the many unsubstantiated claims of the Artificial Intelligence (AI) community, I humbly suggest that Intelligence Augmentation (IA) is making many of the same mistakes as the over-sold and under-performing computers “über alles” mindset that currently dominates thinking within the Singularity/AI domain, and that is now creeping into the IA world as well.

The potential of machine intelligence in and of itself is not only severely over-stated; it also misses the fundamentally important role that humans have played in the past and will play in the future with regard to technology. Considering that we collect less than 1% of the information available today and process less than 1% of what we collect, it is obvious that the infinite potential of human intelligence and imagination has been denigrated, whether intentionally or not, to the point that the AI value proposition is simply not credible when tested against reality (Steele, 2016a).

Consider the fact that the National Security Agency (NSA) in the United States has spent at least a half trillion dollars on “advanced” computers, including AI but not IA, and yet has shown little or no return on investment to the taxpayer. Illuminating this point, James Bamford, the single most published observer of the NSA, ends his book Body of Secrets with the following (Bamford, 2002):

Eventually NSA may secretly achieve the ultimate in quickness, compatibility, and efficiency – a computer with petaflop and higher speeds shrunk into a container about a liter in size, and powered by only about ten watts of power: the human brain.

This chapter puts forward a concept for achieving a human-centric World Brain in which Applied Collective (Human) Intelligence comprises 80% of the whole, while augmented (machine) intelligence provides no more than 20% (Steele, 2015a, 2014). Centered on the human (and especially the now-marginalized five billion poor at the bottom of the pyramid) and the potential of aggregated, orchestrated human intelligence in every clime and place, this chapter focuses in general terms on the severe shortfalls in the current approach to computing, which fails at every level when evaluated against the needs of human beings and a sustainable Earth. I argue against short-term corporate profit-taking and government control needs, and conclude with an argument for the unlimited wealth that is possible if we get serious about taking Data and IT to the next level.

We can do better. For example, for $500 per person, a one-time cost, I can relocate one million Somalis from UN displacement camps in Ethiopia, Kenya, and Uganda to the uncontested northeast portion of Somalia, which has three things in abundance: dirt, seawater, and sunlight. $500 million will buy them the Global Village Construction Set (GVCS), inclusive of pressed-brick shelters, free energy, unlimited desalinated water, in-home compost sewage, free cellular and Internet services, and an aquaponics industry (fish and plants) free of pesticides (Steele, 2013a).

Unrecognized by most other authors in the IA and AI spaces is the true cost of computing as practiced today; this true cost includes the opportunity cost of failing to empower humans to learn, connect, and decide at every level from local to global, and the tangible cost of an economic model that is 50% waste across the board, drawing down natural capital (including human capital) at an unsustainable rate and toxifying the air, land, and sea toward a definitive “sixth extinction” (Kolbert, 2014).

From Class A carcinogens only legal in China – a central element of “smart phones” that puts hundreds of thousands of Chinese workers into leukemia wards and early graves (Suleman, 2014) – to the opportunity costs of using technology to perpetuate the financial perversion of the economy known as “flash trading” (Lewis, 2015), computers are “costly” at all levels. The problem, in my view, is that computing today simply does the wrong things righter at greater and greater expense, rather than doing the right things to enable true intelligence augmentation. From massifying a retarded industrial-era educational system – Massive Open Online Courses have a completion rate of 4% to 15% (Jordan, 2015; McKendrick, 2013) – to perpetuating a Western development model that glorifies scientific reductionism and a military-police-intelligence-industrial complex, it has become clear that our current society is simply not interested in creating a prosperous world at peace.

The existing corporate media system is corrupt to the bone – truths are repressed and official narratives perpetuated, with money displacing integrity on every topic of public import (Dubose and Bernstein, 2006; Lewis, 2014; Rampton and Stauber, 2003; Risen, 2014). The existing modern Western political system is a mix of faux democracy often bordering on outright fascism – corporate control of political puppets (Amato, 2009; Steele, 2013b). Consider that 40 of 42 dictators (Palmer, 2005) are embraced by Western democracies for the convenience that a semblance of control offers, despite the cost of repressed publics, 25% unemployment (Donovan, 2015; Williams, undated), and the millions of illegal immigrants (Steele, 2002a, 2002b) driven by desperation toward Australia, Europe, and the United States.

This chapter is a manifesto on behalf of authentic human intelligence combined with human integrity that outlines very specific steps for empowering humanity toward human-centric computing so as to create a prosperous world at peace. That is, a world that works for all.

The 80-20 rule applies. Although machines could indeed run amok, I worry far more about Artificial Stupidity (not to be confused with, but very similar to, the Singularity threat). The bottom line is that machines require authentic, comprehensive, persistent data to be effective, and that in turn requires the capacity to process data in near-real-time at exascale speeds across a broad geospatial and temporal spectrum. Neither of these is likely to happen in our lifetime absent an across-the-board commitment to OSEE.

What could happen in our lifetime is the creation of a World Brain in which all humans are linked to all information in all languages all the time, augmented by machine processing and machine tools. That is, an open cloud augmented by open source processing and open source tools, including in-line machine translation that understands slang.

To be blunt, the data and computing industries as now trained, equipped, and organized serve the narrow self-interest of wealthy elites. Major and minor companies are doing interesting things on the margins, but they are failing to leverage the center of gravity for the creation of infinite wealth: the Human Factor, the only source of infinite innovation on Earth.

Fundamentals of Intelligence

Intelligence – whether human or machine – is an isolated, irrelevant capability in the absence of the fundamentals against which it can become applied intelligence. Three of those fundamentals can help us appreciate the potential of both IA and AI going forward: (1) Holistic Analytics, (2) True Cost Economics, and (3) Open Source Everything Engineering (OSEE). I explain each of these concepts in turn below.
Holistic Analytics 

Holistic analytics includes universal data access and a comprehensive analytic model that allows for free and open consideration of all possible causes and effects across all domains. This is not what occurs today from any of the information “tribes” (Table 1) that largely avoid the sharing of information – or insights.

Universal Data Access.

Below I discuss the data paucity that characterizes our analytic pretense today. Suffice it to say here that collecting 1% (formally published) of 1% (written) of 1% (known), and processing only 1% of what we collect in the way of “Big Data,” is pitiful (Arnold, 2014; Meeker, 2014). Data may be the “new oil,” but our entire modern society is posturing over one “oil spot” and failing to recognize the potential value of liberating all data in all languages and mediums – that is, making available to humans “all information in all languages all the time.” Neither governments nor vendors are helpful to this end. From rigged war games to rigged demonstrations, fractional data sets are used to make grandiose claims that are borderline criminal.
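For readers who want to see the arithmetic, the following minimal Python sketch shows how quickly a chain of small fractions compounds. The stage labels are one interpretation of the author's phrasing and the 1% figures are his rough estimates, used here only for illustration.

```python
# Minimal arithmetic sketch of the compounding described above. The 1% figures
# are rough estimates, not measured values; the point is how fast the product
# of a few small fractions shrinks toward zero.

stages = {
    "written down (of what is known)": 0.01,
    "formally published and collected (of what is written)": 0.01,
    "processed as 'Big Data' (of what is collected)": 0.01,
}

fraction = 1.0
for label, share in stages.items():
    fraction *= share
    print(f"{label:55s} cumulative fraction: {fraction:.0e}")
# Final line: 1e-06 of what is knowable, under these assumed rates.
```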

A Comprehensive Analytical Model.

The Earth is a closed system and natural capital is not renewable. Those are facts ignored by all existing analytical models (Linebaugh, 2014). Changes to the Earth that used to take 10,000 years now take three years (Linden, 2006). Indeed, humanity is on the verge of a sixth extinction. A few acknowledge these facts, but they are “shut out” by all governments and vendors eager to use fiction as a bridge to the future, because fiction allows for profit by the few without accountability to the many. A comprehensive analytic model for decision-making would, at a minimum, be effective at four levels of decision-making (strategic, operational, tactical, and technical), and it would provide for the simultaneous, integrated appraisal of the ten threats to humanity, the top twelve core policies from agriculture to water, and the top eight demographics defining the future irrespective of any decision we make. I explain this below.

Here are two simple prescriptions for multi-disciplinary, multi-lingual analytics: (1) demand “teamwork” across all boundaries, and (2) develop a massive new approach to how we collect, process, and analyze data – not just within a single government, but across all eight information “tribes” and across all governments and their eight information “tribes.”

The eight information “tribes” I have been speaking and writing about for the past quarter century are listed below (Table 1):

Table 1

Eight Information “Tribes”

Academic

Civil Society (Labor, Religion)

Commerce (especially Small)

Government (especially Local) 

Law Enforcement

Media (including Bloggers)

Military

Non-Government/Non-Profit 


As I have noted on more than one occasion, there are iron curtains between these tribes, wooden walls between organizations within each tribe, and plastic curtains between individuals within each organization. If we are to achieve peace and prosperity, empowering these “tribes” to share information and make sense on behalf of humanity is a first step.

The dysfunctional nature of our current data universe cannot be over-stated. Add to that the failure of education and the paucity of IA tools for collaborative information-sharing and multi-disciplinary sense-making across time and space, and you have the proverbial Tower of Babel – a very expensive as well as dysfunctional tower.

The graphic below is an idealized depiction of what a “standard” academic organization should be able to do in the aggregate. Instead we have disciplinary stove-pipes with no cross-fertilization to speak of. The current “fad” of multi-disciplinary research and inter-disciplinary teaching is a fraud, in my view – lip service at best. Students need to understand both the data sources and data gaps for each of the disciplines, and the analytical models – their strengths and weaknesses – for each of the disciplines, and how to “do” comprehensive or “holistic” analytics (Figure 1).





Figure 1. Whole Earth Analytic Model. Overview of most of the elements that must be considered to do holistic analytics.

The ten high-level threats (some but not all listed above, all listed below in priority order) are as identified by Lieutenant General Dr. Brent Scowcroft, United States Air Force (Ret), and the other members of the United Nations High-Level Panel on Threats, Challenges, and Change (2004). The twelve core policy domains shown below are as extracted by the Earth Intelligence Network (EIN) team from a review of presidential “Mandate for Change” transition books across four different election cycles. The eight demographics comprise the five most populous countries plus Iran and Venezuela, with an open-ended Wild Cards slot for countries such as Turkey.
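One way to make the structure of such a model concrete is the Python sketch below. It is an illustration only, not the author's software; the threat, policy, and demographic names are placeholders standing in for the lists shown in the figures, and every appraisal is indexed simultaneously by threat, policy domain, and demographic at one of the four decision levels.

```python
# Illustrative sketch of the holistic analytic model described above.
# Category names are placeholders, not the author's actual lists.

from dataclasses import dataclass, field
from typing import Dict, Tuple

LEVELS = ("strategic", "operational", "tactical", "technical")
THREATS = tuple(f"threat_{i:02d}" for i in range(1, 11))       # ten high-level threats
POLICIES = tuple(f"policy_{i:02d}" for i in range(1, 13))      # twelve core policy domains
DEMOGRAPHICS = tuple(f"demographic_{i}" for i in range(1, 9))   # eight demographic drivers

@dataclass
class AnalyticCell:
    """One cell: appraisals of a threat x policy x demographic intersection."""
    assessments: Dict[str, str] = field(default_factory=dict)  # keyed by decision level

# The whole model is a sparse mapping from (threat, policy, demographic) to a cell.
Model = Dict[Tuple[str, str, str], AnalyticCell]

def appraise(model: Model, threat: str, policy: str, demographic: str,
             level: str, judgment: str) -> None:
    """Record a judgment at one decision level for one intersection of the model."""
    assert level in LEVELS
    cell = model.setdefault((threat, policy, demographic), AnalyticCell())
    cell.assessments[level] = judgment

model: Model = {}
appraise(model, "threat_01", "policy_01", "demographic_1",
         "strategic", "example judgment: true-cost trend is unsustainable")
```

The point of the structure is that no judgment can be filed in isolation; every appraisal is explicitly located against a threat, a policy, and a demographic at the same time.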

In this second graphic (Figure 2), note that the US Government (USG) and the US secret intelligence community (IC) focus on the two dots – war and terrorism – to the exclusion of all else. In my view, no one in the USG or US IC is serious about intelligence in the public interest.




Figure 2. Operational Analytic Model. Sample national analytic model for the USA – threats, policies, and demographics together. 
True Cost Economics 

I will itemize just three of the many reasons why I believe that modern Western civilization is collapsing today:

(i) First, scientific reductionism and specialized (stove-pipe) learning have led to multiple generations of PhD/DBA “graduates” who know everything about nothing and nothing about everything (Klavans, 2008).

(ii) Second, in the absence of informed and honest oversight from governments and the public, Western industry has been built on chlorine, oil-based plastics, and corn sugar. All three are toxic beyond the comprehension of most citizens (Thornton, 2001). While “true cost” economics has a niche following, in my own direct experience 99% of all organizations refuse to recognize it as a foundation for their decision-making. The system is “rigged” toward short-term profits favoring the few over the many. Until the 99% “get a grip” on all that can be known, they will continue to be “farm animals” exploited as ignorant consumers.

(iii) Third, government corruption and the deliberate dumbing down of the public have led to an artificial reality in which lies, rather than truth, define public understanding. The USG, for example, lies about unemployment (the real rate is 23%, with 40% characteristic of people of color, single moms, new college graduates, and old guys like me); about inflation – as much as 72% in some easily manipulated product categories (Durden, 2015); and about justice – there is one legal system for the very rich, who can manipulate foreign exchange and interest rates with impunity, and another for the rest of us (Taibbi, 2014), putting more African Americans in prison than there were slaves at the beginning of the Civil War (Alexander, 2012, 2011). 935 now-documented lies led the West into Iraq, Afghanistan, and other countries, at a cost of over 4 trillion dollars and millions of lives lost or displaced (Bilmes and Stiglitz, 2008; Lewis, 2014).

Rather than maintain this trajectory of occlusion and deception, our objective should be to achieve a renaissance of the data and IT industries – and a resurrection of holistic education, intelligence, and research— such that citizens in the aggregate no longer suffer the many atrocities that represent “business as usual” for the West.

At a strategic level, TCE considers both the ecological and social costs of any product, service, policy, or behavior, and the full cradle-to-grave life-cycle costs of every artifact. Landfills, and the toxins emitted from landfills, for example, are a cost not now considered. At the tactical level – and, in the aggregate, at the strategic level – TCE demands that we accurately calculate, across the entire extraction, production, transport, utilization, and disposal cycle, the virtual water consumed, total fuel burned, toxins emitted, child labor employed, regulatory violations committed, and taxes avoided. To be sure, more nuanced studies can detail other factors as well.
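A minimal sketch of this kind of bookkeeping follows, assuming invented placeholder figures and unit names rather than measured data: each cost category is summed across the five life-cycle stages named above to yield one cradle-to-grave total per category.

```python
# Toy true-cost aggregation: sum each cost category across the life-cycle stages.
# All numbers below are invented placeholders for a hypothetical product.

STAGES = ("extraction", "production", "transport", "utilization", "disposal")
CATEGORIES = ("virtual_water_l", "fuel_mj", "toxins_kg",
              "child_labor_hours", "regulatory_violations", "tax_avoided_usd")

def true_cost(stage_costs: dict) -> dict:
    """Aggregate per-stage costs into one cradle-to-grave total per category."""
    totals = {c: 0.0 for c in CATEGORIES}
    for stage in STAGES:
        for category, value in stage_costs.get(stage, {}).items():
            totals[category] += value
    return totals

example = {
    "extraction":  {"virtual_water_l": 900.0, "fuel_mj": 40.0, "toxins_kg": 0.2},
    "production":  {"virtual_water_l": 300.0, "fuel_mj": 25.0, "child_labor_hours": 1.5},
    "transport":   {"fuel_mj": 60.0},
    "utilization": {"fuel_mj": 10.0},
    "disposal":    {"toxins_kg": 0.4, "regulatory_violations": 1},
}

print(true_cost(example))
```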
Open Source Everything Engineering (OSEE) 

In the 1980’s I founded and led the Open Source Intelligence (OSINT) movement at the same time that Richard Stallman and others pioneered the Free/Libre/Open Source Software (FLOSS) movement.

Together we watched the concept expand, first to Open Source Hardware and OpenBTS (Base Transceiver Station) along with Open Spectrum, and then to other areas.

My latest book, The Open Source Everything Manifesto: Transparency, Truth, and Trust (Steele, 2012a), itemizes over sixty “opens”. I have since collaborated with Marcin Jakubowski of Open Source Ecology and Michel Bauwens, founder of the Peer to Peer (P2P) Foundation, to define nine major open source domains (Figure 3), with three sub-sets for each (Peer to Peer Foundation, 2015).



Figure 3. Open Source Everything Engineering (OSEE) Baseline. The “opens” needed to create a free and wealthy global society.

This matters hugely because it has been reliably established that OSEE approaches cost ten percent (10%) of the full life-cycle cost of industrial-era proprietary “solutions,” and are especially helpful in eliminating training, licensing, maintenance, and mandatory spare-parts sourcing from the original vendor (Jakubowski, 2016). OSEE also provides for liberal interchangeability of parts, as pioneered by the GVCS team, and offers sustainability that is perhaps one hundred times (100X) that of proprietary “solutions” designed to fail, and fail often.
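As a hedged back-of-the-envelope illustration of that 10% figure (the unit costs below are placeholders, not audited numbers), a fixed budget simply buys roughly ten times as many full life-cycle deployments:

```python
# Back-of-the-envelope comparison of the ~10% life-cycle cost claim above.
# Unit costs are placeholders for illustration only.

def deployments_per_budget(budget: float, unit_lifecycle_cost: float) -> float:
    """How many full life-cycle deployments a budget buys at a given unit cost."""
    return budget / unit_lifecycle_cost

proprietary_unit_cost = 1_000_000.0            # placeholder life-cycle cost per deployment
osee_unit_cost = 0.10 * proprietary_unit_cost  # the roughly 10% figure cited above

budget = 100_000_000.0
print(deployments_per_budget(budget, proprietary_unit_cost))  # 100.0
print(deployments_per_budget(budget, osee_unit_cost))         # 1000.0
```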

The admirable and necessary seventeen Sustainable Development Goals (SDG) of the United Nations (UN), listed below (Table 2), cannot be achieved by today’s industrial-donor paradigm, in which roughly 20% of the promised donations materialize (Annan, 2005) and only a fraction of the funds – from 1% to 20% of that 20%, or 0.2% to 4% of what was pledged – reach the village level (Slemrod, 2015; Ortel, 2016). The hard reality is that industrial-era “solutions” are too costly, take too long, are not interoperable, and generally collapse within a few years.

IA – and a new mind-set about how we integrate holistic analytics, TCE, and OSEE – is essential if we are to achieve these seventeen goals, which will both restore the viability of Earth as a long-term habitat for all species and help us avoid a sixth extinction that claims humans specifically (Steele, 2016b; United Nations, 2016).

Table 2

UN Sustainable Development Goals

01 End Poverty
02 End Hunger
03 Health
04 Education
05 Gender Equality
06 Water for All
07 Energy for All
08 Inclusive Economy
09 Infrastructure
10 Global Economy
11 Safe Smart Cities
12 Sustainable Economy
13 End Climate Change
14 Save Oceans
15 Save Ecosystems
16 Justice for All
17 Unify Humanity

Where Have We Gone Wrong?

Bankers and technologists, as well as the politicians who serve them, have one major flaw in common: a complete disregard for reality at the grass-roots level.

There are two breaking points for the globalized economy. The first is the Earth and the limitations of the natural capital that we are not only consuming but also destroying (geoengineering and fracking are especially pernicious, even treasonous, endeavors). The second is the Human Factor – at some point, after the 99% has been screwed 99% of the time, no combination of “wedge issues,” no efforts to revive race wars, and no number of lies from the government about unemployment, health, or concocted threats will stop a revolution.

Computers Against the Earth

In the 1970’s, the alleged academic discipline of Political Science (previously known as Current History) took a terrible turn away from ethnographic qualitative field work that required a grasp of foreign culture, foreign history, foreign language, and foreign nuances, and turned instead toward “Comparative Studies.” Comparative Studies in this context is code for never having to learn a foreign language or meet a foreign person or walk a foreign path – it substitutes “data analytics” that can be done from the same air-conditioned cubicle one has always inhabited. Rather than arousing the scorn it merited, this wrong turn was embraced by peers, each recognizing that as long as no one was held accountable for actually understanding anything or producing useful new knowledge, the “bubble” of pretense could see them through a full academic or “think tank” career.

At the national level, the US IC is a manifestation of all that is wrong with excessive reliance on computers over humans. Despite spending over $1.2 trillion (an average of $40 billion a year over 25 years, with $100 billion a year at the high point), the US IC is abjectly incapable of producing “decision-support” for the President, Cabinet officers, and major commanders and ambassadors. Unlimited money borrowed and printed in our name, combined with secrecy that is nothing more than a “get out of jail free” card – dismissive of Congressional oversight, media investigation, and accountability to the public – has enabled the spending of hundreds of billions of dollars across three major technical domains: imagery computing, signals computing, and death by drone. Neglected has been the Human Factor: the fifteen slices of Human Intelligence (HUMINT), inclusive of all that can be known from indigenous sources via Open Source Intelligence (OSINT), are illustrated below in Figure 4 (Steele, 2010).



Figure 4. Full-Spectrum Human Intelligence (HUMINT). All types of human intelligence must be managed and integrated as a whole.

In the aggregate, the loss of integrity across the “tribes” of information – academia, civil society, commerce, government, law enforcement, media, military, and non-government/non-profit organizations – has been radically compounded and perpetuated by a global computing industry focused on sales and public relations instead of long term outcomes and engineering.

Information technology only makes bad management worse. (Paul Strassmann, 1992; Steele, 1994)

The militarization of the US economy compounded the debilities imposed by the financialization of the economy. Government-specification, cost-plus “engineering” displaced competitive and innovative engineering able to do more with less. (Franklin “Chuck” Spinney, 2014, 1985)

My own applicable quote is “Technology is not a substitute for thinking” (Steele, 2006, 2013c, 2013d). The failure of computers is a human failure. As with any artifact, the responsibility for design, for manufacturing, for utilization, and for harmonization within the larger construct of civilization is a human responsibility.

As this chapter concludes, intelligence with integrity is rooted in human intelligence and human integrity. AI and IA are only as good as the humans that develop and utilize these tools. It is those humans – and the weak arguments dismissive of holistic analytics, true cost economics, and OSEE – that are at the root of our failure to properly augment human intelligence with machine intelligence, while establishing and maintaining integrity across the whole rather than within the parts alone.

Computers Against Humanity

Howard Rheingold is the original modern prognosticator on the importance of tools for thought (2000), and Kevin Kelly is the original visionary who understood “hive mind” as well as the potential (but still unrealized) benefits of networked economies that empower individuals over corporations (1995, 1999). Both have been ignored by the computer industry – and especially by the US IC – because of the inherent corruption of the Western governance paradigm that concentrates wealth and power among the 1%.

It is the lack of intelligence with integrity – human intelligence and human integrity – that permits such gross destabilizing distortions to develop. Machines can assist humans in collecting, processing, and analyzing all information in all languages all the time, but ethics is the root human operating system, and machines are not the source of ethical depth and breadth. Ethics is the ultimate operating system: without human ethics, all machine systems lack root integrity (Danalylov and Steele, 2016).

More recently, Micah Sifry (2014) has written a superb book entitled The Big Disconnect: Why the Internet Hasn’t Transformed Politics (Yet). His book is so relevant to this chapter that I offer four brief quotes below:

QUOTE (34): “…has not made participation in decision-making or group coordination substantially easier.”

QUOTE (49): “We can save the body politic, but to do so we must remember that the purpose of democracy isn’t only for each of us to have our say, but to blend individual opinions into common agreements. … We need a real digital public square, not one hosted by Facebook, shaped by Google, and monitored by the National Security Agency.”

QUOTE (159): “Many weak causes do not add up to a stronger movement.”

QUOTE (161): “First, we need to insist on tools and platforms that genuinely empower users to be full citizens. And second, we have to take back our own digital agency.”

Below I provide a relatively comprehensive depiction of the pre-conditions for revolution, most of which exist today in every industrialized country and in most, but not all, other countries. The two greatest pre-conditions for revolution are the concentration of wealth and the loss of legitimacy of the government, in the larger context of a loss of balance across all the domains of any given society. For a revolution to occur, a precipitant is needed (e.g. a Tunisian fruit seller) and, as Chris Hedges (2016) has pointed out in his most recent book, the military and police must abandon the 1% and align themselves with the protesting public. Below (Figure 5) is my original view of the pre-conditions of revolution (Steele, 2011).



Figure 5. Pre-Conditions of Revolution. Humans, not machines, revolt when oppressed – these are the reasons why they revolt.

What Is To Be Done?

There are some very specific capabilities that we need to build to manage the transition to effective human-machine integration, capabilities that the major industrial enterprises – old as well as new – have thus far refused to address. I will use the traditional intelligence cycle – collection, processing, analysis, dissemination – to present these, first at the strategic level and then at the end-user level.

Strategic Initiatives

Collection. Every discussion of human and machine intelligence must begin with recognition of the fact that we are collecting less than 1% of the relevant information in all languages and mediums. I cannot overstate this. Of all the scientific papers written, only 1% are published. All of those scientific papers – written and mostly (99%) not published – in turn represent a tiny fraction, perhaps 1%, of all of the other publications, from non-profit think pieces to graduate student research papers to government studies to citizen-advocate white papers and so on. Beyond that is all the knowledge that has not been published at all – knowledge resident in the minds of indigenous human beings and transient experts. My conclusion: the “formal” world of “published” information is at best 1% of 1% of what we know.

We need to achieve, in increments, 20%, then 40%, then 60%, then 80%, and eventually 100% of analog and digital information in all languages and mediums, and 100% opt-in participation of all humans in all locations as “on call” sensors and “on call” thinkers. This requires, among other things, a universal open sparse matrix on top of a 1:5,000 geospatial plot, and not just Open Access and Open Document but the Open Hyperdocument System (OHS) as conceptualized by Doug Engelbart (1994).

We also need a World Brain Institute sponsoring World-Brain.net, World-Brain.edu, World-Brain.org, and World-Brain.com, together (Figure 6) with a United Nations Open-Source Decision-Support Information Network (UNODIN) (Steele, 2014b, 2012b, 2010) and an enabling Open Source (Technologies) Agency such as I have recommended to Vice President Joe Biden (Steele, 2015b). Included in this universal open network would be a Global Game supportive of holistic analytics, true cost economics, and OSEE return on investment (RoI) calculations.




Figure 6. World Brain Concept. Four fully-integrated information domains – together, all information accessible by all humans.

Processing. We process 1% of the “big data” that is in digital form. Bearing in mind that most data is either in analog form or unpublished in any form (human knowledge in situ), and that what we collect and process has been driven by industrial-era scientific reductionism, we must recognize that as with collection, processing is at the 1% of 1%, representing a severe level of dysfunction. 

We need to achieve the ability to process all data in all forms across all threat and policy and demographic domains, in time and space context (Figure 7). 

This requires something we do not have today and are not likely to achieve in our lifetime absent a global consensus on the need: a massive open sparse matrix on top of a 1:5,000 geospatial map of the world, and exascale processing as well as an end to all vendor-specific data processing barriers. 
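To make the idea of such a sparse matrix concrete, here is an illustrative Python sketch. It is not an existing system; the cell size, key scheme, and field names are assumptions made for the example. Only cells that actually hold observations consume storage, which is what makes the matrix sparse.

```python
# Illustrative sparse space-time matrix: keys exist only for populated cells.

from collections import defaultdict
from typing import Dict, List, Tuple

CELL_DEGREES = 0.001  # assumed cell size; a stand-in for a 1:5,000-scale plot

def cell_key(lat: float, lon: float, day: str) -> Tuple[int, int, str]:
    """Quantize a coordinate and date into a sparse-matrix key."""
    return (int(lat / CELL_DEGREES), int(lon / CELL_DEGREES), day)

# The "matrix": only populated cells exist; everything else is implicitly empty.
world_matrix: Dict[Tuple[int, int, str], List[dict]] = defaultdict(list)

def ingest(lat: float, lon: float, day: str, observation: dict) -> None:
    """File one observation (any source, any language) under its space-time cell."""
    world_matrix[cell_key(lat, lon, day)].append(observation)

ingest(8.05, 48.55, "2016-12-26",
       {"source": "citizen report", "lang": "so", "text": "example observation"})
print(len(world_matrix))  # 1 populated cell out of an effectively unbounded space
```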



Figure 7. Earth Intelligence Concept. Applied intelligence demands budget and policy applications – open national conversations.

Analysis. Analysis in the West tends to be monolingual, stove-piped by topic, and devoid of depth while also lacking any semblance of true cost economics. 
We need to abandon the unilateral, nationalist, top-secret analysis paradigm and shift instead to a multinational, multilingual, open source paradigm (Figure 8). We also need to abandon the stove-piped organizational approach to analysis, in which each organization within each of the eight information “tribes” does its own analysis in its own way against single topics or issue areas. Governments must devise new means of orchestrating information-sharing and sense-making across all boundaries, both at home and abroad. Connecting humans to one another is 80% of the challenge – knowing who knows, knowing who cares, knowing who is open to collaborative engagement. At the strategic level, pattern analysis and anomaly detection across all domains, and the reliable tracking of true cost economic information for all products and services, all policies and behaviors, will be critical to the re-design of everything (all domains now suffer significant waste – as much as 50%!).
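As one narrow, toy illustration of the anomaly detection called for here (not the author's method; the data and threshold are invented), a simple z-score over a product's reported true-cost series flags figures that drift far from their own history:

```python
# Toy anomaly detection over a true-cost time series using a z-score.

from statistics import mean, stdev

def anomalies(series, threshold: float = 3.0):
    """Return (index, value) pairs whose z-score against the series exceeds threshold."""
    if len(series) < 3:
        return []
    mu, sigma = mean(series), stdev(series)
    if sigma == 0:
        return []
    return [(i, x) for i, x in enumerate(series) if abs(x - mu) / sigma > threshold]

# Monthly "toxins emitted per unit" for a hypothetical product line (invented data).
toxins_kg_per_unit = [0.41, 0.39, 0.40, 0.42, 0.40, 1.9, 0.41, 0.38]
print(anomalies(toxins_kg_per_unit, threshold=2.0))  # flags the 1.9 spike at index 5
```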




Figure 8. Local to Global Information-Sharing Concept. The military may be best suited to serve as a hub for open global sharing.

Dissemination. The death of Thomson Reuters and Elsevier is to be welcomed. They represent a refusal to adapt to the Open Science era, and now Sci-Hub is beginning to bury them both. Knowledge is air. Knowledge is life. The fencing of the commons, and its destruction of the tangible physical earth, manifests itself in the intangible world as intellectual property law. Like diapers for babies, this law must be outgrown and set aside in favor of open dissemination of knowledge and information.
We need a distributed Internet along the lines of what Sir Tim Berners-Lee is seeking to devise, one that is not only impervious to censorship, data manipulation, and destruction, but that also makes possible universal and complete two-way participation by any human mind desiring to opt in. “One-time data entry, universal access” was the objective of the digital innovators in the 1980’s. However, this worthy objective has been systematically stymied by politicians in the service of greedy corporations that have sought to lock in customers with information gulags guarded by constantly mutating Application Program Interfaces (APIs). What we all require instead is an interactive network allowing any combination of humans to come together at any time on any topic.

End-User Initiatives

Collection. If collection is hosed at the strategic level (1% of 1%), it is worse still at the tactical level, where end-users must put up with – to take just one example – a multiplicity of search engines that are largely dysfunctional. Some of them – Google, for example – are positively dangerous with regard to the ways in which they skew search results for financial gain and, in some instances, manipulate search results for political reasons (something that is tantamount to an undeclared campaign contribution and a violation of federal electoral laws). I appraise the availability of relevant information to the end-user as being 1% of the strategic 1% of 1%: 0.000001.

We need a rapid migration away from the current tolerance of Portable Document Format (PDF) artifacts that are not full-text indexable, and a vast acceleration in the ability of any end-user to migrate an analog document into full-text digital form – including crumpled captured documents covered in mud (a requirement I articulated in 1988 and something we still cannot do) – at the same time that all audio and video must be instantly transcribed into full text online.
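A hedged sketch of one small piece of that migration follows. It assumes the freely available poppler pdftotext command-line tool is installed, handles only text-based PDFs (not crumpled paper, and not audio or video, which are far harder problems), and builds a toy inverted index so the extracted text becomes searchable.

```python
# Toy pipeline: PDF -> plain text -> inverted index -> full-text search.
# Assumes the poppler 'pdftotext' CLI is installed and on the PATH.

import re
import subprocess
from collections import defaultdict
from typing import Dict, Set

index: Dict[str, Set[str]] = defaultdict(set)  # term -> set of document ids

def extract_text(pdf_path: str) -> str:
    """Convert a text-based (not scanned) PDF to plain text via pdftotext."""
    result = subprocess.run(["pdftotext", pdf_path, "-"],
                            capture_output=True, text=True, check=True)
    return result.stdout

def index_document(doc_id: str, text: str) -> None:
    """Tokenize crudely and record which documents contain which terms."""
    for term in re.findall(r"[a-zA-Z]{3,}", text.lower()):
        index[term].add(doc_id)

def search(term: str) -> Set[str]:
    return index.get(term.lower(), set())

# Usage (assumes 'report.pdf' exists locally):
# index_document("report.pdf", extract_text("report.pdf"))
# print(search("intelligence"))
```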

Processing. The state of processing at the end-user level is roughly equivalent to that of collection. It is simply not possible for the average end-user to make sense of even a modest amount of information – say 2,500 pages – from which all names, dates, locations, times, and subject-matter tags must be extracted. As my colleague Stephen E. Arnold has documented so well over the years, and most recently in his new book CyberOSINT: Next-Generation Information Access (2015; Steele, 2015c), none of the current end-user processing systems (and I explicitly include i2 and Palantir) is worthy of consideration as a foundation for moving forward.

What we need is a completely integrated local-to-global processing system that enables all opted-in processing units to be part of one massive exascale-plus cloud with zero proprietary boundaries, supporting anonymity, identity protection, privacy, rights, and security across individual, organizational, national, and multinational boundaries. We also need machine-speed translation across 33 priority languages and another 150 “local knowledge” languages.

Analysis. The analysis domain suffers from a corrupt and inadequate educational system that no longer teaches the art and science of critical thinking. More than this, analysts lack the multidisciplinary and holistic skills needed to properly address emergent challenges. Indeed, our so-called graduate students have no idea how to use the Science Citation Index and the Social Science Citation Index, among many other tools for finding exactly the right people with exactly the right knowledge and information. Compounding the paucity of processing tools is the paucity of integrated analytic tools.
We need to completely reconstruct the educational system, the intelligence system, and the research system (Figure 9). While this idea is strategic, its importance can only be appreciated at the tactical level. We have embedded self-interest and complacency within our youth, even those from the prestigious universities. No amount of technology will substitute for the loss of entire generations of “all-source” analysts able to think. 





Figure 9. Concept for Local to Global Intelligence. Education, intelligence, and research are “one.”
We need the eighteen integrated tools first defined in 1985, and we need them to be open source in nature, available for free distribution to all humans across all organizations all over the world. We need to create a new meta-doctorate that integrates holistic analytics, true cost economics, and OSEE to the point that we can achieve the UN SDG within as little as ten years.

We need computer-assisted tools for analysis such as were clearly defined in 1985-1989 (Webb et al, 1989) and still do not exist today, $1.2 trillion having been spent by the US IC alone, never mind vastly more by IBM and everyone else pretending to do machine analytics (Figure 10).





Figure 10. Computer-Assisted Tools for Human Intelligence. Defined in the 1980’s, still not available in one open package today.

Dissemination. Across many democratic societies, the public now has a fraction of the attention span needed to contribute as citizens and stakeholders. A single graphic and a 3-minute YouTube video appear to be the means by which we communicate knowledge and information. Most of what we teach, and how we teach it, depends on antiquated textbooks that are grotesquely over-priced. To be sure, classrooms where bright young people are held prisoner for 18-26 years (required to sit still and squelch their creativity) all need to be “flipped.”
We need to completely displace the Elsevier and Thomson Reuters approach to the dissemination of knowledge, while moving Sci-Hub to the mainstream and deepening it to the point that all publications are subject to both peer review and citation linkage at the paragraph level. We need a sparse matrix in the cloud capable of holding all information in all languages all the time, tagged across time and space and independent of language. Above all, we need to empower individuals so that they can move markets overnight by being informed at the hand-held level (Figure 11), to the point that we might put the Koch Brothers, Monsanto, Coca Cola, and Nestle out of business within a few weeks.
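One way to picture paragraph-level peer review and citation linkage is as a data model in which the paragraph, not the publication, is the addressable unit. The sketch below is illustrative only; its field names and identifier scheme are assumptions, not a description of any existing platform.

```python
# Illustrative data model: paragraphs as addressable units carrying their own
# peer reviews and their own citation links to other paragraphs.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Review:
    reviewer: str
    verdict: str      # e.g. "confirms", "disputes", "extends"
    comment: str

@dataclass
class Paragraph:
    para_id: str                                       # e.g. "example-doc#p01"
    text: str
    cites: List[str] = field(default_factory=list)     # paragraph ids cited by this one
    reviews: List[Review] = field(default_factory=list)

p = Paragraph(para_id="example-doc#p01", text="Example claim.",
              cites=["other-doc#p12"])
p.reviews.append(Review("reviewer-a", "disputes", "Source does not support the claim."))
print(p.para_id, len(p.cites), len(p.reviews))
```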



Figure 11. Intelligence at the Hand-Held Level. Public education at the point of sale, not government regulation, is the real revolution.

Conclusion

While I respect what the many researchers, innovators, and entrepreneurs are trying to do with IA and AI, it is my firm belief that as long as we ignore the fundamentals of human intelligence, all of this work will be marginal to the kind of augmented intelligence we desperately require. Tom Steyer, the West Coast left-leaning billionaire, has been particularly astute on this subject. Steyer acknowledged in 2014 that the $75 million he spent trying to influence US politicians on climate change had been wasted. Reading the media story on that admission woke me up to the fact that no single issue and no single demographic is going to get an honest hearing from the USG, or any other government, until we achieve Electoral Reform and restore integrity to our electoral process, our governance process, and thence to our economy and our society.

As I have argued throughout this chapter, both IA and AI are severely limited for analogous reasons. The main issue is that the information technology (IT) industry has become corrupt: good people, without question, trapped in very bad systems. Until the IT industry reconnects with human intelligence and human integrity as “root,” and until it embraces holistic analytics, TCE, and OSEE as the trifecta for optimizing the Human Factor in the larger context of a sustainable Earth able to nurture five billion creative minds (now repressed by poverty, dictators, and predatory financial, legal, and information systems), humans will remain little more than chattel – expendable farm animals.

In fact, I made this case to Paul Allen’s INTERVAL Corporation in 1993 (Steele, 1993). In my humble opinion, the IT industry and the US IC have betrayed the public trust for over a quarter century, precisely because they have disrespected the human as “root” and sought instead to focus on profits for the few rather than productivity for the many.

In fact, it’s never too late to find God, embrace ethics, and be all you can be. In the context of augmenting human intelligence, this means a transformational focus on human-machine integrity.

References

Note: HTML addresses are generally not provided – search for the titles. Phi Beta Iota Public Intelligence Blog is abbreviated as PBI. All graphics are original to the author and can be found in color and expandable form at PBI.

Alexander, Michelle (2012). The New Jim Crow: Mass Incarceration in the Age of Colorblindness. New York, NY: The New Press.


Amato, Theresa (2009). Grand Illusion: The Myth of Voter Choice in a Two-Party Tyranny. New York, NY: The New Press.

Annan, Kofi (2005). Billions of Promises to Keep. New York Times.

Arnold, Stephen E. (2015). CyberOSINT: Next Generation Information Access. Harrod’s Creek, KY: Xenky Press.


Bamford, James (2002). Body of Secrets: Anatomy of the Ultra-Secret National Security Agency. New York, NY: Anchor Books.

Bilmes, Linda and Joseph Stiglitz (2008). The Three Trillion Dollar War: The True Cost of the Iraq Conflict. New York, NY: W.W. Norton.

Danalylov, Nikolai and Robert Steele (2016). Robert Steele on Open Source Everything – Ethics is an Operating System. Singularity Weblog and YouTube (1:21:32).

Donovan, Sarah (2015). An Overview of the Employment-Population Ratio. Washington, DC: Congressional Research Service.

Dubose, Lou and Jake Bernstein (2006). VICE: Dick Cheney and the Hijacking of the American Presidency. New York, NY: Random House.



Hedges, Chris (2016). Wages of Rebellion: The Moral Imperative of Revolt. New York, NY: Nation Books.


Jordan, Katy (2015). MOOC Completion Rates: The Data. KatyJordan.

Kelly, Kevin (1999). New Rules for the New Economy. New York, NY: Penguin Books.


Klavans, Richard et al (2008). Graphic: Web of Fragmented Knowledge. PBI.

Kolbert, Elizabeth (2014). The Sixth Extinction: An Unnatural History. New York, NY: Henry Holt.

Lewis, Charles (2014). 935 Lies: The Future of Truth and the Decline of America’s Moral Integrity. New York, NY: PublicAffairs.

Lewis, Michael (2015). Flash Boys: A Wall Street Revolt. New York, NY: W. W. Norton.

Linden, Eugene (2006). The Winds of Change: Climate, Weather, and the Destruction of Civilizations. New York, NY: Simon & Schuster.

Linebaugh, Peter (2014). STOP, THIEF!: The Commons, Enclosures, and Resistance. Oakland, CA: PM Press.




Palmer, Mark (2005). Breaking the Real Axis of Evil: How to Oust the World’s Last Dictators by 2025. New York, NY: Rowman & Littlefield.



Rampton, Sheldon and John Stauber (2003). Weapons of Mass Deception: The Uses of Propaganda in Bush’s War on Iraq. New York, NY: TarcherPerigee.

Rheingold, Howard (2000). Tools for Thought: The History and Future of Mind-Expanding Technology. Cambridge, MA: MIT Press.

Risen, James (2014). Pay Any Price: Greed, Power, and Endless War. New York, NY: Houghton Mifflin.

Rothschild, Lynn (2014). Coalition for Inclusive Capitalism, undated, retrieved from http://www.inc-cap.com/.

Sifry, Micah (2014). The Big Disconnect: Why the Internet Hasn’t Transformed Politics (Yet). Sebastopol, CA: O/R Books.

Slemrod, Annie (2015). Only five percent of pledged aid reaches Gaza. Middle Eastern Eye.


Spinney, Franklin C. (1985). Defense Facts of Life: The Plans/Reality Mismatch. Boulder, CO: Westview Press.





Steele, Robert David (2015c). Foreword in Stephen E. Arnold, CyberOSINT: Next Generation Information Access. Harrod’s Creek, KY: Xenky Press.






_____ (2013d). The Evolving Craft of Intelligence in Robert Dover, Michael Goodman, and Claudia Hillebrand (eds). Routledge Companion to Intelligence Studies. Oxford, UK: Routledge.

_____ (2012a). The Open Source Everything Manifesto: Transparency, Truth, and Trust. Berkeley, CA: North Atlantic Books, Evolver Editions.



_____ (2010). Human Intelligence: All Humans, All Minds, All the Time. Carlisle, PA: US Army Strategy Studies Institute.

_____ (2006). Reinventing Intelligence. Oakton, VA: Open Source Solutions, Inc.



_____ (1994). Data Mining: Don’t Buy or Build Your Shovel Until You Know What You Are Digging Into. Washington, DC: National Research Council.


Strassmann, Paul (1992). Remarks of the Director of Defense Information. McLean, VA: Conference on National Security & National Competitiveness: Open Source Solutions.

Suleman, Khidr (2014). The human cost of a smartphone. ITPro.

Taibbi, Matt (2014). The Divide: American Injustice in the Age of the Wealth Gap. New York, NY: Spiegel & Grau.

Thornton, Joseph (2001). Pandora’s Poison: Chlorine, Health, and a New Environmental Strategy. Cambridge, MA: MIT Press.

United Nations High-Level Panel on Threats, Challenges, and Change (2004). A More Secure World: Our Shared Responsibility. New York, NY: United Nations.

United Nations (2016). Sustainable Development Goals: 17 Goals to Transform Our World. New York, NY: United Nations.

Webb, Diane, Dennis McCormick, and Gordon Oehler (1989). CATALYST: Computer-Aided Tools for the Analysis of Science & Technology. Washington, DC: Central Intelligence Agency.

Williams, John (undated). ShadowStats.com.
