
Next-Generation Compute Architectures Enabling Artificial Intelligence—Part II of II

Thomas A. Campbell, Ph.D., FutureGrasp, LLC & Robert Meagley, Ph.D., nR, LLC; ONE Nanotechnologies LLC • Feb 08, 2018
Like what you read? CONTACT FutureGrasp for more information, speaking engagements and/or collaboration

In this second of two installments (for Part I, see https://www.futuregrasp.com/next-generation-compute-architectures-enabling-artificial-intelligence-part-I-of-II), we examine technologies emerging to enable the next generations of commercial AI. Across these exciting approaches, the common trends noted among the first wave of AI entrants in the marketplace continue. These include:

· strategies to reduce noise and increase efficiency

· smaller-scale engineering

· a proliferation of advanced materials

· increasing speed, density and efficiency

· diversification in mechanism

Nanomaterials

Nanomaterials are considered promising for extending Moore’s Law beyond CMOS [1], both as the fundamental transistors and as replacements for the copper interconnects that link them, enabling a whole new class of integrated circuits (ICs). Carbon-based nanomaterials, such as carbon nanotubes (CNTs) and graphene, are especially interesting because of their rapid electron transport. [2] CNTs can be made in various diameters and chiralities [3] to optimize electron transport for a given application, and they may be grown in situ to facilitate fabrication. Graphene can be vapor-deposited as a layer and patterned with lithography. [4] Both materials show desirable switching and optical characteristics, as well as good properties for electronic and optical architectures (vide infra). [5] On the horizon are materials with even more favorable transport and switching properties, such as MoS2. [6]
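
To make the role of chirality concrete, here is a minimal sketch (in Python, with purely illustrative indices) of how the standard (n, m) chiral indices map to a single-walled CNT’s diameter and, by the usual rule of thumb, to metallic versus semiconducting behavior:

```python
import math

A_NM = 0.246  # graphene lattice constant, roughly 0.246 nm

def cnt_diameter_nm(n: int, m: int) -> float:
    """Diameter of a single-walled CNT from its chiral indices (n, m)."""
    return A_NM * math.sqrt(n * n + n * m + m * m) / math.pi

def cnt_character(n: int, m: int) -> str:
    """Rule of thumb: (n - m) divisible by 3 implies metallic, otherwise semiconducting."""
    return "metallic" if (n - m) % 3 == 0 else "semiconducting"

# Armchair (10,10), zigzag (17,0), and a chiral (13,5) tube.
for n, m in [(10, 10), (17, 0), (13, 5)]:
    print(f"({n},{m}): d = {cnt_diameter_nm(n, m):.2f} nm, {cnt_character(n, m)}")
```

Selecting for particular (n, m) species is what lets one CNT serve as a semiconducting transistor channel and another as a metallic interconnect.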

Applying nanomaterials such as CNTs can enable whole new chip architectures. Traditional CPUs are two-dimensional (2D): ICs are produced as numerous layers of valleys, bridges and interconnects on a flat silicon substrate, and multiple ICs are connected across the 2D landscape of a circuit board. Unfortunately, this 2D layout inherently limits CPU processing speed because of the longer paths electrons must travel between ICs.

ICs, and thus AI, would benefit from a more 3D architecture. Recent work by MIT and Stanford demonstrates the promise of 3D architectures built on a backbone of millions of CNTs. “As applications analyze increasingly massive volumes of data, the limited rate at which data can be moved between different chips is creating a critical communication ‘bottleneck.’ And with limited real estate on the chip, there is not enough room to place them side-by-side, even as they have been miniaturized.” [7] [8] On-chip storage and processing speed are both increased by sandwiching CNTs between existing silicon chips. This novel approach offers advantages over traditional IC design. “In perspective this chip is important because of its efficiency (low power consumption thanks to the carbon nanotube and fast processing thanks to the colocation of storage and processing) which makes it very interesting in AI applications.” [9]

Of course, a major challenge in incorporating nanomaterials into traditional IC production is minimizing the changes required of semiconductor fabs. Hundreds of billions of dollars have been invested over the last 50 years to optimize IC manufacturing around a silicon architecture. Any new material will have to fit within this process flow at minimal cost. At the end of the day, semiconductor production is all about economics: manufacturing the best chips, in the greatest volume, at the lowest cost.

Optical

Another emerging semiconductor technology is optical computing. Instead of electrons, light (photons) is used as the medium to carry information between transistors. Computations can thus occur literally at the speed of light, faster than in conventional electron-driven chips. The materials enabling optical computing are in general quite different from electronic materials, and materials for optics that can mesh with ASIC and other nanoelectronic fabrication strategies are the subject of intensive research. Silicon (and silicon-germanium) has a strong lead for integrated photonic chips; however, its limitations are driving advances in graphene and other materials. [10] [11] [12]

As with any advanced computational architecture, faster processing speed can benefit AI computation. Deep learning especially requires rapid analysis of large datasets, ideally while saving energy compared to traditional CPUs or GPUs. Optical chips hold the potential to offload the matrix calculations needed for deep learning from the CPU to a dedicated chip, thereby reducing power consumption and the space consumed by banks of CPUs. [13] Recently, collaborators from the University of Bristol, Microsoft, Google, Imperial College, the Max Planck Institute, and Sun Yat-sen University invented a new algorithm for solving the energy structure of quantum systems on optical quantum computers. Their device provides an example of quantum algorithm development implemented on an optical device architecture that could yield mass-manufacturable systems. [14]
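
The deep-learning workload in question is dominated by matrix arithmetic. Below is a minimal NumPy sketch, with purely illustrative layer sizes and random data, of a single dense-layer forward pass; the matrix-vector product is the piece proposed for optical offload, while the nonlinearity would typically remain in conventional electronics:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes only: a 256-unit dense layer applied to a 512-dimensional input.
W = rng.standard_normal((256, 512))   # weights -- the part an optical chip would encode
x = rng.standard_normal(512)          # input activation vector
b = rng.standard_normal(256)          # bias

# One forward step: a matrix-vector product plus a nonlinearity (ReLU).
# The product W @ x dominates the arithmetic and is the candidate for photonic offload.
y = np.maximum(W @ x + b, 0.0)

print(y.shape)  # (256,)
```

Training and inference both reduce largely to repeating this pattern at scale, which is why accelerating the matrix product alone can pay off.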

Groups such as those at MIT [15] [16], the University of Arizona [17], and the University of Alberta in Canada [18] are driving research in advanced optical materials, switches, lasers and nano-optics to further optical computing. For example, a group in the UK recently shrank the distance a given set of photons must travel on a chip from more than a centimeter to mere micrometers, enabling even faster processing speeds. [19] However promising optical chips might be, much research remains before optical computing can perform more calculations on a given chip than nanoelectronics, and before it can scale to full systems capable of AI. Still, advances in this space have kept pace with other architectures, and we may expect to see greater deployment of light on chips in the future.

DNA and Other Biological Compute Strategies

Just as neuromorphic computing is a higher level of neural-net integration, adapted from observations of biological computing systems, it is no surprise that biochemical information-processing systems can be emulated and co-opted further. DNA [20] is the stuff of life, so wouldn’t it be ironic if we were to design a computer that enables AI with the material of our own genomes? That approach is one of the most exotic on our list of advanced computer architectures, and it may someday yield highly adaptable, massively parallel ICs with high memory-storage density.

“DNA computing is a form of parallel computing in that it takes advantage of the many different molecules of DNA to try many different possibilities at once.” [21] [22] One recent claim, from the University of Manchester, is of a DNA computer that ‘grows’ as it computes; i.e., its compute pathways (the DNA strands) copy and replicate themselves while a computation is underway, so that a given answer is found faster. [23] This is wholly different from a fixed silicon IC design.
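
To see why that parallelism matters, consider a brute-force search such as subset-sum (chosen here purely as an illustration, not taken from the cited work): a serial program must enumerate the 2^n candidate subsets one at a time, whereas a DNA computer would encode every candidate as a distinct strand and select the matching ones in a single round of chemistry. A minimal serial sketch conveys the scale of the search space being attacked in parallel:

```python
from itertools import combinations

# Toy subset-sum instance (illustrative numbers only): which subsets hit the target?
values = [3, 7, 12, 5, 8, 21, 14]
target = 26

# A serial computer checks all 2^n subsets one by one; a DNA computer would encode
# each subset as a distinct strand and filter out the matches in one reaction.
solutions = [
    subset
    for r in range(len(values) + 1)
    for subset in combinations(values, r)
    if sum(subset) == target
]

print(solutions)  # [(12, 14), (5, 21), (7, 5, 14)]
```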

Another major advantage of DNA lies in its potential for memory storage. “Capable of storing 215 petabytes (215 million gigabytes) in a single gram of DNA, the system could, in principle, store every bit of datum ever recorded by humans in a container about the size and weight of a couple of pickup trucks.” [24] Microsoft announced in early 2017 that it plans to add DNA storage to its cloud. [25] While promising, a barrier to such deployment is cost: DNA storage requires both read and write capabilities, which at this writing cost several thousand dollars per megabyte, far too expensive for large-scale cloud storage. But as costs come down and read/write speeds and accuracy improve, we may someday find our social media posts stored on the very material our cells depend upon.
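
That density figure is easier to appreciate with a quick back-of-the-envelope calculation; everything below except the quoted 215 PB per gram is an illustrative assumption:

```python
# Back-of-the-envelope arithmetic for the quoted DNA storage density (~215 PB per gram).
PB_PER_GRAM = 215               # petabytes of data per gram of DNA, from the cited work
BYTES_PER_PB = 10**15

# Illustrative assumption: an archive of 10 zettabytes (10**22 bytes) to be stored.
archive_bytes = 10 * 10**21

grams = archive_bytes / (PB_PER_GRAM * BYTES_PER_PB)
print(f"{grams:,.0f} g of DNA, i.e. about {grams / 1000:.0f} kg")  # ~46,512 g, ~47 kg
```

The raw DNA itself would be remarkably light; the quoted pickup-truck comparison presumably accounts for packaging, redundancy, and the surrounding read/write hardware.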

Other molecules have also been adapted for biological compute. [26] RNA and proteins have both been used in biochemical and biological computing architectures. [27] These biomolecules interact with each other and with their environments more strongly than DNA does. Because they also have programmable structures, a language of RNA and protein shapes, fitting like puzzle pieces and interacting like gear teeth, is ubiquitous in biological information processing. These shapes can form switches and logic gates, enabling massively parallel processing via chemistry held on surfaces and in liquid solution. [28]

One current limitation of biological compute (BC) is that, like quantum computing (QC), it faces challenges in formulating algorithms and in structuring problems that early, limited systems can solve. BC has demonstrated real advantages in certain kinds of complex optimization problems that are also attractive targets for QC. [29] [30] A few intrepid startups are already active in this space; for example, Rebel Bio recently demonstrated a DNA-based biological computer that plays Tetris. [31] With increased investment in this area, such systems should continue to improve in generality, flexibility, and convenience. [32]

Integration of BC from bacterial systems has been well established. [33] Other species, including slime molds, have been harnessed to solve complex problems and to integrate environmental sensing. [34] [35] [36] Far beyond simple organisms, ambitious progress has been made interfacing nanoelectronic approaches with biological ones, culminating in AI implanted in a prosthetic to aid human cognition, now being commercialized in a partnership among INTENT LTD, Intel, and Qualcomm. [37] While BC is perhaps the most forward-leaning embodiment of AI, it presents a unique challenge, as it is also the origin of human consciousness. [38]

Conclusions

We emphasized in Part I that “There is now no longer a single Moore’s Law, but rather a suite of them, with each new law a function of compute architecture, software and application.” The additional emerging technologies discussed here in Part II reinforce and extend that observation. [39]

Diversification in devices and in their integration offers technology options that enable niche applications. As these new capabilities prove successful, some niches will expand into more general applications. [40] These developments set the stage for waves of M&A [41] as smaller firms are absorbed into larger concerns producing at higher volume.

Ultimately, new compute architectures will ensure many additional generations of continuous improvement, well beyond Moore’s Law. AI will thus be driven to ever higher levels of compute and more capable performance.

Notes


[1] Complementary metal-oxide semiconductor

[2] Z. Chen, et al., “Carbon nanotubes for high-performance logic,” MRS Bulletin, 39, August 2014, 719-726.

[3] CNTs come in multiple sizes, including single-walled and multi-walled, as well as chiralities, or ‘handedness,’ such as zigzag and armchair. Such diverse CNT species are tailored toward the desired electrical and other properties. [https://en.wikipedia.org/wiki/Carbon_nanotube, accessed January 2018.]

[9] Ibid.

[20] Deoxyribonucleic acid

[22] Lewin, D. I. (2002). "DNA computing," Computing in Science & Engineering, 4(3): 5–8. doi: 10.1109/5992.998634, accessed January 2018.

[39] In 2015, one author [TAC] ran a workshop titled “Moore’s Law 2.0.” One conclusion from the series of presentations by industry and academia is that we now have such a diverse realm of compute architectures that there is confusion in both industry and the US Government on which approach to fund the most. No one knows which chipset will be the winner, if there even will be a clear winner.

[41] Mergers & Acquisitions

Like what you read? CONTACT FutureGrasp for more information, speaking engagements and/or collaboration