Blockchain-Based Decentralised Cloud Computing

Overview and Use Cases

Eterna Capital
Feb 21, 2019

I want to thank my team at Eterna Capital, Chandler Song (Co-Founder and CEO, Ankr Network) and Dominic Tsang (Head of Business Development, Solana) for their valuable contributions.

By Mattia Mrvosevic — Head of Research at Eterna Capital.

Cloud computing is fuelling many of today’s internet-based applications and services, generating revenues in excess of $17 billion per quarter. How data is shared and stored, how people collaborate, and how remote applications are used are a few examples of this technology at work in people’s day-to-day lives [1, 2].

This post aims to give an overview of the current cloud computing market and its risks, as perceived by senior risk executives, in the context of the growing worldwide production of data generated, among other sources, by the Internet of Things. Cisco predicts that the “usable” data produced in 2021 will exceed the forecasted yearly data centre traffic by a factor of four (nearly 90% of the data generated by 2021 will be ephemeral in nature and will be neither saved nor stored). This gap could potentially be filled by edge / fog computing or by different approaches, such as blockchain-based cloud solutions. The post outlines the main challenges decentralised approaches face in managing data (ensuring the right incentivisation plans, scalable infrastructure, and reliable computation and data feeds) and finally provides an architectural overview of the consensus mechanisms of three potential solutions: Ankr, Dfinity and Solana.

Introduction

Cloud computing epitomises the basis of digital business. This industry has been growing at an extraordinary pace in the last 10 years.

Synergy Research Group [3] compares the worldwide market share of several cloud service providers in Q3 2018, as shown in figure 1. Estimated cloud infrastructure service revenues are in excess of $17 billion each quarter, with Amazon Web Services, the leading cloud service provider, claiming a market share of around 33%, a level it has now held steady for fourteen quarters, even as the market has tripled in three and a half years.

Figure 1 — Cloud Infrastructure Services — Market Share. The market includes raw computing and storage, services for running applications and hosted private cloud.
Adapted from: Synergy Research Group [3] (2018)

In Q3 2018 in particular, the cloud computing industry grew at a 45% rate, compared with growth of 44% in 2017 and 50% in 2016. Global research and advisory firm Gartner predicts that cloud computing and services will be a $300 billion business by 2021 [4], up $120 billion from 2017, with the number of service providers tripling by 2020 [5]. Established players such as Apple, OpenText and BMC will enter the cloud computing industry, as will newly established start-up companies such as Box, Workday and Coupa, all bringing new and different features to the market.

The increasing interest in cloud computing is resulting in faster technological innovation, leading to the development of thousands of new cloud features; for example, major players Amazon Web Services and Microsoft Azure release 40 to 50 new cloud features in any given month. Despite this, on average firms only require a handful of application services, and a dozen infrastructure services, to run their business [5]. This pace of innovation is helpful for the market as a whole, but on the other hand leads to a higher level of complexity and confusion for organisations with already complex IT infrastructure [6].

The risks of cloud computing are increasing

Every quarter, Gartner surveys senior risk executives at leading organisations to identify their companies’ top risks. The survey analyses new and unforeseen risks; if the same risk is reported every quarter for a year, it is no longer considered new.

Cloud computing sits at the top of the emerging risk list in the latest 2018 survey, in which 110 global executives participated [7]. The issues raised predominantly relate to security, such as unauthorised access to sensitive or restricted information. Indeed, the rising complexity of cloud computing has made it an attractive target for attackers worldwide, who constantly probe the security of data storage providers. In addition, senior executives are expressing mounting concerns that cloud providers may be unable to provide access to information as a result of disruption in their own operations, causing massive downtime in their networks.

Bruce Schneier, a renowned global cryptographer, computer security professional, privacy specialist and writer, also points out additional limitations of cloud computing technology [8]. Mr. Schneier states that cloud providers may fail to meet a company’s legal needs. As regulatory changes are on the rise, the risks and infrastructure complications associated with cloud computing are growing as a consequence. For example, cloud service providers often store data in multiple locations, including locations outside of the European Economic Area (EEA). The implications of the recent EU General Data Protection Regulation (GDPR) have resulted in very complex infrastructure migration projects for companies seeking regulatory compliance.

Mr. Schneier also emphasises how moving data to the cloud implies granting an increase in power to cloud providers, who are now effectively controlling that data.

Likewise, we are witnessing a growing number of free cloud services available to end consumers and companies, with increasing risks related to data control and loss as a direct consequence. These cloud providers may also delete some of the client’s data at will if they feel the client has violated some terms of service, and subsequently offer little to no recourse to affected clients [8].

In addition to this, by its setup cloud computing may not be beneficial to small companies or short-term projects. For many large enterprises, in contrast, cloud computing becomes beneficial as it can result in economies of scale.

Since cloud computing systems are centralised, service outages are an ever-present possibility. According to Cloud Academy [9], network downtime is often cited as one of the biggest disadvantages of cloud technology.

In the future, more and more data will be produced on a global level

Intelligence advisory company IDC forecasts that worldwide spending on the Internet of Things (IoT) will reach $1.1 trillion by 2021 [10], up $420 billion from 2017. The IoT will create many opportunities, such as efficiency improvements, economic benefits and reduced human exertion. It will also generate an exceptionally high volume of data that will then need to be analysed in an optimised way [11, 12].

To provide a practical example, according to Tom Coughlin, founder of Coughlin Associates, a single self-driving car can generate 1 Gigabyte of raw data per second [13]. This means that a 10-minute drive to your local grocery store and back can generate up to 1.2 Terabytes (TB) of data. Considering that retail personal computers are sold with 1 to 2 TB hard disks on average, it is easy to understand the enormous amount of storage and power required to analyse this data.
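
As a quick sanity check, the arithmetic behind this example fits in a few lines (the 1 GB/s rate is Coughlin’s figure; the 20-minute round trip comes from the example above):

```python
# Back-of-the-envelope check of the self-driving car example.
rate_gb_per_s = 1            # raw data rate, per Coughlin's estimate
round_trip_s = 2 * 10 * 60   # 10 minutes each way, in seconds

data_gb = rate_gb_per_s * round_trip_s
print(f"{data_gb} GB = {data_gb / 1000} TB")  # 1200 GB = 1.2 TB
```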

In a recent white paper by global technology conglomerate Cisco [14] it was estimated that, by the end of 2021, people, machines, and things will generate more than 850 Zettabytes (ZB; 1 ZB = 10^9 TB) in data: 630 ZB greater than the 220 ZB generated in 2016. In addition to the IoT, industries and scientific communities will also require a large amount of computing power to run larger applications and process huge volumes of data at a significantly faster pace.

Jerry Cuomo, Vice President, Blockchain for Business at IBM, believes that the demand for computing power will increase exponentially, as business processes are completed at a significantly faster pace than ever before. According to Gartner, as the volume and velocity of data increases, so does the inefficiency of streaming all this information to a centralised cloud or data centre for processing [15, 16, 17].

The question raised by many is: how can a centralised cloud infrastructure support such growth in data and computing power while guaranteeing security, scalability and resource optimisation at the same time?

Introducing decentralised approaches to manage data

Organisations worldwide have realised that a more decentralised approach is required to address digital business infrastructure requirements [18, 19]. Business Insider Intelligence, in fact, predicts that more than five billion IoT devices will require edge solutions by 2020 [20, 21]. Edge computing moves some portion of an application, its data or its services away from one or more central nodes to the “edge” of the network, often in direct contact with end users. Edge computing uses a mix of peer-to-peer ad hoc networking, local cloud computing, grid computing, fog computing, distributed data storage and other more sophisticated solutions [22].

Of the $500 billion in growth expected for IoT through 2020, global advisory firm McKinsey estimates that about 25% will be directly related to edge technology [21]. To visualise the order-of-magnitude difference between “usable” data per year and data centre traffic per year (21 ZB by 2021), Cisco graphically compared the two metrics, as shown in figure 2 [14]. Nearly 90% of the predicted 850 ZB that will be generated by 2021 will be ephemeral in nature and will be neither saved nor stored. From figure 2, it is clear that the usable data, calculated as roughly 10% of the total created data (i.e. 850 ZB), exceeds the forecasted yearly data centre traffic by a factor of four. This gap could potentially be filled with edge or fog computing.
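
The factor of four can be reproduced directly from the figures in the text (850 ZB created, roughly 10% usable, 21 ZB of yearly data centre traffic):

```python
# Reproducing Cisco's gap between usable data and data centre traffic.
created_zb = 850          # total data created by 2021 (Cisco forecast)
usable_share = 0.10       # roughly 10% of created data is usable
dc_traffic_zb = 21        # forecast data centre traffic per year by 2021

usable_zb = created_zb * usable_share       # 85 ZB
print(round(usable_zb / dc_traffic_zb, 1))  # ~4.0, the "factor of four"
```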

Figure 2 — Usable data created per year vs. data centre traffic per year — the opportunity for edge or fog computing.
Adapted from: Cisco [14] (2018)

Edge computing helps improve data compression and data transfer in the connectivity layer of the technology stack, and reduces network bandwidth usage. Fog computing could likewise transform the population’s personal computers, servers and smartphones into nodes able to store data and / or perform computations for others, thus making a wider range of IoT applications possible.

Blockchain-based decentralised approaches and main challenges

Computational resource sharing platforms have existed for years. One example is SETI@home, a scientific experiment based at UC Berkeley that uses internet-connected computers in the Search for Extra-terrestrial Intelligence (SETI); people participate by running a free program that downloads and analyses radio telescope data. The main issue with such platforms, when brought to scale, is their dependence on central nodes to distribute and manage tasks. At scale they encounter the same issues as the centralised architectures described in the previous sections, alongside issues related to proper incentivisation of computing power providers and verifiability of the performed computation.

In this scenario, blockchain technology emerges as a strong facilitator of decentralised cloud solutions.

Indeed, the consensus and reward mechanisms used within blockchain-based architectures can give distributed computing strong support in overcoming some of the issues mentioned. For example, paying node hosts in a platform’s own medium of exchange (i.e. tokens) is not only a useful method of incentivising interested parties to put their resources online, but also helps deter misbehaving actors.

Blockchain-based decentralised cloud solutions present many challenges, an exhaustive list of which is outlined by Prof. Brundo Uriarte and Prof. De Nicola [23].

Some of the challenges delineated are:

  • Ensuring the right incentivisation plan is in place for resource providers by guaranteeing fair income distribution.
  • Making the infrastructure scalable, considering the current scalability limitations of blockchain infrastructures.
  • Verifying that the computation is done properly, to avoid potential malicious attacks (e.g. a provider claiming to have performed the service without actually doing so). Some projects use reputational management techniques, though these need to strike the right balance between the weight of reputations and the market entry cost (see the toy model after this list).
  • Using trusted oracles*, since oracles are not decentralised by nature. There are proposals for decentralised oracles that could mitigate the trust issue [24, 25].
  • Managing the right to erasure of data in the event of a malicious attack or other (common) inconveniences.
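
To make the reputation trade-off concrete, here is a toy scoring model. It is purely illustrative and not taken from any specific project: providers are scored by blending their verified track record with a stake-based entry cost, and shifting reputation_weight moves the balance between the two.

```python
# Toy reputation model: illustrative only, not any project's actual scheme.
from dataclasses import dataclass

@dataclass
class Provider:
    name: str
    completed_tasks: int  # computations verified as correct
    failed_tasks: int     # computations caught as wrong or missing
    stake: float          # deposit paid to enter the market

def score(p: Provider, reputation_weight: float = 0.7) -> float:
    """Blend reputation history with stake; reputation_weight tunes the
    balance between track record and market entry cost."""
    total = p.completed_tasks + p.failed_tasks
    reputation = p.completed_tasks / total if total else 0.5  # neutral prior
    stake_term = min(p.stake / 100, 1.0)  # cap so money alone cannot dominate
    return reputation_weight * reputation + (1 - reputation_weight) * stake_term

providers = [Provider("veteran", 95, 5, 10.0), Provider("newcomer", 0, 0, 100.0)]
print(max(providers, key=score).name)  # "veteran"; lower the weight and stake wins
```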

Several projects are proposing ideas to use blockchain within distributed computing systems. The next section describes the use case for blockchain in decentralised approaches proposed by the Ankr, Dfinity and Solana projects.

* Oracles are interfaces that connect a smart contract with real-world occurrences.

Blockchain-based decentralised cloud solutions

1. Ankr

The Ankr project [26] is a decentralised cloud solution under development that aims to offer Clients the infrastructure to run applications at cheaper prices than traditional cloud service providers, and Data Centres the infrastructure to create new revenue streams from their under-utilised capacity. This will be achieved by ensuring a high level of service availability, easy integration and secure communication, leveraging containerisation, cluster orchestration and Trusted Execution Environments (TEEs).

Traditionally, cloud computing has been built on Virtual Machines: emulations that make it possible to run what appear to be many separate computers on hardware that is actually one computer. Virtual Machines take a lot of system resources (CPU and RAM) to run, take minutes to start, and entail complex resource management during software development. Containerisation, on the other hand, virtualises only the Operating System of a computer, allowing distributed applications to run without launching an entire virtual machine (VM) for each application. Containers take only seconds to start, and the underlying orchestration mechanism, which manages the interconnections and interactions among workloads on the cloud infrastructure, enables easier and faster resource management during software development. In addition, TEEs such as Intel SGX allow an application to execute inside processor-hardened enclaves: protected areas of execution in memory that increase security even on compromised platforms. This infrastructure can attest to the Miners’ useful computation for the system and, as such, allow rewards to be assigned accordingly.

The Ankr team has developed a novel consensus mechanism called Proof of Useful Work (PoUW), which aims to achieve a high security standard with minimal energy waste.

The key components of Ankr’s mining scheme, and the main actors of the system, are shown in figure 3.

Figure 3: Ankr’s Proof of Useful Work consensus overview.
From: Ankr [26] (2018)

Blockchain Agents collect transactions and generate a block template without PoUW. Once Miners provide PoUW and embed it into the template, Blockchain Agents validate the proof, publish the block on the blockchain and receive the corresponding reward.

Useful Work Providers provide Miners with useful work tasks and receive the results. Useful work tasks are composed of a PoUW enclave (created from SGX-compliant code with the tools that Ankr provides) and some task inputs.

Miners take useful work tasks from Useful Work Providers plus block templates from Blockchain Agents, and launch an SGX enclave to load and run the useful work task. Once the work has been executed, deciding which instruction, and consequently which Miner, deserves the reward involves the PoUW enclave and Intel SGX’s random number generator (SRNG). The PoUW enclave generates a random number using the SRNG and checks whether this number is smaller than the desired difficulty target. If so, the instruction gains consensus leadership, and the enclave creates and attaches to the block template an Intel SGX-produced attestation proving PoUW compliance, plus another attestation stating that the task was finished at a given difficulty level, which is used as the baseline for the next block iteration. Miners then return the result to the Useful Work Providers.
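
A minimal sketch of this leadership check, with simplified stand-ins for the SGX pieces: srng_draw below stands in for Intel SGX’s in-enclave random number generator, and the attestation is reduced to a plain dictionary (real attestations are signed by the enclave hardware).

```python
# Simplified sketch of Ankr-style PoUW leader election. The SGX enclave,
# SRNG and attestations are mocked; only the control flow is illustrated.
import secrets

def srng_draw(bits: int = 256) -> int:
    """Stand-in for Intel SGX's in-enclave random number generator (SRNG)."""
    return secrets.randbits(bits)

def pouw_leader_check(difficulty_target: int):
    """After finishing a useful-work task, draw a random number inside the
    enclave; if it falls below the target, this Miner gains leadership."""
    if srng_draw() < difficulty_target:
        # A real enclave would emit two hardware-signed attestations here:
        # one proving PoUW compliance, one fixing the difficulty baseline
        # for the next block iteration.
        return {"pouw_compliant": True, "difficulty": difficulty_target}
    return None  # no leadership this round; results still go back to the provider

attestation = pouw_leader_check(difficulty_target=2**245)  # ~1-in-2048 chance
print("leader" if attestation else "not leader this round")
```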

Through this mechanism, the PoUW consensus is able to achieve a high security standard whilst minimising energy waste.

2. Dfinity

The Dfinity project [27] is a decentralised cloud solution whose aim is to provide a world supercomputer with “infinite” capacity and computational power. It introduced the concept of “The AI is law”, in which everything is subject to an intermediary-free algorithmic governance system that combines crowd wisdom and traditional AI technologies to freeze miscreant smart contracts that harm the interests of those using the platform. This effectively means that some transactions can be changed or reverted if approved by the algorithmic governance system, the opposite of the approach taken by projects such as Bitcoin or Ethereum, where the rule “The Code is law” prevails and a user cannot reverse a transaction once it has been processed.

The Dfinity consensus mechanism structure is shown in figure 4 below.

Figure 4 — Dfinity’s consensus mechanism layers
From: Dfinity [27] (2018)

The first layer is the Identity Layer: new Clients (participants) that want to take part in the network need to register, i.e. obtain a permanent identity in the network. If registration requires a security deposit, a Client must place the deposit, which is then subject to a lock-up period. This mechanism provides a stronger deterrent than traditional Proof-of-Work mechanisms, where misbehaving miners only forego the block reward for the duration of the misbehaviour; here, a misbehaving Client would lose at least their entire deposit.

The Random Beacon Layer is a Verifiable Random Function (VRF)* whose output is produced jointly by registered Clients. Each random output of the VRF is unpredictable by anyone until just before it becomes available to everyone. Moreover, since the last actor to contribute could otherwise abort the protocol, randomness is created via a threshold mechanism, whereby even the last actor does not know the next random value [28].
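
To illustrate the threshold property, the sketch below uses plain Shamir secret sharing over a prime field as a conceptual stand-in (Dfinity actually uses threshold BLS signatures, and a distributed key generation ceremony means no single party ever holds the full secret). The property it demonstrates is the one that matters here: any t of n shares reconstruct the same output, so no participant, not even the last one to act, can predict or withhold the beacon value.

```python
# Conceptual stand-in for a threshold random beacon using Shamir secret
# sharing. Real systems use threshold BLS signatures with distributed key
# generation, so the "secret" below would never exist in one place.
import random

PRIME = 2**127 - 1  # field modulus for the demo

def make_shares(secret: int, t: int, n: int):
    """Split `secret` into n shares; any t of them reconstruct it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret from any t shares."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        total = (total + yi * num * pow(den, -1, PRIME)) % PRIME
    return total

beacon_output = random.randrange(PRIME)        # the group's joint random value
shares = make_shares(beacon_output, t=3, n=5)  # one share per registered Client
# Any 3 of the 5 shares yield the identical beacon value:
assert reconstruct(shares[:3]) == reconstruct(shares[2:]) == beacon_output
```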

The Blockchain Layer uses what Dfinity calls a Probabilistic Slot Protocol (PSP): Clients are first ranked based on the unbiased output of the random beacon, and blocks are then weighted by their producer’s rank, such that blocks from the highest-ranking Clients receive the highest weights. The PSP ensures strong scalability: the ranking is available instantaneously, allowing for a predictable, constant block time; every Client has a rank, so Clients do not have to “fight” for block allocation; and network bandwidth utilisation remains homogeneous.
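
A sketch of the ranking step, assuming rank is derived by hashing the beacon output together with each Client’s identity; this derivation, and the 2^-rank block weights used below, are illustrative assumptions rather than Dfinity’s exact formulas.

```python
# Illustrative Probabilistic Slot Protocol ranking: derive a deterministic,
# unbiased rank for every registered Client from the round's beacon output.
import hashlib

def rank_clients(beacon: bytes, clients: list) -> list:
    """Lower hash value means higher rank. Ranks are available the instant
    the beacon is published, so Clients never 'fight' for block allocation."""
    return sorted(clients, key=lambda c: hashlib.sha256(beacon + c.encode()).digest())

clients = ["alice", "bob", "carol", "dave"]
ranking = rank_clients(b"round-42-beacon", clients)
weights = {c: 2.0 ** -r for r, c in enumerate(ranking)}  # rank 0 -> weight 1.0
print(ranking)
print(weights)  # blocks from higher-ranked Clients carry higher weight
```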

Finally, the Notary Layer applies the final confirmation (i.e. notarisation) to the blocks that can be included in the chain, notarising only the blocks from the highest-ranked Clients indicated by the random beacon. Due to adverse timing, more than one block may nevertheless be notarised at a given height, creating a potential issue for finalising transactions. To overcome this, a transaction is considered final only when there are two notarised confirmations on top of it, in addition to waiting one network time to confirm there are no other notarised blocks at that height. The Dfinity project achieves its high speed thanks to this notarisation process.
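
The resulting finality rule fits in a few lines; the data shapes below are hypothetical stand-ins for the real protocol machinery.

```python
# Sketch of the Dfinity-style finality rule described above: a block is final
# once it is the only notarised block observed at its height after the
# network-time wait AND two notarised confirmations have been built on top.

def is_final(height: int, notarised_at: dict, confirmations: int) -> bool:
    """notarised_at maps height -> list of notarised block hashes observed
    after waiting one network time; confirmations counts notarised blocks
    built on top of the block in question."""
    unique_at_height = len(notarised_at.get(height, [])) == 1
    return unique_at_height and confirmations >= 2

observed = {100: ["0xabc"], 101: ["0xdef"], 102: ["0x123"]}
print(is_final(100, observed, confirmations=2))  # True
print(is_final(100, {100: ["0xabc", "0xevil"]}, confirmations=2))  # False: competing block
```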

In Dfinity, the random beacon and notarisation processes are delegated to a Committee (a subset of all registered Clients) that, after temporarily executing the protocol on behalf of all Clients, passes execution on to another Committee.

This mechanism allows the infrastructure to scale consistently while remaining fair and secure.

* A VRF is a function that provides publicly verifiable proofs of its outputs’ correctness; verifying these proofs does not require the secret key used to produce the outputs.

3. Solana

The Solana project [29] aims to create a new blockchain architecture based on a novel concept, Proof of History (PoH). PoH uses a verifiable delay function to provide the network with a trust-less sense of shared time. Additionally, transaction processing on Solana is handled by GPUs, an approach that scales with Moore’s law, the observation that the number of transistors in a dense integrated circuit doubles about every two years, thus allowing for much more computational power over time. This mechanism, used in combination with a Proof of Stake (PoS) consensus algorithm, can make the infrastructure highly scalable.

The infrastructure allows for different use cases, including decentralised cloud computing.

The PoH consensus mechanism provides a proof for verifying the order of, and passage of time between, events, fundamentally encoding a trust-less passage of time into the ledger. The mechanism repeatedly calls a hash function that runs on a single core of a computer, each time feeding the previous output back in as input. The output of the function can be re-computed and verified by external computers in parallel (an index conveniently records the number of times the function has been called), by checking each segment of the sequence on a separate core.

Recording the state of the function, the index, and the data as it is appended into the sequence provides a timestamp guaranteeing that the data was created some time before the next hash in the sequence was generated. This design supports horizontal scaling, as multiple generators can synchronise with each other by mixing their states into each other’s sequences.
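
A minimal sketch of the mechanism as described, assuming SHA-256 as the hash function (consistent with Solana’s whitepaper): a sequential hash chain with an index per step, external data mixed in by hashing it together with the current state, and verification that can check every recorded transition independently, and hence in parallel.

```python
# Minimal Proof of History sketch: a sequential hash chain encodes the
# passage of time; mixing data in at step i proves it existed before step i+1.
import hashlib

def tick(state: bytes) -> bytes:
    return hashlib.sha256(state).digest()

def generate(seed: bytes, steps: int, events: dict):
    """Run the chain, recording (index, state, data) for each step.
    events maps a step index to the data appended at that step."""
    state, records = seed, []
    for i in range(1, steps + 1):
        data = events.get(i)
        state = tick(state + data if data else state)
        records.append((i, state, data))
    return records

def verify(seed: bytes, records) -> bool:
    """Each transition depends only on the previous state, so a verifier can
    split the records across cores and check all transitions independently."""
    prev = seed
    for _, state, data in records:
        if state != tick(prev + data if data else prev):
            return False
        prev = state
    return True

records = generate(b"genesis", 1000, events={500: b"tx: alice -> bob, 5"})
assert verify(b"genesis", records)  # recomputable and parallelisable
```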

The Solana transaction flow is shown in figure 5.

Figure 5: Solana’s transaction flow throughout the network
From: Solana [29] (2018)

At any given time, a system node is designated as Leader to generate a PoH sequence, providing the network with global read consistency and a verifiable passage of time.

The Leader sequences user messages and orders them so that they can be efficiently processed by other nodes in the system. It executes the transactions on the current state, held in RAM, and publishes the transactions and a signature of the final state to the replication nodes, called Verifiers. The Verifiers execute the same transactions on their copies of the state and publish their computed signatures of the state as confirmations, which then serve as votes for the consensus algorithm. A specific instance of the PoS consensus mechanism is used to confirm the current sequence produced by the PoH generator, to vote on and select the next PoH generator, and to punish any misbehaving validators.
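
A sketch of this confirmation step, with state signatures reduced to plain hashes for illustration (real confirmations are cryptographically signed): each Verifier replays the Leader’s ordered transactions on its own copy of the state and votes by publishing the digest of the result.

```python
# Sketch of Solana-style verification: replay the Leader's sequenced
# transactions, then confirm by publishing a digest of the resulting state.
import hashlib, json

def apply_transactions(state: dict, txs: list) -> dict:
    new_state = dict(state)
    for tx in txs:  # the Leader's ordering makes replay deterministic
        new_state[tx["from"]] = new_state.get(tx["from"], 0) - tx["amount"]
        new_state[tx["to"]] = new_state.get(tx["to"], 0) + tx["amount"]
    return new_state

def state_digest(state: dict) -> str:
    """Plain hash standing in for the signed state signature."""
    return hashlib.sha256(json.dumps(state, sort_keys=True).encode()).hexdigest()

ledger = {"alice": 10, "bob": 0}
txs = [{"from": "alice", "to": "bob", "amount": 5}]

leader_digest = state_digest(apply_transactions(ledger, txs))    # published by Leader
verifier_digest = state_digest(apply_transactions(ledger, txs))  # recomputed by Verifier
print("vote: confirm" if verifier_digest == leader_digest else "vote: reject")
```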

An election for a new PoH generator occurs when a PoH generator failure is detected. The validator with the largest voting power is then picked as the new PoH generator.

PoH allows the network’s Verifiers to observe what happened in the past with some degree of certainty about when those events occurred. As the PoH generator produces its stream of messages, all Verifiers are required to submit their signatures of the state within 500 ms; this window can be reduced further depending on network conditions. Overall, the system enables the infrastructure to reach up to 710k transactions per second on today’s hardware.

Conclusions

In this post, an overview of the current cloud computing market has been presented alongside its increasing risks. These risks have been contextualised within the growing IoT industry, providing the rationale for moving towards decentralised solutions to manage data. Three blockchain-based decentralised cloud solutions have been presented, showing three different approaches to achieving consensus between parties.

In the future, the expectation is that new decentralised cloud providers will enter the market, competing with existing players by offering new services. Beyond the three solutions analysed here, several other projects have been in development for 2–3 years already and are now moving into their testing phases. Blockchain-based decentralised cloud architectures are still in their infancy; for this reason, metrics such as quality of service, performance, scalability and security will be crucial in the long run to foster the economic growth of these projects in the open markets.

Disclaimer: Eterna Capital has an investment in Dfinity.

References

[1] Wikipedia, “Cloud Computing,” [Online]. Available: https://en.wikipedia.org/wiki/Cloud_computing.

[2] Educba, “What Is Cloud Computing? | Basic | Concept | Benefits,” [Online]. Available: https://www.educba.com/cloud-computing-definition/.

[3] Synergy Research Group, “The Leading Cloud Providers Increase Their Market Share Again in the Third Quarter,” October 2018. [Online]. Available: https://www.srgresearch.com/articles/leading-cloud-providers-increase-their-market-share-again-third-quarter.

[4] Gartner, “Gartner Forecasts Worldwide Public Cloud Revenue to Grow 21.4 Percent in 2018,” April 2018. [Online]. Available: https://www.gartner.com/en/newsroom/press-releases/2018-04-12-gartner-forecasts-worldwide-public-cloud-revenue-to-grow-21-percent-in-2018.

[5] Gartner, “Hidden Cloud Growth Opportunities for Technology Service Providers,” June 2018. [Online]. Available: https://www.gartner.com/smarterwithgartner/7-hidden-cloud-growth-opportunities-for-technology-service-providers/.

[6] Forbes, “10 Key Takeaways From Gartner’s 2018 Magic Quadrant For Cloud IaaS,” April 2018. [Online]. Available: https://www.forbes.com/sites/janakirammsv/2018/06/02/10-key-takeaways-from-gartners-2018-magic-quadrant-for-cloud-iaas/#1d8a2f3e14df.

[7] Gartner, “Cloud Computing Tops List of Emerging Risks,” September 2018. [Online]. Available: https://www.gartner.com/smarterwithgartner/cloud-computing-tops-list-of-emerging-risks/.

[8] B. Schneier, “Should Companies Do Most of Their Computing in the Cloud?,” June 2015. [Online]. Available: https://www.schneier.com/blog/archives/2015/06/should_companie.html.

[9] Cloud Academy, “Disadvantages of Cloud Computing,” June 2018. [Online]. Available: https://cloudacademy.com/blog/disadvantages-of-cloud-computing/.

[10] IDC, “Worldwide Semiannual Internet of Things Spending Guide,” [Online]. Available: https://www.idc.com/getdoc.jsp?containerId=IDC_P29475.

[11] O. Vermesan and P. Friess, “Internet of Things: Converging Technologies for Smart Environments and Integrated Ecosystems,” 2013. [Online]. Available: http://www.internet-of-things-research.eu/pdf/Converging_Technologies_for_Smart_Environments_and_Integrated_Ecosystems_IERC_Book_Open_Access_2013.pdf.

[12] F. Mattern and C. Floerkemeier, “From the Internet of Computers to the Internet of Things,” Distributed Systems Group, Institute for Pervasive Computing, ETH Zurich, 2010.

[13] CNN, “Your car’s data may soon be more valuable than the car itself,” February 2017. [Online]. Available: https://money.cnn.com/2017/02/07/technology/car-data-value/index.html.

[14] Cisco, “Cisco Global Cloud Index: Forecast and Methodology, 2016–2021 White Paper,” Cisco, 2018.

[15] Gartner, “Top 10 Technology Trends Impacting Infrastructure & Operations for 2018,” December 2017. [Online]. Available: https://www.gartner.com/smarterwithgartner/top-10-technology-trends-impacting-infrastructure-operations-for-2018/.

[16] Information Age, “Moving from central to the edge: Is cloud decentralisation inevitable?,” June 2018. [Online]. Available: https://www.information-age.com/cloud-decentralisation-inevitable-123472539/.

[17] Mimik, “As networks choke, edge cloud is the saviour,” July 2017. [Online]. Available: https://mimik.com/as-network-chokes/.

[18] Gartner, “What Edge Computing Means for Infrastructure and Operations Leaders,” October 2017. [Online]. Available: https://www.gartner.com/smarterwithgartner/what-edge-computing-means-for-infrastructure-and-operations-leaders/.

[19] Gartner Blog Network, “The Edge Will Eat The Cloud,” March 2017. [Online]. Available: https://blogs.gartner.com/thomas_bittman/2017/03/06/the-edge-will-eat-the-cloud/.

[20] Business Insider, “Microsoft brings IoT to the Edge,” May 2017. [Online]. Available: http://uk.businessinsider.com/microsoft-brings-iot-to-the-edge-2017-5.

[21] McKinsey & Co., “Modernizing IT for digital reinvention,” 2018. [Online]. Available: https://www.mckinsey.com/~/media/McKinsey/Business%20Functions/McKinsey%20Digital/Our%20Insights/Modernizing%20IT%20for%20digital%20reinvention/Modernizing-IT-for-digital-reinvention-Collection-July-2018.ashx.

[22] Wikipedia, “Decentralized Computing,” [Online]. Available: https://en.wikipedia.org/wiki/Decentralized_computing.

[23] R. Brundo Uriarte and R. De Nicola, “Blockchain-Based Decentralised Cloud/Fog Solutions: Challenges, Opportunities and Standards,” September 2018. [Online]. Available: https://www.researchgate.net/publication/326346449_Blockchain-Based_Decentralised_CloudFog_Solutions_Challenges_Opportunities_and_Standards.

[24] S. Ellis, A. Juels and S. Nazarov, “ChainLink — A Decentralized Oracle Network,” September 2017. [Online]. Available: https://link.smartcontract.com/whitepaper.

[25] A. S. De Pedro, D. Levi and L. I. Cuende, “Witnet: A Decentralized Oracle Network Protocol,” November 2017. [Online]. Available: https://witnet.io/static/witnet-whitepaper.pdf.

[26] C. Song, S. Wu, S. Liu, R. Fang and Q.-L. Li, “ANKR — Build a Faster, Cheaper, Securer cloud using idle processing power in data centers and edge devices,” [Online]. Available: https://www.ankr.network/.

[27] T. Hanke, M. Movahedi and D. Williams, “Dfinity — The Internet Computer,” [Online]. Available: https://dfinity.org/faq/.

[28] T. Hanke, M. Movahedi and D. Williams, “DFINITY Technology Overview Series — Consensus Mechanism,” [Online]. Available: https://assets.ctfassets.net/ywqk17d3hsnp/2C8QU0x3q4AwuYYU4qiEmo/fb575987ca36672152a47b83ec96c0fc/dfinity-consensus.pdf.

[29] A. Yakovenko, “Solana: A new architecture for a high performance blockchain,” [Online]. Available: https://solana.com/solana-whitepaper.pdf.
