submitted by QuarkChain to quarkchainio
In which direction should DeFi develop in the next step?
The market is changing dramatically. The past few days have felt like riding a roller coaster. Yet after several rounds of fluctuation, the DeFi segment of the crypto market remains as hot as ever. The worries lurking under the surface, however, have never gone away.
Almost all resources in the DeFi ecosystem sit on Ethereum. But the DeFi network built on Ethereum has problems: the limited single-system performance that even the planned homogeneous sharding will bring, high gas fees, weak security, and poor scalability. These weaknesses make many applications hard to run on it, including high-frequency trading and order-matching modes of exchange (which is why we rely on the Uniswap asset-pool model today).
The problem with ETH 1.0 is that performance is limited and all transactions are mixed together without any organization. Although DeFi applications enjoy composability, the network has to run both DeFi applications and all other transactions and DApps at the same time.
Network congestion and skyrocketing gas fees

As we all know, Ethereum relies on the consumption of gas to run its economy: every step on the chain consumes gas. On March 12 and 13, 2020, Bitcoin plummeted almost 50% to $3,800 and ETH fell as much as 65.2%. The crash triggered a run: miner fees on Ethereum, which carries a large number of DeFi applications and DApps, skyrocketed, and the network became congested. Gas fees rose to ten times their usual level, at one point reaching as high as 1 ETH to get a transaction packaged. Since then, because the lending operations of DeFi applications require frequent interaction with contracts, gas fees on Ethereum have remained high.
Problems inherited from ERC20 tokens affect the DeFi products on Ethereum.

If you use Ethereum's native token ETH, the operation is simple: transfer ETH to the contract of the target DeFi application and the contract does the rest, much as we use cash to buy stocks or wealth-management products. No other step is required.
Tokens minted by ERC20 contracts, however, behave very differently from native ETH, no matter how well-known the token is (USDT, USDC, WBTC, and so on). Before trading, the user must first call the ERC20 contract to authorize the DeFi platform's contract to transfer a specified number of tokens from the account; only after this approval can the DeFi contract be called to move the funds. An intuitive analogy: to avoid typing a password for every small payment, we authorize PayPal to make password-free payments, so money is deducted directly at checkout. It sounds convenient, but is it really that good?
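The two-step approve/transferFrom flow described above can be sketched with a minimal in-memory model of ERC20 allowance bookkeeping (an illustration of the mechanics, not real contract code; the account and contract names are made up):

```python
# Minimal model of the ERC20 approve -> transferFrom flow described above.

class ERC20Token:
    def __init__(self, balances):
        self.balances = dict(balances)   # account -> token balance
        self.allowances = {}             # (owner, spender) -> approved amount

    def approve(self, owner, spender, amount):
        # Step 1: the user authorizes the DeFi contract to spend tokens.
        self.allowances[(owner, spender)] = amount

    def transfer_from(self, spender, owner, to, amount):
        # Step 2: the DeFi contract pulls tokens, limited by the allowance.
        allowed = self.allowances.get((owner, spender), 0)
        if amount > allowed or amount > self.balances.get(owner, 0):
            raise ValueError("allowance or balance exceeded")
        self.allowances[(owner, spender)] = allowed - amount
        self.balances[owner] -= amount
        self.balances[to] = self.balances.get(to, 0) + amount

usdt = ERC20Token({"alice": 100})
usdt.approve("alice", "defi_pool", 60)                     # approval tx
usdt.transfer_from("defi_pool", "alice", "defi_pool", 60)  # contract deposits
```

Note that the allowance caps what the spender can move; the danger comes when users grant an effectively unlimited allowance, since the cap then no longer protects them.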
There is a crucial problem here: if the DeFi contract is malicious, the approval gives it the right to transfer all of the approved ERC20 tokens in our account to any address. It is as if, after we authorize PayPal for password-free payments, a hacker who compromised PayPal could move all our money to his own account. Similar things have happened before.
A famous example is Bancor, which relied on this kind of ERC20 approval contract. A bug in the contract allowed it, once a user had granted approval, to transfer the tokens in the user's wallet to any address designated by an attacker, causing a loss of almost 100,000 US dollars.
The loss was limited only because it happened in the early days of DeFi. If the same thing happened today, with DeFi assets on Ethereum already reaching hundreds of millions of dollars, it would do severe damage to the entire Ethereum ecosystem and to the development of DeFi.
Cold shards and hot shards

DeFi needs composability, convenience, and stronger resistance to runs. If throughput is insufficient, sharding can be introduced, which is what ETH 2.0 does. But because of DeFi's composability, these applications tend to aggregate into one shard, producing a clustering effect: different shards end up hosting different content. We call these hot shards and cold shards, analogous to different kinds of cities: metropolises like New York and Tokyo versus quieter places like Kyoto or Alaska. Some places become Wall Street, while others become scenic or residential areas. Because different functions aggregate in different places, different shards develop different characteristics.
It would be unwise to develop algorithms that forcibly rebalance load across shards. That would amount to using a simple system to dictate the development of a complex one, much like a planned economy. Instead, we can design different features into shards in advance so that each can express its own character, just as humans transformed and used natural resources based on their understanding of nature, thereby improving efficiency. That means setting up shards with different performance profiles and even different consensus algorithms (PoW and PoS, for example, have different characteristics).
Perhaps there will be a major financial shard, like London, or a few other special shards with their own characters, like New York City and Chicago. Financial shards require high throughput and carry high costs: these are the hot shards, which handle large-value transactions (otherwise the gas fee would be too high relative to the transfer). Most people will live in the countryside, that is, on cold shards. When you need the hot shard's features, you don't need to live in Manhattan; you only need to travel there occasionally. Most of the time you live perfectly well on another shard, and when you really need to operate on a DeFi shard, a cross-shard transaction takes only a few minutes.
The problem this creates is that, since each shard has its own features, the shards may drift apart into isolation. What we need is shards that are harmonious yet distinct: cross-shard DeFi must be achieved. Today's heterogeneous multi-chain technology can help solve this problem, and only by solving it can more DeFi applications flourish.
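The cold-shard/hot-shard idea above can be sketched as a toy cross-shard transfer: funds live on a cheap "residential" shard and move to the expensive DeFi shard only when needed (purely illustrative; real cross-shard protocols involve receipts, proofs, and confirmation delays, all omitted here):

```python
# Toy model of moving funds between a cold home shard and a hot DeFi shard.

class Shard:
    def __init__(self, name, gas_price):
        self.name = name
        self.gas_price = gas_price   # hot shards charge more per unit of gas
        self.balances = {}

def cross_shard_transfer(src, dst, account, amount):
    # Debit on the source shard, credit on the destination shard.
    if src.balances.get(account, 0) < amount:
        raise ValueError("insufficient balance on source shard")
    src.balances[account] -= amount
    dst.balances[account] = dst.balances.get(account, 0) + amount

cold = Shard("residential", gas_price=1)
hot = Shard("defi", gas_price=50)
cold.balances["alice"] = 100
cross_shard_transfer(cold, hot, "alice", 30)  # move funds only when needed
```

The point of the design is that day-to-day activity stays on cheap shards, while the high-cost hot shard is touched only for the occasional large operation.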
In our opinion, a mature DeFi platform must have the following features:
Higher Efficiency: Have faster concurrent processing capabilities, i.e., high TPS.
Lower Gas Fee: Lower gas fee can stimulate the enthusiasm of DeFi users and even catalyze the development of high-frequency trading.
More Secure: Fewer interactive steps in contract calls, structured at least to avoid the problems ERC20 causes with its permission model, which complicates interactions, lengthens the operation chain, and multiplies loopholes.
Easier to Use: Various native tokens can be used to pay gas fees directly, with no need to hold a designated token just for gas.
Easier to Combine: Support for combining a wide range of contracts, including contracts under different consensus algorithms and ledger structures on the same chain, and even across chains, making DeFi a real "Lego".
Multi-chain heterogeneity + DeFi: one unhindered currency helps reach the ideal

Multi-chain heterogeneity has formed "cities" and "villages", and DeFi has become the financial center among the cities. Since we are using cities as the comparison, how do we keep each city from governing itself in isolation and instead link their interests into a greater network? The answer is the same as in real life: a currency that works everywhere.
Ethereum also provides such a currency, but it is not only inefficient; it also indirectly creates security risks. For long-term development, such a design is unreasonable.
On the QuarkChain mainnet, multi-native tokens are our primary tool for building the next generation of DeFi. Multi-native tokens have essentially the same status as QKC in the QuarkChain system: they can call contracts, perform cross-chain operations, and, under certain conditions, pay gas fees. Native tokens can do everything QKC can, including cross-chain transactions, except participate in QKC governance. This solves most of the non-native-asset inconveniences DeFi faces today. In future contracts the functions of native tokens will be exactly the same as QKC's, removing the last barrier to their adoption. It also avoids the weakening of the whole DeFi system's security caused by the ERC20 permission model. Next we will launch our DEX, and users will experience for themselves an unimpeded DeFi platform on QuarkChain. With that, the last piece of the puzzle of multi-chain heterogeneity + DeFi + multi-native tokens falls into place, raising cost efficiency, ease of use, and security to a new level.
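Paying gas in a non-QKC native token, as described above, can be sketched roughly as follows. This is a hypothetical model, not the actual QuarkChain implementation: the per-token gas rates, token names, and the flat rate table are all assumptions for illustration.

```python
# Hypothetical sketch: a node accepts gas payment in any enabled native
# token, converting gas units to a token amount via a per-token rate.

GAS_RATES = {"QKC": 1.0, "TOKEN_A": 2.0}   # assumed: token units per gas unit

def pay_gas(balances, sender, gas_used, gas_token="QKC"):
    if gas_token not in GAS_RATES:
        raise ValueError("token not enabled for gas payment")
    cost = gas_used * GAS_RATES[gas_token]
    if balances[sender].get(gas_token, 0) < cost:
        raise ValueError("insufficient gas token balance")
    balances[sender][gas_token] -= cost    # deduct gas in the chosen token
    return cost

balances = {"alice": {"TOKEN_A": 1000}}
cost = pay_gas(balances, "alice", gas_used=100, gas_token="TOKEN_A")
```

The user never needs to hold QKC just to transact, which is the usability point the paragraph above makes.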
Ethereum's performance and contract-security limitations have constrained development. After repeated iteration and extensive testing, our multi-native token function is ready to be officially delivered to the community. Soon, community members will be able to mint their own tokens and use them to transfer funds (including across shards), pay gas fees, call smart contracts directly, and more. Together with the DEX we will launch next, users can experience first-hand the convenience and innovation that multi-native tokens bring to a blockchain system.
To validate this design, we recently launched the Game of DeFi campaign. In its last stage we released a simple DEX application and a game: QSwap, a multi-native-token version of Uniswap, and Element Miner, a fun mining-and-trading game. Together they demonstrate, in game form, the new value that a DEX and game-based mining built on multi-native tokens can bring to DApps and DeFi. Because the gas fee is low enough, every step of the operation happens on chain, ensuring security. Unlike Ethereum's high gas fees, which force users to choose between high cost with low efficiency and low cost with low security, multi-native tokens deliver genuine security and convenience at the same time.
Our Game of DeFi campaign has already entered its final stage. Millions of QKC in reward pools are still waiting to be shared. Download the QPocket wallet to participate in this event.
Phase III: King's Landing — DEX and Liquidity Mining

In this phase, all community members can try our two new products:
QSwap: a multi-native-token version of Uniswap

Unlike Uniswap, which supports only ERC20 tokens, QSwap supports multi-native tokens. No pre-authorized approval step is required, and any multi-native token (not only QKC) can be used to pay the gas fee. Users get a better experience and stay safer by never granting unlimited authorization. Gas fees are also much lower thanks to the sharding technology of the QuarkChain infrastructure.
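Since QSwap follows Uniswap's asset-pool model, the pricing can be sketched with the standard constant-product formula. The 0.3% fee is Uniswap's rate and is assumed here; QSwap's actual fee schedule may differ.

```python
# Constant-product AMM quote in the Uniswap style that QSwap adapts.
# x * y = k is preserved on the fee-adjusted input amount.

def swap_output(reserve_in, reserve_out, amount_in, fee=0.003):
    amount_in_with_fee = amount_in * (1 - fee)
    return reserve_out * amount_in_with_fee / (reserve_in + amount_in_with_fee)

out = swap_output(reserve_in=1000.0, reserve_out=1000.0, amount_in=100.0)
# The pool quotes below 1:1 because of slippage plus the fee.
```

With equal reserves, selling 100 of one token returns a little over 90 of the other; larger trades relative to the pool incur proportionally more slippage.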
Element Miner: a fun mining-and-trading DApp game

The player's goal is to collect 5 elements to join the reward pool. Since the elements reinforce each other (just as mining throughput differs from project to project), using QSwap is the most efficient approach.
One last question: This DeFi campaign uses test tokens. What if the network uses tokens with real value?
submitted by UMITop to u/UMITop
Ethereum, the second cryptocurrency by market capitalization, is traditionally seen as fast and profitable. Over the last few weeks, however, it has hit a rough patch. Since early August the network has had huge queues of pending transactions while fees have skyrocketed past their historical high.
The main issue is that even fees of a few dollars per transfer don't clear the "traffic jams". The cause is the load generated by numerous DeFi projects and by the many financial pyramids built on the Ethereum platform.
The situation is downright unpleasant, and our users might wonder whether the UMI network could face a similar challenge. We'd like to assure you it could not. The UMI network is protected against these problems by design: it cannot have "traffic jams", fees, or financial pyramids. But first things first.
How has the Ethereum network ground to a halt?
In its report dated August 4, the cryptocurrency analysis firm Arcane Research stated that over the previous week the daily total of transaction fees on the Ethereum network had surged to its highest level in over two and a half years. On August 3, the median fee was $0.82, with overall transaction fees totaling $2 mln. But that was only the start of the real problems.
Over the next week fees continued to grow, and by August 11 the median fee had almost doubled to $1.57. Larry Cermak, an expert at the major crypto analytics and news portal The Block, wrote in an August 15 tweet that weekly transaction fees on the Ethereum network had totaled $34.5 mln, surpassing the historical high. Meanwhile, on the Bitcoin network, often considered too expensive, fees were almost four times lower at $9 mln.
The total fee amount paid by cryptocurrency users over a week:
Historical Growth Chart for Ethereum Fees. Source
The current situation shows that Ethereum is not as fast and cheap as commonly claimed. Moreover, the same could happen to almost any cryptocurrency except UMI, which charges no fees whatsoever. We will tell you why.
Why have these problems emerged?
There is nothing surprising here: the Ethereum network simply can't handle the increased load. Arcane Research analysts consider the principal cause to be the constantly growing number of DeFi projects built on the Ethereum blockchain, which overload the network. As of August 12, the total value locked in DeFi applications reached $4.3 billion, 19.5% higher than the week before; at the time of writing, it had surged to $6.21 billion. You can see the current data here. The most unpleasant thing about DeFi protocols is that many of them are scam projects.
And that is not even the worst part. Another factor significantly slows the Ethereum network down: the many pyramid-like projects built on the platform with smart contracts. One of them, SmartWay Forsage, regularly overloads the network with a large number of transactions, causes traffic jams, and consequently drives fees up (keep in mind that Ethereum miners pick the transactions with higher fees first). Vitalik Buterin, the co-founder of Ethereum, voiced his disapproval of SmartWay Forsage's methods and asked them to "leave and not pollute Ethereum ecology in the future". The project has been slow to comply and continues to deceive users.
This is only the tip of the iceberg of scam projects abounding on the Ethereum network: they continually emerge, run for a while, collapse as scams, and are replaced by new ones. This never-ending stream of "investment projects" based on the Ponzi scheme overloads the system. It is why Adam Back, a crypto-industry pioneer and founder of the technology company Blockstream, compared Ethereum to such infamous projects as OneCoin and BitConnect. Back's pointed dig at Ethereum became the subject of much debate among crypto enthusiasts.
Of course, none of this means Ethereum is a bad cryptocurrency; on the contrary, it has many advantages over other coins. But recent events expose faults that must be eliminated, and it is far from certain that they can be: the system has huge scalability problems, and the developers may simply not be able to get rid of all the defects.
The crypto community has to admit that Ethereum, like other first-generation cryptocurrencies, has issues with capacity, fees, and scalability and is gradually becoming obsolete.
2020 is the time for young innovative cryptocurrencies such as UMI.
UMI is the flagship of new-generation cryptocurrencies.
In fact, any cryptocurrency could face this. Every cryptocurrency charges fees, which typically surge when the network is overloaded or the price is rising. Everyone remembers 2017, when amid price growth and network overload the Bitcoin transaction fee peaked at around $40.
UMI, however, works the other way round. The UMI network's advantages are high capacity, no fees, and room to scale. It uses the best and fastest solutions in the crypto industry and excludes inefficient methods by design. Smart optimization combined with Proof-of-Authority technology operating on master nodes enables almost instant payments.
At the stage of network testing, an incredibly high capacity was achieved:
The UMI network can process in a few days the transactions Ethereum processes in a year, with no fees. More details
What is more, less than 0.001% of the network's overall capacity is currently in use. The UMI network has vast reserve capacity and can handle a load hundreds of thousands of times heavier. With its scaling possibilities, UMI can also keep up with the times: the UMI code allows the safe introduction of upgrades, so the network can be easily modified and scaled with cutting-edge technology. In other words, traffic jams will never pose a problem for us. UMI will process all transactions instantly, with no fees. Always.
A real-time speedometer displays the number of transactions processed by the UMI network per second. Link
Additionally, unlike Ethereum and other cryptocurrencies, UMI's staking smart contract rules out pyramid schemes, eliminating their negative influence. Our staking is completely safe and secured against scammers; read more about this in our article. Any UMI staking structure can work indefinitely. In other words, you can multiply your coins at a rate of up to 40% per month for an indefinitely long period.
UMI doesn't inherit the disadvantages of the first-generation cryptocurrencies. This is an innovative, carefully designed network based on state-of-the-art technologies. UMI is an ambitious step toward the future. And we're making it together right now!
Sincerely yours, UMI team
With the development of blockchain technology, obtaining data that lives only on the chain is no longer enough, and bridging the real world and the blockchain world has long been the direction of technological breakthrough. Against this background, the oracle machine came to our attention. In particular, with the popularity of the DeFi concept, the industry is witnessing a boom in oracle applications across financial derivatives, trading platforms, gambling games, and prediction markets.

submitted by ThemisOracle to u/ThemisOracle
At present, oracle machines represented by Themis are developing fast and with good momentum, leading the trend and continuing to consolidate the technical foundation of the DeFi revolution. Themis' mining system has launched in the market and is refreshing and appealing (see https://themisoracle.com/#/credit for details on Themis mining).
90% of MIS, the native token of Themis, will be issued as mining output. The entire mining mechanism runs on a distributed oracle protocol with three roles: data provider, data validator, and arbitration node. Reward and punishment mechanisms keep the ecosystem running smoothly.
How does Themis mining work? Is it a new way to become wealthy? What are the characteristics? To answer these questions, we need to analyse the distribution mechanism, mining mechanism, and token value of Themis.
With a fairer mining mechanism, small and medium-sized miners can enjoy better benefits
One of the core values of blockchain is fairness: letting everyone in the network play a role in the system without permission. Yet Bitcoin mining is now monopolized by a few mining-machine vendors such as Bitmain, leaving little room for other miners to participate. If old PoW chains like Bitcoin have developed this head effect in mining, what about newer projects? Take Cosmos as an example: since Binance joined as a validator node, it has instantly ranked at the top on the financial strength and user base of a top exchange, making it hard for small and medium nodes to participate.
By comparison, the mining mechanism of MIS is very friendly to ordinary users. Assuming there are 12 mining transactions in a block, they are ranked by the amount of MIS each transaction pledges, as follows:
The pledge ranking uses a jump-ranking weighting algorithm rather than a weighted average of user pledge amounts. This prevents MIS from being controlled by a small number of people, avoids monopoly, and creates a win-win situation in the Themis community.
Compared with other mining projects, Themis has introduced a unique pledge ranking method in the mining design. Users in the best ranking area will get the most benefits, which is a good mechanism guarantee for attracting more users to participate in mining. At the same time, it can lead to the decentralization of data providers, ensuring the decentralization of the oracle system and the positive development of the community.
How can miners join in Themis mining? The answer is to become a part of the ecology by playing the role of either data provider, data validator, or arbitration node.
The data provider is mainly responsible for providing various types of data; the data validator verifies and challenges the data offered by the provider and supplies new data; and the arbitration node arbitrates disputes raised by validators and produces the final result.
Both data providers and validators of Themis must pledge MIS to qualify, and callers of external data also pay in MIS when accessing Themis oracles. If the data is verified as correct, providers and validators receive mining rewards, and the more they pledge, the more rewards they receive.
In Themis' mining design, miners earn MIS by providing verifiable random numbers or by quoting prices of on-chain assets. When miners call the mining contracts, the system charges no service fee of its own (the ETH network fee still applies). In addition, if no mining transaction occurs within a certain period, the first new block containing a mining transaction receives all the accumulated MIS rewards. This encourages miners to keep mining and maintains the ecological stability of Themis.
The number of MIS mining for each mining transaction of miners is calculated as follows:
First, calculate N, the number of MIS mining rewards contained in the block that packages the mining transaction. If the height difference between this block and the previous block containing a mining transaction is y, then N = y × 20.
The MIS mining quantity of this mining transaction is M = X_i / (X_1 + X_2 + … + X_n) × N, where X_i is the weight given by the transaction's rank by MIS pledge amount in the block, and transactions pledging the same amount of MIS share the same rank.
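The two formulas above can be sketched directly in code. One assumption: the obscured denominator in the original formula is read as the sum of all X weights in the block, which is the natural interpretation of a proportional split that sums to N.

```python
# Per-transaction MIS reward, following the formulas above:
# N = y * 20 for the block, M_i = X_i / sum(X) * N per mining transaction.

def block_reward(height_gap_y):
    # N: total MIS minted for this block, given the height gap y.
    return height_gap_y * 20

def tx_rewards(weights, height_gap_y):
    # Split N among mining transactions in proportion to their weights X_i.
    n = block_reward(height_gap_y)
    total = sum(weights)
    return [x / total * n for x in weights]

rewards = tx_rewards(weights=[3, 2, 1], height_gap_y=1)  # N = 20 MIS
```

With weights 3:2:1 the 20 MIS split is 10, 6.67, and 3.33, so the whole block reward is always distributed.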
Little official pre-mining: 90% belongs to the community
Based on the official announcement, the distribution of MIS is:
The total supply of MIS is 1 billion. 10% is reserved for early project promotion; the remaining 90% is produced by mining, of which 75% is awarded directly to data providers, 10% to developers, and 5% to arbitration nodes and ecological incentives. Mining output decreases progressively and is released alongside ETH blocks. In many currently popular VC-backed projects, institutions hold more than half the tokens, with monthly unlocks that put heavy pressure on ordinary pledge users; many projects have also gone wrong because institutional investors did not abide by the rules. Because MIS has little official pre-mining, selling pressure will be smaller, which is more in line with the values of blockchain.
The release plan of developer and arbitration node and ecological incentive is as follows:
The release plan of data provider incentive is as follows:
The MIS awarded per block is reduced by 10% every 4 million blocks; the current reward is 20 MIS per block.
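That release schedule is a simple geometric decay, sketched below as a direct translation of the stated rule (start at 20 MIS, cut 10% every 4,000,000 blocks):

```python
# Per-block MIS reward under the stated schedule: 20 MIS initially,
# reduced by 10% at every 4,000,000-block interval.

def mis_block_reward(block_height, initial=20.0, decay=0.10, interval=4_000_000):
    eras = block_height // interval          # completed reduction intervals
    return initial * (1 - decay) ** eras

# After the first 4M blocks the reward drops from 20 to 18 MIS per block.
```

Summing the geometric series shows the schedule converges toward the fixed 900 million mining pool rather than emitting forever at a flat rate.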
We can see that the allocation of MIS follows these principles.
First of all, as MIS is the platform certificate of Themis, it is very reasonable to reserve 10% of MIS for early project promotion.
Secondly, 90% of MIS is produced through sustainable mining. This proportion can motivate contract users and miners to conduct contract mining, truly implementing the spirit of win-win community and token economy.
Finally, within the 90% produced by mining, better incentive mechanisms have been adopted and reward ratios subdivided, which can attract more investors to participate in mining.
Reasonable mining mechanism highlights the project value of Themis
Themis, as a public chain that provides a mechanism for solving the problems of oracle machines, gives MIS a unique appeal.
From the perspective of token supply, the total amount of MIS is 1 billion with a total mining pool of 900 million. 90% of the tokens are generated by mining, and mining output decreases gradually with the Ethereum block height, giving great potential for future appreciation. The earlier you participate in mining, the more profit you can gain.
From the perspective of Themis' ecological design, Themis stays committed to its original goal of building a price oracle. Data providers pay on-chain fees and pledge a certain amount of MIS, with income determined by the scale of the pledge; validators profit from challenging data; and any smart contract developer or user pays a corresponding fee when calling Themis, which is distributed to data providers in proportion. This design closes the logical loop, keeps the whole ecology healthy, and achieves mutual benefit: all parties in the ecology can work together to grow wealth.
In all, MIS has huge potential for future development and appreciation, and of course great profit potential as well.
Today, public chains like Themis are not just technology platforms but a symbol of a future economic mode that connects the blockchain and the real world. With a fair, just, and open network built through mining, Themis is building a strong token ecology, connecting off-chain data with on-chain systems, realizing data interaction between the blockchain and the real world, and, more importantly, creating a new mode of token economy.
submitted by CelesOS to u/CelesOS
The consensus mechanism is one of the key elements of a blockchain and the core rule by which a distributed ledger operates. It solves the problem of trust between people and determines who is responsible for generating new blocks and maintaining the effective unity of the blockchain system. As such, it has been an evergreen research topic in blockchain.
This article starts with the concept and role of the consensus mechanism, giving the reader a preliminary overall understanding. It then introduces the evolution of consensus mechanisms, starting from the Two Generals and Byzantine Generals problems and proceeding in the order in which the mechanisms were proposed. Next, it briefly introduces today's mainstream consensus mechanisms in terms of concept, working principle, and representative projects, and compares their advantages and disadvantages. Finally, it offers suggestions on how to choose a consensus mechanism for a blockchain project and points out possible directions for future development.
First, concept and function of the consensus mechanism
1.1 Concept: The core rules for the normal operation of distributed ledgers
1.2 Role: Solve the trust problem and decide the generation and maintenance of new blocks
1.2.1 Used to solve the trust problem between people
1.2.2 Used to decide who is responsible for generating new blocks and maintaining effective unity in the blockchain system
1.3 Mainstream model of consensus algorithm
Second, the origin of the consensus mechanism
2.1 The two armies and the Byzantine generals
2.1.1 The two armies problem
2.1.2 The Byzantine generals problem
2.2 Development history of consensus mechanism
2.2.1 Classification of consensus mechanism
2.2.2 Development frontier of consensus mechanism
Third, Common Consensus System
Fourth, Selection of consensus mechanism and summary of current situation
4.1 How to choose a consensus mechanism that suits you
4.1.1 Determine whether the final result is important
4.1.2 Determine how fast the application process needs to be
4.1.3 Determine the degree of decentralization the application requires
4.1.4 Determine whether the system can be terminated
4.1.5 Select a suitable consensus algorithm after weighing the advantages and disadvantages
4.2 Future development of consensus mechanism
Last lecture review: Chapter 1 Concept and Function of Consensus Mechanism plus Chapter 2 Origin of Consensus Mechanism
Last lecture review: Chapter 3 Common Consensus Mechanisms
Chapter 3 Common Consensus Mechanisms (Part 2)
Figure 6 Summary of relatively mainstream consensus mechanisms
Source: Hasib Anwar, "Consensus Algorithms: The Root Of The Blockchain Technology"
The picture above shows 14 relatively mainstream consensus mechanisms summarized by the geek Hasib Anwar: PoW (Proof of Work), PoS (Proof of Stake), DPoS (Delegated Proof of Stake), LPoS (Leased Proof of Stake), PoET (Proof of Elapsed Time), PBFT (Practical Byzantine Fault Tolerance), SBFT (Simple Byzantine Fault Tolerance), DBFT (Delegated Byzantine Fault Tolerance), DAG (Directed Acyclic Graph), Proof-of-Activity, Proof-of-Importance, Proof-of-Capacity, Proof-of-Burn, and Proof-of-Weight.
Next, we will mainly introduce and analyze the top ten consensus mechanisms of the current blockchain.
Delegated Byzantine fault tolerance: an improved Byzantine fault-tolerant algorithm adapted for blockchain systems. The system consists of ordinary nodes, delegates (who can approve blocks), and a speaker (who proposes the next block). It is the fault-tolerant consensus algorithm implemented inside the NEO blockchain.
In this mechanism, there are two participants: the professional bookkeeper "bookkeeping node" and the ordinary users in the system.
Ordinary users vote, weighted by the stake they hold, to determine the bookkeeping nodes. When consensus is required, a speaker is randomly selected from these bookkeeping nodes to draw up a proposal, and the other bookkeeping nodes then vote according to the Byzantine fault tolerance algorithm, i.e., the majority principle: if more than 66% of the nodes approve the speaker's proposal, consensus is reached; otherwise a new speaker is elected and the voting process repeats.
-Representative application: Neo, etc.
Proof of authority. Transactions are certified by a set of accredited accounts, called "validators". Validators run software that places transactions into blocks.
-Representative applications: VeChain, etc.
Directed acyclic graph. Each unit newly added to the DAG is attached not to a single longest chain but to earlier units: a new unit verifies and confirms its parent units, the parents of those parents, and so on back to the genesis unit. Since each unit includes the hash of its parent units, all transactions are linked together over time into a graph-like structure.
In a DAG network, every node can be both a trader and a validator, because transaction processing is done by the transacting nodes themselves. Taking IOTA as an example, IOTA's Tangle ledger charges no transaction fees while still ensuring high-speed transaction processing. That does not mean transactions come without work: to initiate a transaction, a node must first verify two other randomly chosen transactions and attach its own transaction to them, so the responsibility that miners bear on a blockchain is distributed across all traders. This way of processing transactions can be called an asynchronous processing mode.
Figure 10 The difference between the traditional blockchain structure and the DAG structure
-Representative applications: IOTA, etc.
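The tangle-style attachment rule described above can be sketched in a few lines of Python. This is a toy model under assumed simplifications (no proof-of-work per transaction, uniform random tip selection); IOTA's actual tip-selection algorithm is considerably more involved:

```python
import random

def attach(dag, tx, rng=random):
    """Attach a new transaction by approving two existing tips (toy model).
    A 'tip' is a unit that no other unit references as a parent yet."""
    tips = [t for t in dag if not any(t in parents for parents in dag.values())]
    parents = rng.sample(tips, 2) if len(tips) >= 2 else tips[:]
    dag[tx] = parents        # store the parent references (their hashes, in IOTA)
    return parents

# Start from a genesis unit with no parents.
dag = {"genesis": []}
dag["tx1"] = ["genesis"]
dag["tx2"] = ["genesis"]
attach(dag, "tx3")           # tx3 approves the two current tips, tx1 and tx2
```

Each new transaction thus confirms two earlier ones, which is how validation work gets spread across all participants instead of a dedicated miner set.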
Proof of elapsed time. Usually used in permissioned blockchain networks, it determines which participant wins the right to produce the next block. A permissioned network requires prospective participants to verify their identity before joining. Following the principle of a fair lottery, each node is equally likely to become the winner.
Each participating node must wait for a randomly chosen period, and the first node to complete its waiting time produces the next block. Every node in the network generates a random waiting time and sleeps for that duration; the node that wakes up first, i.e., the one with the shortest waiting time, submits a new block to the blockchain and broadcasts the necessary information to the whole peer-to-peer network. The same process repeats to find the next block.
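The fair-lottery idea behind PoET reduces to "everyone draws a random wait; the shortest wait wins." A minimal sketch, assuming trusted random timers (in practice the trusted execution environment, e.g. Intel SGX, is what guarantees nodes cannot cheat on their wait times):

```python
import random

def poet_round(nodes, rng=random):
    """Each node draws a random wait time; the node with the
    shortest wait wins the right to produce the block (toy sketch)."""
    waits = {node: rng.uniform(0, 10) for node in nodes}
    winner = min(waits, key=waits.get)
    return winner, waits

winner, waits = poet_round(["A", "B", "C"])
```

Because the draws are uniform and independent, each node is equally likely to win any given round.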
Proof of stake velocity. Proposed by Reddcoin, drawing on the concept of "money circulation speed" in economics, it mainly allocates bookkeeping rights based on the coin age of nodes participating in the competition.
PoSV also allocates accounting rights according to the coin age of the nodes participating in the competition, but modifies the coin age calculation formula into a function whose growth rate decays exponentially. Taking Reddcoin as an example, Reddcoin sets the half-life of the coin age growth rate to 1 month. Assuming a unit token can accumulate 1 CoinDay of coin age on the first day, it can accumulate only 0.5 CoinDay on the 31st day, only 0.25 CoinDay on the 61st day, and so on. In this way, nodes are encouraged to transact with their tokens after holding them for a period of time, thereby restarting the coin age calculation and increasing the circulation speed of the token in the network.
-Representative applications: Reddcoin, etc.
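The decay curve in the Reddcoin example above can be written directly from the stated numbers. This is an illustrative reconstruction fitted to the figures in the text (1 CoinDay on day 1, 0.5 on day 31, 0.25 on day 61), not Reddcoin's actual implementation:

```python
def coin_day_rate(day, half_life_days=30):
    """CoinDay accrued on a given holding day, with the accrual rate
    halving every month (reconstruction of the PoSV decay curve)."""
    return 0.5 ** ((day - 1) / half_life_days)

# Matches the example: day 1 -> 1.0, day 31 -> 0.5, day 61 -> 0.25
rates = [coin_day_rate(d) for d in (1, 31, 61)]
```

Total coin age for a holding period is then the sum of `coin_day_rate` over the days held, which grows ever more slowly the longer the tokens sit idle.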
Table 2 Comparison of the advantages and disadvantages of current mainstream consensus mechanisms
Source: network resources
Chapter 4 Summary of the Selection and Status Quo of Consensus Mechanism
4.1 How to choose a consensus mechanism that suits you
Step 1: Determine whether the final result is important
For some applications, finality is very important; for others it is not. If you are building a new payment system that supports very small amounts, it may be acceptable for a transaction result to occasionally change. Similarly, if you are creating a new distributed social network, a 100% guarantee that status updates are immediately final is not particularly necessary. On the contrary, for other distributed protocols, fast finality is critical to the user experience. For example, Bitcoin reaches final confirmation in about 1 hour, Ethereum in about 6 minutes, and Tendermint Core in about 1 second.
Step 2: Determine how fast the application process needs to be
If you are building a game, is it reasonable to wait 15 seconds before each action? Ethereum's block time and limited throughput would give games built on it a poor user experience, although an application for transferring housing property rights could run on Ethereum comfortably. The Cosmos SDK lets developers build applications on Tendermint Core, which has a short block processing time and high throughput, capable of processing on the order of 10,000 transactions per second. You can reduce the required communication overhead and speed up the application by capping the number of validators for the application.
Step 3: Determine the application's demand for decentralization
Some applications, such as games, may not require the strong censorship resistance that comes as a by-product of decentralization: in theory, does it really matter if validators could form a cartel in a game and reverse transaction results for profit? If not, a blockchain such as EOS may better suit your needs because of its fast transactions and free fees. Other applications, such as autonomous banks, need much stronger decentralization. Although Ethereum is considered decentralized, some argue that its mining pools are a centralizing force, since in practice only around 11 validators (mining pools) produce most blocks. One major benefit of building your own blockchain instead of building on a smart contract platform is that you can customize how the application is validated. Building your own blockchain is hard, however, which is where the Cosmos SDK is useful: you can easily build your own blockchain and customize the degree of decentralization you need.
Step 4: Determine whether the system can be terminated
If you are building a new application similar to a distributed ride-sharing service, then ensuring 24/7 availability must be the first priority, even at the cost of occasional accounting errors. One property of Tendermint Core is that if the network's validators disagree, the network suspends operation rather than processing erroneous transactions. Applications such as decentralized exchanges instead require correctness at all costs: if something goes wrong, it is far better to suspend trading on the exchange than to risk erroneous trades.
Summary: Choose a suitable consensus algorithm after weighing the advantages and disadvantages
All in all, there is no single best consensus algorithm. Each consensus algorithm has its own value and advantages. You need to have your own judgments and choices. However, by understanding the relevant processes of the consensus mechanism, including proposals and agreements, and establishing a framework to consider the types of consensus algorithms that your application may require, you should be able to make wiser decisions.
4.2 Future development of consensus mechanism
The consensus algorithm is one of the core elements of the blockchain. Although more than 30 consensus mechanisms are listed in this article, there are still many niche mechanisms not discussed here. As blockchain technology becomes known and accepted by the public, more and better consensus algorithms may appear in the future; some may be brand new, but most are likely to be improvements and optimizations of current algorithms.
After the rapid development of 2016 and 2017, current consensus algorithms still have no recognized evaluation standard; discussion generally leans toward fairness and decentralization, along with technical issues such as energy consumption, scalability, fault tolerance, and security. Blockchain technology, however, must be combined with real requirements and application scenarios, and the consensus mechanism is inseparable from the incentive mechanism. How to customize a suitable consensus mechanism for the characteristics of one's own project, and how to optimize current mechanisms, will be the future direction of consensus mechanism development.
As the first DPoW financial blockchain operating system, CelesOS adopts "consensus mechanism 3.0" to break through the "impossible triangle", providing high TPS while preserving decentralization. It is committed to creating a financial blockchain operating system that embraces supervision, serving financial institutions and the development of applications on a supervised chain, and formulating a role-and-consensus ecological supervision layer agreement for regulators.
The CelesOS team is dedicated to building a bridge between blockchain and regulatory agencies/financial industry. We believe that only blockchain technology that cooperates with regulators will have a real future. We believe in and contribute to achieving this goal.
submitted by D-platform to u/D-platform [link] [comments]
1. What is Bitcoin (BTC)?
2. Bitcoin’s core features
For a beginner’s introduction to Bitcoin, please visit Binance Academy’s guide to Bitcoin.
Unspent Transaction Output (UTXO) model
A UTXO transaction works like a cash payment between two parties: Alice gives money to Bob and receives change (i.e., the unspent amount). In comparison, blockchains like Ethereum rely on the account model.
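The Alice-and-Bob cash analogy can be made concrete with a toy ledger. This is a minimal sketch under assumed simplifications (no scripts, no signatures, no fees); the names and amounts are hypothetical:

```python
# Toy UTXO set: Alice pays Bob 3 coins from a 5-coin output and takes 2 back as change.
utxos = {("tx0", 0): ("alice", 5)}      # (txid, output index) -> (owner, amount)

def spend(utxos, inputs, outputs, new_txid):
    """Consume the referenced outputs and create new unspent outputs."""
    spent = sum(utxos.pop(i)[1] for i in inputs)       # inputs must exist and are destroyed
    assert spent == sum(amt for _, amt in outputs)     # no value created (fees ignored here)
    for idx, (owner, amt) in enumerate(outputs):
        utxos[(new_txid, idx)] = (owner, amt)

spend(utxos, [("tx0", 0)], [("bob", 3), ("alice", 2)], "tx1")
```

After the spend, the original 5-coin output no longer exists; the ledger holds only the two new outputs, exactly like handing over a banknote and receiving change.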
Nakamoto consensus
In the Bitcoin network, anyone can join the network and become a bookkeeping service provider, i.e., a validator. All validators are allowed in the race to become the block producer for the next block, yet only the first to complete a computationally heavy task will win. This feature is called Proof of Work (PoW).
The probability that any single validator finishes the task first equals that validator's share of the total network computation power, or hash power. For instance, a validator with 5% of the total network computation power will have a 5% chance of completing the task first, and therefore of becoming the next block producer.
Since anyone can join the race, competition is prone to increase. In the early days, Bitcoin mining was mostly done by personal computer CPUs.
As of today, Bitcoin validators, or miners, have opted for dedicated and more powerful devices such as machines based on Application-Specific Integrated Circuit (“ASIC”).
Proof of Work secures the network as block producers must have spent resources external to the network (i.e., money to pay electricity), and can provide proof to other participants that they did so.
With various miners competing for block rewards, it becomes difficult for one single malicious party to gain network majority (defined as more than 51% of the network’s hash power in the Nakamoto consensus mechanism). The ability to rearrange transactions via 51% attacks indicates another feature of the Nakamoto consensus: the finality of transactions is only probabilistic.
Once a block is produced, it is then propagated by the block producer to all other validators to check on the validity of all transactions in that block. The block producer will receive rewards in the network’s native currency (i.e., bitcoin) as all validators approve the block and update their ledgers.
Block production
The Bitcoin protocol utilizes the Merkle tree data structure in order to organize hashes of numerous individual transactions into each block. This concept is named after Ralph Merkle, who patented it in 1979.
With the use of a Merkle tree, though each block might contain thousands of transactions, all of their hashes can be combined and condensed into one, allowing efficient and secure verification of the whole group of transactions. This single hash is called the Merkle root, and it is stored in the block header. The block header also stores other metadata of a block, such as the hash of the previous block header, which links blocks together in a chain-like structure (hence the name “blockchain”).
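The Merkle root construction can be sketched in Python. This follows Bitcoin's conventions (double SHA-256, duplicating the last hash on odd-sized levels), though it omits the byte-order details of the real serialization format; the transaction payloads are hypothetical:

```python
import hashlib

def dsha256(data):
    """Bitcoin-style double SHA-256."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def merkle_root(tx_hashes):
    """Pairwise-hash each level until one hash remains; odd-sized levels
    duplicate their last entry, as Bitcoin does."""
    level = list(tx_hashes)
    if not level:
        raise ValueError("a block must contain at least one transaction")
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        level = [dsha256(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

txs = [dsha256(t) for t in (b"tx-a", b"tx-b", b"tx-c")]
root = merkle_root(txs)     # the 32-byte digest stored in the block header
```

Verifying that one transaction belongs to a block then needs only the sibling hashes along its path to the root, rather than every transaction in the block.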
An illustration of block production in the Bitcoin Protocol is demonstrated below.
Block time and mining difficulty
Block time is the period required to create the next block in a network. As mentioned above, the node who solves the computationally intensive task will be allowed to produce the next block. Therefore, block time is directly correlated to the amount of time it takes for a node to find a solution to the task. The Bitcoin protocol sets a target block time of 10 minutes, and attempts to achieve this by introducing a variable named mining difficulty.
Mining difficulty refers to how difficult it is for the node to solve the computationally intensive task. If the network sets a high difficulty for the task, while miners have low computational power, which is often referred to as “hashrate”, it would statistically take longer for the nodes to get an answer for the task. If the difficulty is low, but miners have rather strong computational power, statistically, some nodes will be able to solve the task quickly.
Therefore, the 10 minute target block time is achieved by constantly and automatically adjusting the mining difficulty according to how much computational power there is amongst the nodes. The average block time of the network is evaluated after a certain number of blocks, and if it is greater than the expected block time, the difficulty level will decrease; if it is less than the expected block time, the difficulty level will increase.
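The adjustment rule described above can be sketched as follows. Bitcoin retargets every 2016 blocks (about two weeks at 10-minute blocks) and clamps each adjustment to a factor of 4 in either direction; the function below models that behavior in terms of difficulty:

```python
def retarget(old_difficulty, actual_timespan, expected_timespan=2016 * 600):
    """Scale difficulty so blocks return to ~10-minute spacing.
    If the last 2016 blocks arrived too fast, difficulty rises;
    too slow, it falls. The change is clamped to 4x either way."""
    actual = max(expected_timespan // 4, min(actual_timespan, expected_timespan * 4))
    return old_difficulty * expected_timespan / actual

# Blocks came twice as fast as intended -> difficulty doubles.
doubled = retarget(1000, 2016 * 300)   # 2000.0
```

The clamp prevents a single retarget from overreacting to sudden swings in network hash power.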
What are orphan blocks?
In a PoW blockchain network, if the block time is too low, it increases the likelihood of nodes producing orphan blocks, for which they receive no reward. Orphan blocks are produced by nodes who solved the task but, due to network latency, were not the quickest to broadcast their results to the whole network.
It takes time for a message to travel through a network, and it is entirely possible for 2 nodes to complete the task and start to broadcast their results to the network at roughly the same time, while one’s messages are received by all other nodes earlier as the node has low latency.
Imagine there is a network latency of 1 minute and a target block time of 2 minutes. A node could solve the task in around 1 minute but his message would take 1 minute to reach the rest of the nodes that are still working on the solution. While his message travels through the network, all the work done by all other nodes during that 1 minute, even if these nodes also complete the task, would go to waste. In this case, 50% of the computational power contributed to the network is wasted.
The percentage of wasted computational power would proportionally decrease if the mining difficulty were higher, as it would statistically take longer for miners to complete the task. In other words, if the mining difficulty, and therefore targeted block time is low, miners with powerful and often centralized mining facilities would get a higher chance of becoming the block producer, while the participation of weaker miners would become in vain. This introduces possible centralization and weakens the overall security of the network.
However, given the limited number of transactions that can be stored in a block, making the block time too long would decrease the number of transactions the network can process per second, negatively affecting network scalability.
3. Bitcoin’s additional features
Segregated Witness (SegWit)
Segregated Witness, often abbreviated as SegWit, is a protocol upgrade proposal that went live in August 2017.
SegWit separates witness signatures from transaction-related data. Witness signatures in legacy Bitcoin blocks often take more than 50% of the block size. By removing witness signatures from the transaction block, this protocol upgrade effectively increases the number of transactions that can be stored in a single block, enabling the network to handle more transactions per second. As a result, SegWit increases the scalability of Nakamoto consensus-based blockchain networks like Bitcoin and Litecoin.
SegWit also makes transactions cheaper. Since transaction fees are derived from how much data is being processed by the block producer, the more transactions that can be stored in a 1MB block, the cheaper individual transactions become.
The legacy Bitcoin block has a block size limit of 1 megabyte, and any change on the block size would require a network hard-fork. On August 1st 2017, the first hard-fork occurred, leading to the creation of Bitcoin Cash (“BCH”), which introduced an 8 megabyte block size limit.
Conversely, Segregated Witness was a soft-fork: it never changed the transaction block size limit of the network. Instead, it added an extended block with an upper limit of 3 megabytes, which contains solely witness signatures, to the 1 megabyte block that contains only transaction data. This new block type can be processed even by nodes that have not completed the SegWit protocol upgrade.
Furthermore, the separation of witness signatures from transaction data solves the malleability issue with the original Bitcoin protocol. Without Segregated Witness, these signatures could be altered before the block is validated by miners. Indeed, alterations can be done in such a way that if the system does a mathematical check, the signature would still be valid. However, since the values in the signature are changed, the two signatures would create vastly different hash values.
For instance, if a witness signature states “6,” it has a mathematical value of 6, and would create a hash value of 12345. However, if the witness signature were changed to “06”, it would maintain a mathematical value of 6 while creating a (faulty) hash value of 67890.
Since the mathematical values are the same, the altered signature remains a valid signature. This would create a bookkeeping issue, as transactions in Nakamoto consensus-based blockchain networks are documented with these hash values, or transaction IDs. Effectively, one can alter a transaction ID to a new one, and the new ID can still be valid.
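The "6" versus "06" example can be demonstrated directly: the two encodings carry the same numeric value, yet hashing the raw bytes yields entirely different digests. Here SHA-256 stands in for the transaction-ID hashing; the hash values in the text above (12345, 67890) were illustrative, not real digests:

```python
import hashlib

sig_a, sig_b = b"6", b"06"          # two encodings of the same mathematical value
same_value = int(sig_a) == int(sig_b)

# Hashing the raw bytes gives completely different digests, so a transaction ID
# derived from the signed data changes even though the signature stays valid.
h_a = hashlib.sha256(sig_a).hexdigest()
h_b = hashlib.sha256(sig_b).hexdigest()
```

This is exactly why moving witness data out of the transaction-ID computation, as SegWit does, removes the malleability: the ID no longer depends on bytes a third party can re-encode.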
This can create many issues. For example, while a transaction is still unconfirmed, a third party could alter its signature encoding and hence its transaction ID; the payment would still confirm, but under an ID the sender does not recognize, making it appear that the original transaction never went through.
Since the transaction malleability issue is fixed, Segregated Witness also enables the proper functioning of second-layer scalability solutions on the Bitcoin protocol, such as the Lightning Network.
Lightning Network
Lightning Network is a second-layer micropayment solution for scalability.
Specifically, Lightning Network aims to enable near-instant and low-cost payments between merchants and customers that wish to use bitcoins.
Lightning Network was conceptualized in a whitepaper by Joseph Poon and Thaddeus Dryja in 2015. Since then, it has been implemented by multiple companies. The most prominent of them include Blockstream, Lightning Labs, and ACINQ.
A list of curated resources relevant to Lightning Network can be found here.
In the Lightning Network, if a customer wishes to transact with a merchant, both of them need to open a payment channel, which operates off the Bitcoin blockchain (i.e., off-chain vs. on-chain). None of the transaction details from this payment channel are recorded on the blockchain, and only when the channel is closed will the end result of both party’s wallet balances be updated to the blockchain. The blockchain only serves as a settlement layer for Lightning transactions.
Since all transactions done via the payment channel are conducted independently of the Nakamoto consensus, both parties involved in transactions do not need to wait for network confirmation on transactions. Instead, transacting parties would pay transaction fees to Bitcoin miners only when they decide to close the channel.
One limitation of the Lightning Network is that a recipient must be online to receive transactions addressed to him. Another limitation in user experience is that one needs to lock up funds every time a payment channel is opened, and can only use those funds within that channel.
However, this does not mean he needs to create new channels every time he wishes to transact with a different person on the Lightning Network. If Alice wants to send money to Carol, but they do not have a payment channel open, they can ask Bob, who has payment channels open to both Alice and Carol, to help make that transaction. Alice will be able to send funds to Bob, and Bob to Carol. Hence, the number of “payment hubs” (i.e., Bob in the previous example) correlates with both the convenience and the usability of the Lightning Network for real-world applications.
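The Alice-Bob-Carol routing idea is, at its core, path-finding over a graph of open channels. A minimal sketch using breadth-first search, with a hypothetical channel graph; real Lightning routing additionally handles channel capacities, fees, and onion-encrypted hops:

```python
from collections import deque

channels = {                 # hypothetical open payment channels
    "Alice": ["Bob"],
    "Bob": ["Alice", "Carol"],
    "Carol": ["Bob"],
}

def find_route(channels, src, dst):
    """Breadth-first search for a chain of open channels from src to dst."""
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for peer in channels.get(path[-1], []):
            if peer not in seen:
                seen.add(peer)
                queue.append(path + [peer])
    return None              # no route: a new channel would have to be opened

route = find_route(channels, "Alice", "Carol")   # ["Alice", "Bob", "Carol"]
```

The more well-connected hubs like Bob exist, the more likely such a route can be found without opening a new channel.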
Schnorr Signature upgrade proposal
Elliptic Curve Digital Signature Algorithm (“ECDSA”) signatures are used to sign transactions on the Bitcoin blockchain.
However, many developers now advocate for replacing ECDSA with Schnorr Signature. Once Schnorr Signatures are implemented, multiple parties can collaborate in producing a signature that is valid for the sum of their public keys.
This would primarily be beneficial for network scalability. When multiple addresses were to conduct transactions to a single address, each transaction would require their own signature. With Schnorr Signature, all these signatures would be combined into one. As a result, the network would be able to store more transactions in a single block.
The reduced size in signatures implies a reduced cost on transaction fees. The group of senders can split the transaction fees for that one group signature, instead of paying for one personal signature individually.
Schnorr Signature also improves network privacy and token fungibility. A third-party observer will not be able to detect if a user is sending a multi-signature transaction, since the signature will be in the same format as a single-signature transaction.
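The additive property that makes Schnorr aggregation possible can be shown with a toy Schnorr scheme over integers modulo a prime. This is purely illustrative: real proposals use the secp256k1 curve, and this naive key aggregation is vulnerable to rogue-key attacks, which schemes like MuSig are designed to prevent. All parameters below are toy values:

```python
import hashlib
import random

p, g = 2**127 - 1, 3        # toy group: multiplicative integers mod a Mersenne prime

def challenge(R, P, msg):
    """Fiat-Shamir challenge binding the aggregated nonce, key, and message."""
    return int.from_bytes(hashlib.sha256(f"{R}:{P}:{msg}".encode()).digest(), "big")

rng = random.Random(7)
# Two signers hold private keys x1, x2 with public keys P1, P2.
x1, x2 = rng.randrange(1, 2**64), rng.randrange(1, 2**64)
P1, P2 = pow(g, x1, p), pow(g, x2, p)
P = (P1 * P2) % p                         # naively aggregated public key

k1, k2 = rng.randrange(1, 2**64), rng.randrange(1, 2**64)
R = (pow(g, k1, p) * pow(g, k2, p)) % p   # aggregated nonce commitment
e = challenge(R, P, "pay 1 BTC")
s = (k1 + e * x1) + (k2 + e * x2)         # partial signatures simply add up

# A verifier checks the single aggregate signature (R, s) against P alone:
ok = pow(g, s, p) == (R * pow(P, e, p)) % p
```

Because g^s = g^(k1+k2) · g^(e(x1+x2)) = R · P^e, the two partial signatures collapse into one, which is the size saving the text describes.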
4. Economics and supply distribution
The Bitcoin protocol utilizes the Nakamoto consensus, and nodes validate blocks via Proof-of-Work mining. The bitcoin token was not pre-mined and has a maximum supply of 21 million. The initial block reward was 50 BTC, and mining rewards halve every 210,000 blocks. Since the average block time is 10 minutes, halving events take place approximately every 4 years.
As of May 12th 2020, the block mining rewards are 6.25 BTC per block. Transaction fees also represent a minor revenue stream for miners.
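The 21 million cap follows directly from the halving schedule: summing 210,000 blocks per era, with the reward halving (in integer satoshis) each era, converges just below the cap:

```python
SATOSHI = 10**8
reward, total = 50 * SATOSHI, 0
while reward > 0:
    total += 210_000 * reward   # each reward era lasts 210,000 blocks
    reward //= 2                # the reward halves, in integer satoshis

# Integer rounding means the sum lands just under 21 million BTC.
total_btc = total / SATOSHI     # ~20,999,999.9769 BTC
```

The geometric series 210,000 × 50 × (1 + 1/2 + 1/4 + …) would give exactly 21 million; the satoshi flooring at each halving is what leaves the small shortfall.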