Technical: Upcoming Improvements to Lightning Network
Price? Who gives a shit about price when Lightning Network development is a lot more interesting????? One thing about LN is that because there's no need for consensus before implementing things, figuring out the status of things is quite a bit more difficult than on Bitcoin. On one hand it lets larger groups of people work on improving LN faster without having to coordinate so much. On the other hand it leads to some fragmentation of the LN space, with compatibility problems occasionally coming up. The below is just a smattering of LN stuff I personally find interesting. There's a bunch of other stuff, like splicing and dual-funding, that I won't cover --- the post is long enough as-is, and besides, some of the below aren't as well-known. Anyway.....
Decker-Russell-Osuntokun ("eltoo")
Yeah, the exciting new Lightning Network channel update protocol!
Solves the "toxic waste" problem. In the current Poon-Dryja update protocol, old state ("waste") is dangerous ("toxic") because if your old state is acquired by your most hated enemy, they can use it to publish a stale unilateral close transaction, which your counterparty must treat as a theft attempt and punish, causing you to lose funds. With Decker-Russell-Osuntokun, old state is not revoked, but is instead gainsaid by later state: instead of actively punishing old state, it simply replaces the old state with a later state.
Allows multiple participants in the update protocol. This can be used as the update protocol for a channel factory with 3 or more participants, for example (channels are not practical for multiple participants, since the loss of any one participant makes the channel completely unusable; it's more sensible to have a multiple-participant factory that splits up into 2-participant channels). Poon-Dryja only supports two participants. Another update protocol, Decker-Wattenhofer, also supports multiple participants, but requires much larger locktimes in case of a unilateral close (measurable in weeks, whereas Poon-Dryja and Decker-Russell-Osuntokun can be measured in hours or days).
It uses nLockTime in a very clever way.
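The clever nLockTime use can be sketched roughly like this: each update transaction carries a monotonically increasing state number, and any later update can supersede any earlier one on-chain. This is a toy model of the idea only, not real transaction code; all names and numbers here are illustrative.

```python
# Toy sketch of the Decker-Russell-Osuntokun replacement rule: each update
# transaction carries a monotonically increasing state number (encoded via
# nLockTime in the real protocol), and an update can be attached on top of
# any update with a strictly lower state number. Illustrative only.

def can_replace(new_state_num: int, old_state_num: int) -> bool:
    """A later update transaction can supersede any earlier one."""
    return new_state_num > old_state_num

class Channel:
    def __init__(self):
        self.published_state = 0  # state currently on-chain (0 = funding only)

    def publish(self, state_num: int) -> bool:
        """Attempt to put an update transaction on-chain."""
        if can_replace(state_num, self.published_state):
            self.published_state = state_num
            return True
        return False

chan = Channel()
chan.publish(5)              # a stale state gets published...
assert chan.publish(9)       # ...and is simply superseded by the latest state
assert not chan.publish(7)   # older states can no longer be attached
assert chan.published_state == 9
```

This is why there is no "toxic waste": publishing old state is not punished, it is merely outdated.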
No, it does not solve the "watchtower needed" problem. Decker-Russell-Osuntokun still requires watchtowers if you're planning to be offline for a long time.
What might be causing confusion is that it was initially thought that watchtowers under Decker-Russell-Osuntokun could be made more efficient by having the channel participant update a single "slot" in the watchtower, rather than having to consume one "slot" per update as in Poon-Dryja. However, the existence of the "poisoned blob" attack by ZmnSCPxj means that having a replaceable "slot" is risky if the other participant of the channel can spoof you. And the safest way to prevent someone spoofing you is to identify yourself to the watchtower --- but now the watchtower can surveil the activities of somebody it has identified, losing privacy.
Requires base layer change --- SIGHASH_NOINPUT / SIGHASH_ANYPREVOUT. This is still being worked out and may potentially not reach Bitcoin anytime soon.
Determining costs of routes is somewhat harder, and may complicate routefinding algorithms. In particular: every channel today has a "CLTV Delta", a number of blocks by which the total maximum delay of the payment is increased. This maximum delay is the maximum amount of time by which an outgoing payment can be locked, and needs to be reduced for UX purposes. Decker-Russell-Osuntokun will also add a "CSV minimum", a number of blocks, which must be smaller than the delay of an HTLC going through the channel. Current routefinding algos are good at minimizing a summed-up cost (like the "CLTV Delta") so the "CSV minimum" may require discovering / developing new routefinding algos.
Due to the "CSV minimum" above, existing nodes that don't understand Decker-Russell-Osuntokun cannot reliably route over Decker-Russell-Osuntokun channels, as they might not impose this minimum properly.
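To make the routing point concrete, here is a minimal sketch of the kind of summed-cost search current routefinding does (the graph, node names, and delta values are made up for illustration). Note that a "CSV minimum" is a per-hop floor constraint rather than another term in the sum, which is why it does not fit this mold directly.

```python
import heapq

# Minimal Dijkstra sketch: find the route minimizing total CLTV delta.
# Graph shape: {node: [(neighbor, cltv_delta), ...]}. Illustrative only.
def cheapest_route(graph, src, dst):
    dist = {src: 0}
    prev = {}
    pq = [(0, src)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == dst:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        for nbr, delta in graph.get(node, []):
            nd = d + delta
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(pq, (nd, nbr))
    # Walk back from destination to reconstruct the path.
    path, node = [], dst
    while node != src:
        path.append(node)
        node = prev[node]
    path.append(src)
    return list(reversed(path)), dist[dst]

graph = {
    "A": [("B", 40), ("C", 144)],
    "B": [("D", 40)],
    "C": [("D", 9)],
}
path, total_delta = cheapest_route(graph, "A", "D")
assert path == ["A", "B", "D"] and total_delta == 80
```

A per-hop minimum constraint would instead have to filter or re-weight candidate channels before (or during) this search, which is the new algorithmic work the "CSV minimum" may require.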
Multipart payments / AMP
Splitting up large payments into smaller parts!
There are at least three variants of multipart payments: Original, Base, and High.
Original is the original AMP proposed by Lightning Labs. It sacrifices proof-of-payment in order to allow each path to have a different payment hash. This is done by having the payer use a derivation scheme to generate each part's payment preimage from a seed, then splitting the seed (using secret sharing) across the parts. The receiver can only reconstruct the seed if all parts reach it.
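A minimal sketch of that idea, assuming a simple hash-based derivation and XOR secret sharing (the actual proposal's derivation details differ; this only shows the structure):

```python
import hashlib, os

# Sketch of Original AMP: per-part preimages are derived from a shared
# seed, and the seed is split across the parts via XOR secret sharing,
# so the receiver recovers it only once EVERY part has arrived.
# Derivation scheme here is illustrative, not the actual BOLT proposal.

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

n_parts = 3
seed = os.urandom(32)

# XOR secret sharing: n-1 random shares, final share chosen so that all
# shares XOR back to the seed.
shares = [os.urandom(32) for _ in range(n_parts - 1)]
last = seed
for s in shares:
    last = xor(last, s)
shares.append(last)

# Each part i gets its own preimage, hence its own payment hash.
preimages = [h(seed + bytes([i])) for i in range(n_parts)]
payment_hashes = [h(p) for p in preimages]

# Receiver side: only with ALL shares can the seed, and thus every
# preimage, be reconstructed to claim the incoming HTLCs.
recovered = shares[0]
for s in shares[1:]:
    recovered = xor(recovered, s)
assert recovered == seed
assert [h(h(recovered + bytes([i]))) for i in range(n_parts)] == payment_hashes
```

Since the payer generated the preimages itself, revealing them proves nothing about the receiver, which is exactly the proof-of-payment sacrifice described above.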
Base simply uses the same payment hash for all routes. This retains proof-of-payment (i.e. an invoice is undeniably signed by the receiver and includes a payment hash; public knowledge of the payment preimage is proof that the receiver has in fact received money, and any third party can be convinced of this by being shown the signed invoice and the preimage). The receiver could just take one part of the payment, claim to be underpaid by the payer, and deny service --- but claiming any one part is enough to publish the payment preimage, creating a proof-of-payment. So the receiver can provably be made liable even if it took just one part, and thus the receiver's incentive is to only take in the payment once all parts have arrived.
High requires elliptic curve points / scalars. It combines both Original and Base, retaining proof-of-payment (sacrificed by Original) and ensuring cryptographically-secure waiting for all parts (rather than the merely economically-incentivized waiting of Base). This is done by using the elliptic curve homomorphism between scalar addition and point addition to add together the payer-provided preimage (really a scalar) of Original with the payee-provided preimage (really a scalar) of Base.
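The homomorphism High relies on can be demonstrated with a toy group. Real Lightning would use secp256k1 point arithmetic (scalar * G); here modular exponentiation g**s mod p stands in, since it has the same additive-in-the-exponent structure. All numbers are illustrative.

```python
# Toy demonstration of the homomorphism High AMP relies on:
# point(a) "+" point(b) == point(a + b), where in this stand-in group
# "point" is modular exponentiation and point "addition" is multiplication.

p = 2**127 - 1   # a Mersenne prime modulus (illustrative, not secp256k1)
g = 5            # generator (illustrative)

def point(scalar: int) -> int:
    return pow(g, scalar, p)

payer_scalar = 123456789   # Original-style, derived by the payer
payee_scalar = 987654321   # Base-style, the payee's invoice secret

# The in-flight contract locks to the COMBINED point; the payee can only
# claim once it knows both scalars, so all parts must have arrived.
combined_point = (point(payer_scalar) * point(payee_scalar)) % p
assert combined_point == point(payer_scalar + payee_scalar)
```

Revealing the combined scalar still proves the payee cooperated (proof-of-payment), while the payer-side scalar enforces the wait for all parts.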
Better expected reliability. Channels are limited by capacity. By splitting up into many smaller payments, you can fit into more channels and be more likely to successfully reach the payee.
Capacity on multiple of your channels can be used to pay. Currently if you have 0.05BTC on one channel and 0.05BTC on another channel, you can't pay 0.06BTC without first rebalancing your channels (and paying fees for the rebalance first, whether the payment succeeds or not). With multipart you can now combine the capacities of multiple of your channels, and only pay fees for combining them if the payment pushes through.
Wumbo payments (oversized payments) come "for free" without having to be explicitly supported by the nodes of the network: you just split up wumbo payments into parts smaller than the wumbo limit.
Multipart will have higher fees. Part of the fee of each channel is a flat base fee, independent of the amount. Going through multiple paths means paying this base fee multiple times.
It's not clear how to split up payments. Heuristics for payment splitting have to be derived and developed and tested.
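As one (deliberately naive) illustration of the kind of heuristic that has to be developed and tested, here is a greedy splitter that draws from the largest local channel balances first. This is a sketch under assumed names, not a proposed algorithm; real heuristics must also weigh per-path fees, route reliability, and privacy.

```python
# Naive payment-splitting heuristic (illustration only): greedily draw
# from the largest local channel balances until the target is covered.

def split_payment(amount_sat: int, channel_balances: list[int]) -> list[int]:
    parts = []
    remaining = amount_sat
    for balance in sorted(channel_balances, reverse=True):
        if remaining == 0:
            break
        part = min(balance, remaining)
        parts.append(part)
        remaining -= part
    if remaining > 0:
        raise ValueError("insufficient total capacity across channels")
    return parts

# The 0.05 BTC + 0.05 BTC channels from the earlier example can now
# jointly fund a 0.06 BTC payment:
assert split_payment(6_000_000, [5_000_000, 5_000_000]) == [5_000_000, 1_000_000]
```

A real splitter would likely avoid draining channels to zero and cap the number of parts, since each extra part pays another base fee.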
Payment points / scalars
Using the magic of elliptic curve homomorphism for fun and Lightning Network profits! Basically, currently on Lightning an invoice has a payment hash, and the receiver reveals a payment preimage which, when input to SHA256, returns the given payment hash. Instead of using payment hashes and preimages, just replace them with payment points and scalars. An invoice will now contain a payment point, and the receiver reveals a payment scalar (a private key) which, when multiplied with the standard generator point G on secp256k1, returns the given payment point. This is basically Scriptless Script usage on Lightning: instead of HTLCs we have Scriptless Script Pointlocked Timelocked Contracts (PTLCs).
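A minimal sketch of the hash-to-point swap, again with modular exponentiation standing in for scalar * G on secp256k1 (numbers illustrative). The invoice carries the point; revealing the matching scalar plays the role the preimage plays today.

```python
# Toy PTLC claim check: the "preimage" is now a scalar whose corresponding
# group element ("point") is committed in the invoice. Stand-in group:
# g**s mod p instead of s * G on secp256k1. Illustrative only.

p, g = 2**127 - 1, 5

payment_scalar = 123456789                 # payee's secret ("preimage")
payment_point = pow(g, payment_scalar, p)  # goes into the invoice ("hash")

def claim(revealed_scalar: int) -> bool:
    """Forwarding node / payer check: does the scalar open the point?"""
    return pow(g, revealed_scalar, p) == payment_point

assert claim(payment_scalar)
assert not claim(payment_scalar + 1)
```

The payoff, covered below, is that scalars can be added together homomorphically, which hashes cannot.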
Enables a shit-ton of improvements: payment decorrelation, stuckless payments, noncustodial escrow over Lightning (the Hodl Hodl Lightning escrow is custodial, read the fine print), High multipart.
It's the same coolness that makes Schnorr Signatures cool. ECDSA, despite being based on elliptic curves, is not cool because the hash-the-nonce operation needed to prevent it from infringing Schnorr's fatherfucking patent also prevents ECDSA from using the cool elliptic curve homomorphism of addition over scalars.
Requires Schnorr on Bitcoin layer.
Actually, we can work with 2p-ECDSA without waiting for Schnorr. We get back the nice elliptic curve homomorphism by passing the ECDSA nonce through another cryptosystem, Paillier. This gets us the ability to do Scriptless Script. I think it has only 80-bits security because of going through Paillier though.
Basically the conundrum is: we could implement 2p-ECDSA now, hope we never have to test the 80-bit security anytime soon, then switch to Schnorr with 128-bit security later (which means reimplementing a bunch of things, because the calculations are different and the data that needs to be exchanged between channel participants is very different between the 2p-ECDSA and Schnorr). Reimplementing is painful and is more dev work. If we don't implement with 2p-ECDSA now, though, we will be delaying all the nice elliptic curve goodness (stuckless, noncustodial escrow, payment decorrelation) until Bitcoin gets Schnorr.
Elliptic curve discrete log problem is theoretically quantum-vulnerable. If we can't find a quantum-resistant homomorphic construction, we'll have to give up the advantages (payment decorrelation, stuckless payments, noncustodial escrow over Lightning) we got from using elliptic curve points and go back to boring old hashes.
Pay-for-data
Ensuring that payers cannot access data or other digital goods without proof of having paid the provider. In a nutshell: the payment preimage used as a proof-of-payment is the decryption key of the data. The provider gives the encrypted data, and issues an invoice. The buyer of the data then has to pay over Lightning in order to learn the decryption key, with the decryption key being the payment preimage.
Enables data providers to sell data. This could be sensors, livestreams, blogs, articles, whatever.
There's no scheme to determine if the data provider is providing actually-useful data. The data-provider could just stream https://random.org for example. This is a potentially-impossible problem. Even if the data-provider provides a "sample" of the data, and is able to derive some proof that the sample is indeed a true snippet of the encrypted data, the rest of the data outside of the sample might just be random junk.
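The preimage-as-decryption-key idea above can be sketched as follows. The SHA256-counter stream cipher here is purely illustrative (a real scheme would use an authenticated cipher); only the structure matters: paying the invoice reveals the preimage, and the preimage decrypts the data.

```python
import hashlib

# Sketch of selling data over Lightning: the decryption key IS the payment
# preimage, so learning the preimage (by paying) is what unlocks the data.
# Stream-cipher construction here is illustrative only.

def keystream(key: bytes, length: int) -> bytes:
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt  # XOR stream cipher is its own inverse

# Provider side:
preimage = hashlib.sha256(b"provider secret").digest()
payment_hash = hashlib.sha256(preimage).digest()   # goes into the invoice
ciphertext = encrypt(preimage, b"the sensor readings you paid for")

# Buyer side: after paying, the buyer learns the preimage and decrypts.
assert decrypt(preimage, ciphertext) == b"the sensor readings you paid for"
# The same preimage doubles as proof-of-payment against the signed invoice:
assert hashlib.sha256(preimage).digest() == payment_hash
```

Note that nothing in this construction addresses the garbage-data problem above: the ciphertext could decrypt to junk, and the buyer only finds out after paying.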
Stuckless payments
No more payments getting stuck somewhere in the Lightning Network without knowing whether the payee will ever get paid! (That's actually a bit of an overclaim: payments can still get stuck, but what "stuckless" really enables is that we can safely run another parallel payment attempt until any one of the attempts gets through.) Basically, by using the ability to add points together, the payer can enforce that the payee can only claim the funds if it knows two pieces of information:
The payment scalar corresponding to the payment point in the invoice signed by the payee.
An "acknowledgment" scalar provided by the payer to the payee via another communication path.
This allows the payer to make multiple payment attempts in parallel, unlike the current situation where we must wait for an attempt to fail before trying another route. The payer only needs to ensure it generates a different acknowledgment scalar for each payment attempt. If at least one of the payment attempts reaches the payee, the payee can then acquire the acknowledgment scalar from the payer and claim the payment. If the payee attempts to acquire multiple acknowledgment scalars for the same payment, the payer just gives out one and then tells the payee "LOL don't try to scam me". So the payee can only acquire a single acknowledgment scalar, meaning it can only claim the payment once; it can't claim multiple parallel payments.
Can safely run multiple parallel payment attempts as long as you have the funds to do so.
Needs payment point + scalar
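A toy model of the two-scalar claim condition, with modular exponentiation standing in for secp256k1 scalar multiplication (all names and numbers illustrative): each parallel attempt locks to the invoice point plus a per-attempt acknowledgment point, and the payer releases at most one acknowledgment scalar.

```python
# Toy model of stuckless payments. Each parallel attempt locks to
# invoice_point "+" ack_point_i; the payer hands out at most one
# acknowledgment scalar, so at most one attempt can ever be claimed.
# Stand-in group: g**x mod p instead of secp256k1. Illustrative only.

p, g = 2**127 - 1, 5

def point(s: int) -> int:
    return pow(g, s, p)

invoice_scalar = 41414141               # payee's secret; its point is in the invoice
attempts = {1: 1001, 2: 1002, 3: 1003}  # attempt id -> payer's ack scalar

# Lock for each in-flight attempt: claimable only with BOTH scalars.
locks = {i: (point(invoice_scalar) * point(a)) % p for i, a in attempts.items()}

# Attempt 2 reaches the payee; the payer releases exactly one ack scalar.
released = {2: attempts[2]}

def can_claim(attempt_id: int) -> bool:
    ack = released.get(attempt_id)
    return ack is not None and point(invoice_scalar + ack) == locks[attempt_id]

assert can_claim(2)
assert not can_claim(1) and not can_claim(3)
```

The other attempts simply time out and are refunded, which is what makes running them in parallel safe.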
Non-custodial escrow over Lightning
The "acknowledgment" scalar used in stuckless can be reused here. The acknowledgment scalar is derived as an ECDH shared secret between the payer and the escrow service. On arrival of payment to the payee, the payee queries the escrow to determine if the acknowledgment point is from a scalar that the escrow can derive using ECDH with the payer, plus a hash of the contract terms of the trade (for example, to transfer some goods in exchange for Lightning payment). Once the payee gets confirmation from the escrow that the acknowledgment scalar is known by the escrow, the payee performs the trade, then asks the payer to provide the acknowledgment scalar once the trade completes. If the payer refuses to give the acknowledgment scalar even though the payee has given over the goods to be traded, then the payee contacts the escrow again, reveals the contract terms text, and requests to be paid. If the escrow finds in favor of the payee (i.e. it determines the goods have arrived at the payer as per the contract text) then it gives the acknowledgment scalar to the payee.
True non-custodial escrow: the escrow service never holds any funds.
Needs payment point + scalar.
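The ECDH derivation of the acknowledgment scalar can be sketched as below (toy group again in place of secp256k1; the exact derivation, including how the contract-terms hash is mixed in, is my illustrative assumption rather than a concrete spec).

```python
import hashlib

# Sketch of deriving the escrow acknowledgment scalar as an ECDH shared
# secret between payer and escrow, bound to a hash of the contract terms.
# Stand-in group: g**x mod p instead of secp256k1. Illustrative only.

p, g = 2**127 - 1, 5

payer_secret, escrow_secret = 7777, 9999
payer_pub = pow(g, payer_secret, p)
escrow_pub = pow(g, escrow_secret, p)

contract_hash = hashlib.sha256(b"goods X in exchange for 100k sats").digest()

def ack_scalar(my_secret: int, their_pub: int) -> int:
    shared = pow(their_pub, my_secret, p)  # ECDH: same value on both sides
    h = hashlib.sha256(shared.to_bytes(16, "big") + contract_hash).digest()
    return int.from_bytes(h, "big")

# Payer and escrow independently derive the SAME scalar, so if the payer
# refuses to release it, the escrow can still do so after judging the
# dispute against the revealed contract terms.
assert ack_scalar(payer_secret, escrow_pub) == ack_scalar(escrow_secret, payer_pub)
```

Because the scalar exists only as a function of keys and contract terms, the escrow never touches the funds themselves.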
Payment decorrelation
Because elliptic curve points can be added (unlike hashes), for every forwarding node we can add a "blinding" point / scalar. This prevents multiple forwarding nodes from discovering that they have been on the same payment route, unlike the current payment hash + preimage, where the same hash is used along the entire route. In fact, the acknowledgment scalar we use in stuckless and escrow can simply be the sum of each blinding scalar used at each forwarding node.
Privacy! Multiple forwarding nodes cannot coordinate to try to uncover the payer and payee of each payment.
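A toy illustration of per-hop blinding (same stand-in group as before; all numbers illustrative): working backwards from the payee, each hop's lock point is offset by one more fresh blinding scalar, so no two hops see the same point.

```python
# Toy illustration of payment decorrelation: each hop's lock point is
# offset by a fresh blinding scalar, so hops on the same route observe
# unrelated points. Stand-in group: g**x mod p. Illustrative only.

p, g = 2**127 - 1, 5

def point(s: int) -> int:
    return pow(g, s, p)

payee_scalar = 31337
blinding = [11, 22, 33]   # one fresh blinding scalar per hop

# Hop i's lock commits to the payee scalar plus all blinding scalars
# from hop i onward (built backwards from the payee).
lock_scalars = [payee_scalar + sum(blinding[i:]) for i in range(len(blinding))]
lock_points = [point(s) for s in lock_scalars]

# No two hops see the same point, unlike today's shared payment hash:
assert len(set(lock_points)) == len(lock_points)

# And the acknowledgment scalar can simply be the sum of all blinding
# scalars: it opens the first hop's lock together with the payee scalar.
ack = sum(blinding)
assert point(payee_scalar + ack) == lock_points[0]
```

Each hop can still verify its incoming and outgoing locks are consistent, but colluding hops can no longer match points to link a route.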
I interlaced everything between Vitalik and Tuur to make it easier to read.
1/ People often ask me why I’m so “against” Ethereum. Why do I go out of my way to point out flaws or make analogies that put it in a bad light?
2/ First, ETH’s architecture & culture is opposite that of Bitcoin, and yet claims to offer same solutions: decentralization, immutability, SoV, asset issuance, smart contracts, … Second, ETH is considered a crypto ‘blue chip’, thus colors perception of uninformed newcomers.
Agree! I personally find Ethereum culture far saner, though I am a bit biased :)
3/ I've followed Ethereum since 2014 & feel a responsibility to share my concerns. IMO contrary to its marketing, ETH is at best a science experiment. It’s now valued at $13B, which I think is still too high.
Not an argument
4/ I agree with Ethereum developer Vlad Zamfir that it’s not money, not safe, and not scalable. https://twitter.com/VladZamfir/status/838006311598030848 … @VladZamfir Eth isn't money, so there is no monetary policy. There is currently fixed block issuance with an exponential difficulty increase (the bomb).
I'm pretty sure Vlad would say the exact same thing about Bitcoin
5/ To me the first red flag came up when in our weekly hangout we asked the ETH founders about to how they were going to scale the network. (We’re now 4.5 years later, and sharding is still a pipe dream.)
The core principles have been known for years, the core design for nearly a year, and details for months, with implementations on the way. So sharding is definitely not at the pipe dream stage at this point.
6/ Despite strong optimism that on-chain scaling of Ethereum was around the corner (just another engineering job), this promise hasn’t been delivered on to date.
Sure, sharding is not yet finished. Though more incremental stuff has been going well, eg. uncle rates are at near record lows despite very high chain usage.
7/ Recently, a team of reputable developers decided to peer review a widely anticipated Casper / sharding white paper, concluding that it does not live up to its own claims.
Unmerciful peer review of Vlad Zamfir & co's white paper to scale Ethereum: "the authors do NOT prove that the CBC Casper family of protocols is Byzantine fault tolerant in either practice or theory".
8/ On the 2nd layer front, devs are now trying to scale Ethereum via state channels (ETH’s version of Lightning), but it is unclear whether main-chain issued ERC20 type tokens will be portable to this environment.
Umm... you can definitely use Raiden with arbitrary ERC20s. That's why the interface currently uses WETH (the ERC20-fied version of ether) and not ETH
9/ Compare this to how the Bitcoin Lightning Network project evolved:
elizabeth stark @starkness: For lnd: First public code released: January 2016 Alpha: January 2017 Beta: March 2018…
10/ Bitcoin’s Lightning Network is now live, and is growing at rapid clip.
Jameson Lopp @lopp: Lightning Network: January 2018 vs December 2018
Sure, though as far as I understand there's still a low probability of finding routes for nontrivial amounts, and there's capital lockup griefing vectors, and privacy issues.... FWIW I personally never thought lightning is unworkable, it's just a design that inherently runs into ten thousand small issues that will likely take a very long time to get past.
11/ In 2017, more Ethereum scaling buzz was created, this time the panacea was “Plasma”.
12/ However, upon closer examination it was the recycling of some stale ideas, and the project went nowhere:
Peter Todd @peterktodd These ideas were all considered in the Treechains design process, and ultimately rejected as insecure.
Just because Peter Todd rejected something as "insecure" doesn't mean that it is. In general, the ethereum research community is quite convinced that the fundamental Plasma design is fine, and as far as I understand there are formal proofs on the way. The only insecurity that can't be avoided is mass exit vulns, and channel-based systems have those too.
13/ The elephant in the room is the transition to proof-of-stake, an “environmentally friendly” way to secure the chain. (If this was the plan all along, why create a proof-of-work chain first?)
@TuurDemeester "Changing from proof of work to proof of stake changes the economics of the system, all the rules change and it will impact everything."
Umm... we created a proof of work chain first because we did not have a satisfactory proof of stake algo initially?
14/ For the uninitiated, here’s a good write-up that highlights some of the fundamental design problems of proof-of-stake. Like I said, this is science experiment territory.
Yes, we know about weak subjectivity, see https://blog.ethereum.org/2014/11/25/proof-stake-learned-love-weak-subjectivity/. It's really not that bad, especially given that users need to update their clients once in a while anyway, oh and by the way even if the weak subjectivity assumption is broken an attacker still needs to gather up that pile of old keys making up 51% of the stake. And also to defend against that there's Universal Hash Time.
16/ Keep in mind that Proof of Stake (PoS) is not a new concept at all. Proof-of-Work actually was one of the big innovations that made Bitcoin possible, after PoS was deemed impractical because of censorship vulnerability.
Oh I definitely agree that proof of work was superior for bootstrap, and I liked it back then especially because it actually managed to be reasonably egalitarian around 2009-2012 before ASICs fully took over. But at the present time it doesn't really have that nice attribute.
17/ Over the years, this has become a pattern in Ethereum’s culture: recycling old ideas while not properly referring to past research and having poor peer review standards. This is not how science progresses.
I try to credit people whenever I can; half my blog and ethresear.ch posts have a "special thanks" section right at the top. Sometimes we end up re-inventing stuff, and sometimes we end up hearing about stuff, forgetting it, and later re-inventing it; that's life as an autodidact. And if you feel you've been unfairly not credited for something, always feel free to comment, people have done this and I've edited.
18/ One of my big concerns is that sophistry and marketing hype are a serious part of Ethereum’s success so far, and that overly inflated expectations have led to an inflated market cap.
Ok, go on.
19/ Let’s illustrate with an example.
20/ A few days ago, I shared a critical tweet that made the argument that Ethereum’s value proposition is in essence utopian.
@TuurDemeester Ethereum-ism sounds a bit like Marxism to me:
What works today (PoW) is 'just a phase', the ideal & unproven future is to come: Proof-of-Stake.…
22/ My first point, about Ethereum developers rejecting Proof-of-Work, has been illustrated many times over by Vitalik and others. (See earlier in this tweetstorm for more about how PoS is unproven.)
Vitalik Non-giver of Ether @VitalikButerin: I don't believe in proof of work!
See above for links as to why I think proof of stake is great.
23/ My second point addresses Ethereum’s romance with the vague and dangerous notion of ‘social consensus’, where disruptive hard-forks are used to ‘upgrade’ or ‘optimize’ the system, which inevitably leads to increased centralization. More here:
See my rebuttal to Tuur's rebuttal :)
24/ My third point addresses PoS’ promise of perpetual income to ETHizens. Vitalik is no stranger to embracing free lunch ideas, e.g. during his 2014 ETH announcement speech, where he described a coin with a 20% inflation tax as having “no cost” to users.
Yeah, I haven't really emphasized perpetual income to stakers as a selling point in years. I actually favor rewards being as low as possible while still being high enough for security.
25/ In his response to my tweet, Vitalik adopted my format to “play the same game” in criticizing Bitcoin. My criticisms weren't addressed, and his response was riddled with errors. Yet his followers gave it +1,000 upvotes!
Vitalik Non-giver of Ether @VitalikButerin: - What works today (L1) is just a phase, ideal and unproven future (usable L2) is to come - Utopian concept of progress: we're already so confident we're finished we ain't needin no hard forks…
Ok, let's hear about what the errors are...
26/ Rebuttal: - BTC layer 1 is not “just a phase”, it always will be its definitive bedrock for transaction settlement. - Soft forking digital protocols has been the norm for over 3 decades—hard-forks are the deviation! - Satoshi never suggested hyperbitcoinization as a goal.
Sure, but (i) the use of layer 1 for consumer payments is definitely, in bitcoin ideology, "just a phase", (ii) I don't think you can make analogies between consensus protocols and other kinds of protocols, and between soft forking consensus protocols and protocol changes in other protocols, that easily, and (iii) plenty of people do believe in hyperbitcoinization as a goal. Oh by the way: https://twitter.com/tuurdemeester/status/545993119599460353
27/ This kind of sophistry is exhausting and completely counter-productive, but it can be very convincing for an uninformed retail public.
Ok, go on.
28/ Let me share a few more inconvenient truths.
29/ In order to “guarantee” the transition to PoS’ utopia of perpetual income (staking coins earns interest), a “difficulty bomb” was embedded in the protocol, which supposedly would force miners to accept the transition.
The intended goal of the difficulty bomb was to prevent the protocol from ossifying, by ensuring that it has to hard fork eventually to reset the difficulty bomb, at which point the status quo bias in favor of not changing other protocol rules at the same time would be weaker. Though forcing a switch to PoS was definitely a key goal.
30/ Of course, nothing came of this, because anything in the ETH protocol can be hard-forked away. Another broken promise.
33/ The modular approach to Bitcoin seems to be much better at compartmentalizing risk, and thus reducing attack surfaces. I’ve written about modular scaling here...
To be fair, risk is reduced because Bitcoin does less.
34/ Another huge issue that Ethereum has is with scaling. By putting “everything on the blockchain” (which stores everything forever) and dubbing it “the world computer”, you are going to end up with a very slow and clogged up system.
We never advocated "putting everything on the blockchain". The phrase "world computer" was never meant to be interpreted as "everyone's personal desktop", but rather as a common platform specifically for the parts of applications that require consensus on shared state. As evidence of this, notice how Whisper and Swarm were part of the vision as complements to Ethereum right from the start.
35/ By now the Ethereum bloat is so bad that cheaply running an individual node is practically impossible for a lay person. ETH developers are also imploring people to not deploy more smart contract apps on its blockchain.
Tuur Demeester @TuurDemeester: But... deploying d-apps on the "Ethereum Virtual Machine" is exactly what everyone was encouraged to do for the past 4 years. Looks like on-chain scaling wasn't such a great idea after all.
Umm.... I just spun up a node from scratch last week. On a consumer laptop.
36/ As a result, and despite the claims that running a node in “warp” mode is easy and as good as a full node, Ethereum is becoming increasingly centralized.
37/ Another hollow claim: in 2016, Ethereum was promoted as being censorship resistant…
Tuur Demeester @TuurDemeester: Pre TheDAO #Ethereum presentation: "uncensorable, code is law, bottom up". http://ow.ly/qW49302Pp92
Yes, the DAO fork did violate the notion of absolute immutability. However, the "forking the DAO will lead to doom and gloom" crowd was very wrong in one key way: it did NOT work as a precedent justifying all sorts of further state interventions. The community clearly drew a line in the sand by firmly rejecting EIP 867, and EIP 999 seems to now also be going nowhere. So it seems like there's some evidence that the social contract of "moderately but not infinitely strong immutability" actually can be stable.
38/ Yet later that year, after only 6% of ETH holders had cast a vote, ETH core devs decided to endorse a hard-fork that clawed back the funds from a smart contract that held 4.5% of all ETH in circulation. More here: ...
Hudson Jameson @hudsonjameson: The "semi-closed" Ethereum 1.x meeting from last Friday was an experiment. The All Core Dev meeting this Friday will be recorded as usual.
Suppose I were to tomorrow sign up to work directly for Kim Jong Un. What concretely would happen to the Ethereum protocol? I suspect very little; I am mostly involved in the Serenity work, and the other researchers have proven very capable of both pushing the spec forward even without me and catching any mistakes with my work. So I don't think any argument involving me applies. And we ended up deciding not to do more semi-closed meetings.
40/ Another red flag to me is the apparent lack of relevant expertise in the ETH development community. (Check the responses…)
I personally am confident in the talents of our core researchers, and our community of academic partners. Most recently the latter group includes people from Starkware, Stanford CBR, IC3, and other groups.
I have no idea who described Lucius Meredith's work as being important for the Serenity roadmap.... oh and by the way, RChain is NOT an "Ethereum scaling company"
42/ Perhaps the recently added Gandalf of Ethereum, with his “Fellowship of Ethereum Magicians” [sic] can save the day, but imo that seems unlikely...
Honestly, I don't see why Ethereum Gandalf needs to save the day, because I don't see what is in danger and needs to be saved...
43/ This is becoming a long tweetstorm, so let’s wrap up with a few closing comments.
44/ Do I have a conflict of interest? ETH is a publicly available asset with no real barriers to entry, so I could easily get a stake. Also, having met Vitalik & other ETH founders several times in 2013-’14, it would have been doable for me to become part of the in-crowd.
Agree there. And BTW I generally think financial conflicts of interest are somewhat overrated; social conflicts/tribal biases are the bigger problem much of the time. Though those two kinds of misalignments do frequently overlap and reinforce each other so they're difficult to fully disentangle.
45/ Actually, I was initially excited about Ethereum’s smart contract work - this was before one of its many pivots.
Tuur Demeester @TuurDemeester: Ethereum is probably the first programming language I will teach myself - who wouldn't want the ability to program smart BTC contracts?
Ethereum was never about "smart BTC contracts"..... even "Ethereum as a Mastercoin-style meta-protocol" was intended to be built on top of Primecoin.
46/ Also, I have done my share of soul searching about whether I could be suffering from survivor’s bias.
47/ Here’s why Ethereum is dubious to me: rather than creating an open source project & testnet to work on these interesting computer science problems, its founders instead did a securities offering, involving many thousands of clueless retail investors.
48/ Investing in the Ethereum ICO was akin to buying shares in a startup that had “invent time travel” as part of its business plan. Imo it was a reckless security offering, and it set the tone for the terrible capital misallocation of the 2017 ICO boom.
Nothing in the ethereum roadmap requires time-travel-like technical advancements or anything remotely close to that. Proof: we basically have all the fundamental technical advancements we need at this point.
49/ In my view, Ethereum is the Yahoo of our day - an unscalable “blue chip” cryptocurrency:
Tuur Demeester @TuurDemeester: 1/ The DotCom bubble shows that the market isn't very good at valuing early stage technology. I'll use Google vs. Yahoo to illustrate.
Over the past few days, I had some unexpected downtime, so I went ahead and did some "re-research" of ICON, and went back to read some of the articles and analysis from thoughtful people that got me so excited about the project to begin with. I also found some additional content that I hadn't found before, that was equally thoughtful and analytic. Ultimately, the more I followed the near-daily updates that ICON has been putting out recently, the more I was unable to fit them into the broader context of what ICON was trying to accomplish, and how each new partnership potentially fits into the puzzle. After doing this digging around, I thought it wouldn't hurt to share a chunk of what I read here. I thought this would be helpful for the following reasons:
There may be others like myself who have also forgotten, or lost context of, some of the key parts of why ICON is so important and why what they have done is so impressive.
There are a number of people who may be new to the project or this reddit, who might not have read these articles when they first appeared last year, so it's new information for them.
I see a lot of repeating questions/comments regarding certain aspects of the project. "ICON needs to explain how the Samsung Partnership works!" "ICONLOOP does nothing to help the price of ICX!" "ICX won't be valuable until grandma starts using the coins to buy things!" "If only the ICON marketing was better, retail investors would be scooping this up!"....I think if you take the time to read everything below and think it all through, all those concerns have been addressed, either directly or indirectly, and those with concerns about the project along those lines will be subdued (hopefully).
There are obviously a million other places to read about ICON, such as the website, whitepaper, etc. But having thoughtful people explain how the project works, and what new developments mean in context, can be incredibly illuminating, enlightening, and inspirational. This isn't organized in a perfect manner - not sure if there is a way to do that - but I believe reading the articles below in order is probably the best path to develop a 30,000 foot view of ICON. Keep in mind, these articles are older, so their timelines on certain developments may be outdated, or a bit on the overly-optimistic side. However, for all the premises they lay out and the conclusions they reach, the passing of time has only fortified their analysis and foundational beliefs about the project. All of the partnerships listed still exist - none have gone away to my knowledge - and of course we've added plenty more over the past few months. The staff has grown, offices have expanded, new partnerships have been born, technological developments added, etc. Ultimately, it's all the more reason to ignore today's price and focus on the potential price in 3-5 years, once the vision articulated below is able to play out.
Markus Jun - The Comprehensive ICON Report Part 1: ICON Facts & Commentary (Medium) A couple of my favorite sections...
Here’s how this plays out in real life. Imagine that a student requires surgery. She may check into a hospital, verify her identity on Chain ID and give permission so that the hospital can share her medical records directly with her insurance company. This will trigger a smart contract that will immediately transfer her health records and her surgeon’s medical certificate (signed digitally on ICON’s Chain Sign) to the insurance company. The insurance company can then immediately process the insurance claim as both the health records and the medical certificate are tamper-proof on the blockchain and do not require additional verification or the sending of official paper documents, steps which typically slow down the process of traditional insurance claims and make them more costly. After her surgery, the student who needs to stay hospitalized for a few days may then give permission to the hospital to share her records with her university so that she can get formally excused from attending classes. The information would again be shared immediately via a smart contract without the need of a third party ‘messenger’, without the need for paper (e.g., a doctor’s note), and with full assurance for both parties that the information is legitimate. Thus, the use of Chain ID and Chain Sign, powered by smart contracts executing on the ICON network, enables information sharing within industries to become safer, faster, tamper-proof, and cheaper. This means that ICON isn’t just enabling connectivity between loopchain networks, but a more efficient connectivity. It should be noted that this scenario is theoretically possible on any smart contract blockchain protocol. However, the reason why it’s a uniquely plausible scenario for ICON currently is because ICON is one of the only platforms that have already secured and built the networks in the necessary industries (e.g., healthcare, banking, insurance).
Building a blockchain can be as easy as copying and pasting code, the true challenge is building a network.
At the end, Kim refers to adoption occurring with ‘the more participants you have.’ And this is precisely the approach that ICON is taking: enticing corporations to join ICON’s network. Because every time you see an announcement of an MOU between theloop and Company X, you are seeing a new addition to the ICON network. And every addition makes the network more valuable. The sum total of the network value will always be more valuable than any single company within. Even if Company X is Samsung. If you agree with Kim that ‘the value of blockchain is in the network itself,’ you may see why ICON has gone beyond any other project in realizing this vision.
As a researcher of this space for 2 years, I’ve studied countless projects and countless surges and crashes in market value. I’ve concluded there are few projects in the blockchain space that have the network effect, enterprise technical expertise, real world partnerships and growth initiatives that ICON has. At the same time, I’ve seen few projects that have been as mischaracterized and misunderstood as ICON, which is why I felt compelled to clarify. As a Korean, I want to refute the idea that Koreans are nationalistic and will always support their own. Yes, we’re passionate and proud of our gold medalists and Samsung, but that’s because they are the best at what they do, and ICON isn’t there yet. However, I feel that ICON is currently Korea’s best hope for global blockchain leadership. As an investor, I believe that ICX is one of the most undervalued tokens in the space, especially when considering that larger market cap projects can’t compete with the scope of ICON’s network nor their years of experience in providing enterprise solutions. Oncoming developments in the next few months will prove to be catalysts for a significant surge in ICX demand from both institutional and individual investors. As I’ve stated before, widespread misunderstandings of ICON and its relationship with loopchain, and frequent delays, have cast uncertainty on the project. But this is mostly immaterial as I feel these are still extremely early days, and these doubts arise mostly from a lack of understanding. Hence, I write. ICON has successfully planted seeds that are critical for success in major industries, but they are currently just that, seeds. Most of these projects haven’t matured enough to enable the truly game-changing network effects yet. I would say that there aren’t many projects in this space in which the utility of their utility tokens actually drives organic demand, and it’s certainly not true for ICX either right now.
But with the network that ICON has built, and the projects that are set to begin developing, I see a strong case to be made. The crypto market isn’t rational. Every day, millions are poured into ideas with no product, let alone a network. But in mid-late 2017, Ethereum had an explosive growth in value as people realized its utility/value proposition as a leading 2.0 blockchain smart contract protocol and all its corresponding uses. I see parallels between ICX currently and ETH in early 2017, both in their initial mischaracterizations, and failure of the public to see their value. If ICON manages to do what it seeks to do just in Korea (and I’d argue they’ve done most of the heavy lifting), we can expect significant growth in demand for ICX. Not just among speculators and investors, but among those actually seeking to utilize ICON as a leading 3.0 blockchain interoperability protocol.
There’s a very clear difference between speculative and consumptive demand. Almost all cryptos, including Ethereum, are still firmly in the speculative demand bucket. People are buying, not to use (i.e. consume) the token, but to speculate on its future consumptive demand. Undoubtedly, these networks have the potential to generate MASSIVE consumptive demand, but that demand may still be years away. In my opinion, that timeline is much, MUCH shorter for Icon than any other crypto I’ve seen. Icon had functional products up and running that rely on their ICX token months before they launched their mainnet product. Most projects haven’t even begun to actually implement real-world use cases. Sure they talk about them, but talk is cheap, especially in cryptoland. Why does this matter? Let me use San Francisco as an example. Being a “sales guy” in the Bay Area is a borderline derogatory term with techies. Developers oftentimes start businesses with the mindset that “if we build it, they will come” — “they” meaning customers. Startup after startup learns the hard way that this is almost never the case. Business development is never as easy as technologists think. In one of his many interviews on YouTube, Min Kim, one of the founders of Icon, mentioned that while they have an incredibly strong tech team, they realize that blockchain technology is still rapidly evolving. We’re still in the early innings of blockchain and, much like the internet, the tech will rapidly improve. Icon’s approach is to develop real world use cases today given the technological constraints we’re operating under. As the space matures technologically, so will Icon. As someone who has worked in tech as well as private and public market investing for many years, I can’t tell you how excited I was to hear him say this. This approach has been the winning approach in industry after industry after industry for decades. Technology and business development go hand in hand.
A common complaint I often see in the ICON subreddit and Telegram channel has to do with ICON’s lack of consumer and retail-oriented marketing. I believe there are several reasons for these complaints, with most of them being somewhat unreasonable.

- A lack of understanding of the difference between a token holder and a shareholder. Many token holders believe they are entitled (perhaps even legally) to daily or weekly updates regarding the status of the project.
- Unrealistic expectations set by other cryptocurrency projects which do not function like real-world businesses.
- Overinvesting – plain and simple.

Before discussing the first two points, I would like to say I completely understand the negativity towards ICON’s lack of B2C marketing, and I do agree there is major room for improvement – more on this later on. Token holder versus shareholder – there’s a major difference. In this stage of the game, investing in a cryptocurrency is complete speculation – it can be educated speculation, but it’s still speculation. We do not have the same rights and protections as a shareholder, and we should act accordingly. In the world of corporate business, there are usually quarterly shareholder meetings that act as a medium of communication between shareholders and company executives. As token holders, this does not apply to us. We should not expect constant communication from executives – the fact that we even have some communication from ICON’s executives is completely unheard of. Imagine Mark Zuckerberg tweeting (LOL) with a Facebook user about Facebook’s upcoming plans on a consistent basis. The truth is, blockchain and its decentralized identity have already punctured a gigantic hole through the facade of normal corporate communication, and the fact that Min Kim willingly spends his free time interacting with the ICON community members is absolutely incredible.
This is further compounded by the fact that, unlike many blockchain projects, ICON has real ties with many industry-leading companies, major banks, top universities, and government institutions. Here are examples of a few – LINE, Deloitte, Samsung, SBI Ripple Asia, DAYLI Intelligence, Smilegate, and more. Many of these companies are worth billions of dollars and have assets and brand recognition of their own to protect. You have to understand that creating coordinated PR strategies with these huge corporations and government entities takes time and patience. Most importantly, it’s not the kind of stuff ICON can whimsically tweet about whenever they feel like it. Obviously, ICON wants to share all the great things happening behind the scenes, but legally they cannot do this without being sued by their partners. Do you think LINE would be okay with ICON publicly talking about future blockchain DApps on the LINE platform? No, because this would affect LINE’s bottom line if its competitors can race to build a similar product.
Okay, so all three examples above have one thing in common – ICX gains value as a network utility token. I cannot stress this enough. ICX will not have real value if a bunch of retail investors buys it. ICX may gain “bubble value” if a bunch of hedge funds buys it. ICX will only gain REAL value if the ICON platform is used to connect people, businesses, and institutions together to create new and unique connections that can be monetized. Knowing this, I hope you have a better understanding of why ICON is working to onboard businesses to build on and connect to the ICON platform first. At this stage of the game, there’s very little incentive to market to retail investors because we are not the intended users of the ICON network. We are the intended users of the services that will use ICON as a backbone and interoperable protocol. Thus, ICON is choosing to devote 95% of their manpower to the core business philosophy, and as true supporters of the project, we shouldn’t have it any other way.
If this ends up happening, and I think it will relatively soon, ICON will be poised to become the most widely used blockchain platform in the world. At the moment, Bitcoin, the world’s number one cryptocurrency, is estimated to have less than 28.5 million users.
Last October, theloop revealed that its blockchain-based authentication solution, CHAIN ID, was already being piloted by 25 banks and securities companies in the Korea Financial Investment Blockchain Consortium. Half a year later, theloop announced that CHAIN ID would be used by Samsung (one of Korea’s largest companies) in their biometric authentication technology, Samsung Pass. Recently, ICON Foundation wrote, “in the future it is expected that there will no longer be classifications of certified/private certifications, and all certificates will have the same authenticity.” Connect the dots. CHAIN ID is already being used by some of South Korea’s largest banks and securities companies. CHAIN ID is being implemented in Samsung Pass. Samsung has over 57% market share in South Korea’s mobile smartphone market. ICON revealed there will only be one kind of certificate in the future. After a little reading between the lines and a tiny amount of educated speculation, I have come to the conclusion that the majority of digital authentication in South Korea will happen on the CHAIN ID platform in the near future. This blockchain solution is being aggressively adopted by the country’s biggest financial and technology firms. If there’s really only going to be one certificate in the future, it’s obvious they will be issued by the first mover in the space – theloop’s CHAIN ID.
I would suggest going through all of the Decrypto.net posts on ICON. Brian Li - the blog's author - has a deep understanding of the project and has done a good job of breaking down new developments by providing context. Here are all the posts he has made that I could find (I have bolded a couple that are particularly helpful):
I would also recommend this year+ old reddit post, as well as this one. I hope you all find this helpful and valuable. As stated, this will be stale information to a number of you, but hopefully some of it will be fresh to a chunk of you, and 100% new to those who have recently hopped on board the project. And of course, if there are other articles you've enjoyed along a similar vein that I have not included, please feel free to share them below.
Dear Reddit community, Following our announcement for DTube v0.9, I have received countless questions about the new blockchain part, avalon. First I want to make it clear that it would have been utterly impossible to build this on STEEM, even with the centralized SCOT/Tribes that weren't available when I started working on this. This will become much clearer as you read through the whole wall of text and understand the novelties. SteemPeak says this is a 25-minute read, but if you are truly interested in the concept of a social blockchain, and you believe in its power, I think it will be worth the time!
I'm a long time member of STEEM, with tens of thousands of staked STEEM for 2 years+. I understand the instinctive fear from the other members of the community when they see a new crypto project coming out. We've had two recent examples with the VOICE and LIBRA announcements, being either hated or ignored. When you are invested morally, and financially, and you see competitors popping up, it's normal to be afraid. But we should remember competition is healthy, and learn from what these projects are doing and how it will influence us. Instead, by reacting the way STEEM reacts, we are putting our heads in the sand and failing to adapt. I currently see STEEM like the "North Korea of blockchains", trying to do everything better than other blockchains, while being #80 on coinmarketcap and slowly but surely losing positions over the months. When DLive left and revealed their own blockchain, it really got me thinking about why they did it. The way they did it was really scummy and flawed, but I concluded that in the end it was a good choice for them to try to develop their activity, while others waited for SMTs. Sadly, when I tried their new product, I was disappointed, they had botched it. It's purely a donation system, no proof of brain... And the ultra-majority of the existing supply is controlled by them, alongside many other 'anti-decentralization' features. It's like they had learnt nothing from their STEEM experience at all... STEEM was still the only blockchain able to distribute crypto-currency via social interactions (and no, 'donations' are not social interactions, they are monetary transfers; bitcoin can do it too). It is the killer feature we need. Years of negligence or greed from the witnesses/developers about the economic balance of STEEM is what broke this killer feature.
Even when proposing economical changes (which are actually getting through finally in HF21), the discussions have always been centered around modifying the existing model (changing the curve, changing the split, etc), instead of developing a new one.
You never change things by fighting the existing reality. To change something, build a new model that makes the existing model obsolete.
What if I built a new model for proof of brain distribution from the ground up? I first tried playing with STEEM clones, and I played with EOS contracts too. Neither system could support the concepts I wanted to integrate for DTube, unless I did a major refactor of tens of thousands of lines of code I had never worked with before. Making a new blockchain felt like a lighter task, and more fun too. Before even starting, I had a good idea of the concepts I'd love to implement. Most of these bullet points stemmed from observations of what happened here on STEEM in the past, and what I considered weaknesses for d.tube's growth.
The first concept I wanted to implement deep down in the core of how a DPOS chain works is that I didn't want the token to be staked, at all (i.e. no 'powering up'). The cons of staking for a decentralized social platform are obvious:

* complexity for the users with the double token system.
* difficulty onboarding people, as they need to freeze their money, akin to a pyramid scheme.

The only good thing about staking is how it can fill your bandwidth and your voting power when you power up, so you don't need to wait for them to grow to start transacting. In a fully-liquid system, your account resources start at 0% and new users will need to wait for them to grow before they can start transacting. I don't think that's a big issue. That meant that witness elections had to be run off the liquid stake. Could it be done? Was it safe for the network? Could we update the cumulative votes for witnesses without rounding issues, even when the money flows between accounts freely? Well, I now believe it is entirely possible and safe, under certain conditions. The incentive for top witnesses to keep running the chain is still present even if the stake is liquid. With a bit of discrete mathematics, it's easy to have a perfectly deterministic algorithm to run a decentralized election based on liquid stake; it's just going to be more dynamic, as the funds and the witness votes can move around much faster.
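A minimal sketch of how such a liquid-stake election can stay deterministic (my own reconstruction, not avalon's actual code): each leader's score is the integer sum of the balances of the accounts approving them, and every transfer applies the same integer delta to both sides, so the totals never drift and there are no rounding issues:

```javascript
// Toy liquid-stake witness election: votes follow balances automatically.
const accounts = {
  alice: { balance: 100, approves: ['leader1'] },
  bob:   { balance: 50,  approves: ['leader1', 'leader2'] },
};
// Cumulative scores: sum of balances of each leader's approvers.
const leaderVotes = { leader1: 150, leader2: 50 };

// Every balance change updates the approved leaders by the same integer
// delta, so scores stay exactly in sync with the liquid balances.
function transfer(from, to, amount) {
  for (const w of accounts[from].approves) leaderVotes[w] -= amount;
  for (const w of accounts[to].approves)   leaderVotes[w] += amount;
  accounts[from].balance -= amount;
  accounts[to].balance += amount;
}

transfer('alice', 'bob', 40);
console.log(leaderVotes); // { leader1: 150, leader2: 90 }
```

Note how leader1's score is unchanged (both parties approve it) while leader2 gains exactly the 40 tokens that moved to bob: the election is fully determined by the current balances, however fast they move.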
NO EARLY USER ADVANTAGE
STEEM has had multiple events that influenced the distribution in a bad way. The most obvious one is the inflation settings. One day it was hella-inflationary, then suddenly with hard fork 16 it wasn't anymore. Another major one is the non-linear rewards that ran for a long time, which created a huge early-user advantage that we can still feel today. I liked linear rewards; it's what gives minnows their best chance while staying sybil-resistant. I just needed Avalon's inflation to be smart, not hyper-inflationary. The key metric to consider for this issue is the number of tokens distributed per user per day. If this metric goes down, then the incentive for staying on the network and playing the game goes down every day. You feel like you're making less and less from your efforts. If this metric goes up, the number of printed tokens goes up, the token is hyper-inflationary, and holding it feels really bad if you aren't actively earning from the inflation by playing the game. Avalon ensures that the number of printed tokens is proportional to the number of users with active stake. If more users come in, avalon prints more tokens; if users cash out and stop transacting, the inflation goes down. This ensures that earning 1 DTC will be about as hard today, tomorrow, next month or next year, no matter how many people have registered or left d.tube, and no matter what happens on the markets.
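The stated inflation rule can be sketched in a couple of lines (the constant is illustrative, not avalon's real parameter): total daily printing scales with the number of active stakers, so the per-user share stays flat regardless of network size.

```javascript
// Hypothetical per-user emission rate -- the real chain parameter may differ.
const TOKENS_PER_ACTIVE_USER_PER_DAY = 1;

// Total tokens printed per day is proportional to users with active stake.
function dailyInflation(activeUsers) {
  return activeUsers * TOKENS_PER_ACTIVE_USER_PER_DAY;
}

// Earning 1 DTC stays equally hard whether the network is small or large:
console.log(dailyInflation(1000) / 1000);   // 1 token per user per day
console.log(dailyInflation(50000) / 50000); // still 1 token per user per day
```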
NO LIMIT TO MY VOTING POWER
Another big issue that most steemians don't really know about, but that is really detrimental to STEEM, is how the voting power mana bar works. I guess having to manage a 2M SP delegation for @dtube really convinced me of this one. When your mana bar is full at 100%, you lose out on the potential power generation, and the rewards coming from it. And it only takes 5 days to go from 0% to 100%. A lot of people have very valid reasons to be offline for 5 days+; they shouldn't be punished so hard. This is why almost all big stakeholders make sure to always spend some of their voting power on a daily basis. And this is why minnows or smaller holders miss out on tons of curation rewards, unless they delegate to a bidbot or join some curation guild... meh. I guess a lot of people would rather just cash out than deal with the trouble of having to optimize their stake. So why is it even a mana bar? Why can't it grow forever? Well, everything in a computer has to have a limit, but why is this limit proportional to my stake? While I totally understand the purpose of making the bandwidth limited and forcing big stakeholders to waste it, I think it's totally unneeded and ill-suited for the voting power. As long as the growth of the VP is proportional to the stake, the system stays sybil-resistant, and there could technically be no limit at all, if it weren't for the fact that this runs on a computer where numbers have a limited number of bits. On Avalon, I made it so that your voting power grows virtually indefinitely, or at least I don't think anyone will ever reach the current limit of Number.MAX_SAFE_INTEGER: 9007199254740991, or about 9 Peta VP. If you go inactive for 6 months on an account with some DTCs, when you come back you will have 6 months worth of power generation to spend, turning you into a whale, at least for a few votes. Another awkward limit on STEEM is how a 100% vote spends only 2% of your power.
Not only does STEEM force you to be active on a daily basis, you also need to do a minimum of 10 votes / day to optimize your earnings. On Avalon, you can use 100% of your stored voting power in a single mega-vote if you wish; it's up to you.
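A toy model of the uncapped voting power described above (the regeneration rate is a made-up constant): VP accrues linearly with both stake and elapsed time, with no 100% plateau, so an account offline for 6 months comes back with 6 months of power instead of having stopped regenerating after ~5 days.

```javascript
// Hypothetical regeneration rate; the real chain constant may differ.
const VP_PER_TOKEN_PER_HOUR = 1;

// No mana-bar cap: power grows linearly with stake and elapsed time.
// Sybil-resistant because splitting stake across N accounts regenerates
// exactly the same total power.
function accruedVP(stake, hoursElapsed) {
  return stake * hoursElapsed * VP_PER_TOKEN_PER_HOUR;
}

const sixMonths = 6 * 30 * 24; // hours, approximating a month as 30 days
console.log(accruedVP(100, sixMonths));    // 432000 -- spendable in one mega-vote
console.log(accruedVP(50, sixMonths) * 2); // 432000 -- same total if stake is split
```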
A NEW PROOF-OF-BRAIN
No Author rewards
People should vote with the intent of getting a reward from it. If 75% of the value forcibly goes to the author, it's hard to expect a good return from curation. Steem is currently basically a complex donation platform. No one wants to donate when they vote, no matter what they say, and no matter how much vote-trading, self-voting or bid-botting happens. So in order to keep a system where money is printed when votes happen, if we cannot use the username of the author to distribute rewards, the only possibility left is to use the list of previous voters, aka "curation rewards" - the interesting 25% of STEEM that has been totally overshadowed by author rewards for too long.
STEEM has always suffered from the issue that the downvote button is unused, or when it's used, it's mostly for evil. This comes from the fact that in STEEM's model, downvotes are not eligible for any rewards. Even if they were, your downvote would be lowering the final payout of the content, and your own curation rewards... I wanted Avalon's downvotes to be completely symmetric to the upvotes. That means if we revert all the votes (upvotes become downvotes and vice versa), the content should still distribute the same amount of tokens to the same people, at the same time.
No payment windows
Steem has a system of payment windows. When you publish content, it opens a payment window where people can freely upvote or downvote to influence the payout happening 7 days later. This is convenient when you want a system where downvotes lower rewards. Waiting 7 days to collect rewards is also another friction point for new users; some of them might never come back 7 days later to convince themselves that 'it works'. On Avalon, when you are among the winners of curation after a vote, you earn it instantly in your account, 100% liquid and transferable.
Unlimited monetization in time
Indeed, the 7-day monetization limit has been our biggest issue for our video platform since day 8. This incentivized our users to create more frequent, but lower quality content, as they know that they aren't going to earn anything over the 'long haul'. Monetization had to be unlimited on DTube, so that even a 2-year-old video could be dug up and generate rewards in the far future. Infinite monetization is possible, but as removing tokens from a balance is impossible, downvotes cannot remove money from the payout like they do on STEEM. Instead, downvotes print money the same way upvotes do; downvotes still lower the popularity in the hot and trending rankings, and should only reward other people who downvoted the same content earlier.
New curation rewards algorithm
STEEM's curation algorithm isn't stupid, but I believe it lacks some elegance. The 15-minute 'band-aid' they had to add to prevent curation bots (bots who auto-vote as fast as possible on content from popular authors) proves it. The way it distributes the rewards also feels very flat and boring. The rewards for my votes are very predictable, especially if I'm the biggest voter / stake holder for the content. My own vote is paying for my own curation rewards; how stupid is that? If no one else votes after my big vote despite a popularity boost, it probably means I deserve 0 rewards, no? I had to try different attempts to find an algorithm yielding interesting results, with infinite monetization, and without obvious ways to exploit it. The final distribution algorithm is more complex than STEEM's curation, but it's still pretty simple. When a vote is cast, we calculate the 'popularity' at the time of the vote. The first vote is given a popularity of 0; the next votes are defined by (total_vp_upvotes - total_vp_downvotes) / time_since_1st_vote. Then we look into the list of previous votes, and we remove all votes in the opposite direction (up/down). Then we remove all the votes with a higher popularity if it's an upvote, or the ones with a lower popularity if it's a downvote. The remaining votes in the list are the 'winners'. Finally, akin to STEEM, the amount of tokens generated by the vote will be split between winners proportionally to the voting power spent by each (linear rewards - no advantages for whales) and distributed instantly. Instead of purely using the order of the votes, Avalon distribution is based on when the votes are cast, and each second that passes reduces the popularity of a content, potentially increasing the long-term ROI of the next vote cast on it. (It's possible to chart the popularity that influences the DTC monetary distribution directly in the d.tube UI.) This algorithm ensures there are always losers.
The last upvoter never earns anything, and neither does the person who upvoted at the highest popularity, nor the one who downvoted at the lowest popularity. All the other ones in the middle may or may not receive anything, depending on how the voting and popularity evolved over time. The one with an obvious advantage is the first voter, who is always counted as popularity 0. As long as the content stays at a positive popularity, every upvote will earn him rewards. Similarly, being the first downvoter on an overly-popular content could easily earn you 100% of the rewards on the next downvote, which could be from a whale, earning you a fat bonus. While Avalon doesn't technically have author rewards, the first-voter advantage is strong, and the author has the advantage of always being the first voter, so the author can still earn from his potentially original creations; he just needs to commit some voting power on his own content to be able to publish.
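The whole distribution scheme can be condensed into a short sketch (a simplified reconstruction of the description above, not avalon's actual implementation; time is in abstract integer units and token amounts are illustrative):

```javascript
// Popularity-based curation rewards: each vote pays out to earlier votes
// in the same direction that were cast at a "no worse" popularity.
function castVote(content, vote, tokensPrinted, now) {
  // Popularity at the time of this vote; the first vote is pinned at 0.
  let pop = 0;
  if (content.votes.length > 0) {
    const up = content.votes.filter(v => v.vp > 0).reduce((s, v) => s + v.vp, 0);
    const down = content.votes.filter(v => v.vp < 0).reduce((s, v) => s - v.vp, 0);
    pop = (up - down) / (now - content.firstVoteTs);
  } else {
    content.firstVoteTs = now;
  }

  // Winners: earlier votes in the same direction, at a popularity no worse
  // than the current one (lower for upvotes, higher for downvotes).
  const winners = content.votes.filter(v =>
    (vote.vp > 0) === (v.vp > 0) &&
    (vote.vp > 0 ? v.pop <= pop : v.pop >= pop)
  );

  // Split the printed tokens proportionally to voting power spent (linear).
  const totalVp = winners.reduce((s, v) => s + Math.abs(v.vp), 0);
  for (const w of winners) w.earned += tokensPrinted * Math.abs(w.vp) / totalVp;

  content.votes.push({ ...vote, pop, earned: 0 });
}

const content = { votes: [], firstVoteTs: 0 };
castVote(content, { voter: 'author', vp: 10 }, 5, 0);   // first vote: no earlier winners
castVote(content, { voter: 'minnow', vp: 10 }, 5, 10);  // author wins the 5 tokens
castVote(content, { voter: 'whale', vp: 100 }, 50, 20); // author and minnow split 50
console.log(content.votes.map(v => v.earned)); // [ 30, 25, 0 ]
```

Notice the properties claimed above falling out of the sketch: the author, pinned at popularity 0, wins on every subsequent upvote; the last voter has earned nothing; and flipping every vote's sign (up becomes down) would distribute the same amounts to the same people, which is the up/down symmetry described earlier.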
ONE CHAIN <==> ONE APP
More scalable than shared blockchains
Another issue with general-purpose blockchains like ETH/STEEM/EOS/TRX, which are currently hosting dozens of semi-popular web/mobile apps, is the reduced scalability of such shared models. Again, everything in a computer has a limit. For DPOS blockchains, 99%+ of the CPU load of a producing node goes to verifying the signatures of the many transactions coming in every 3 seconds. And sadly this fact will not change with time. Even if we had a huge breakthrough in CPU speeds today, we would need to update the cryptographic standards for blockchains to keep them secure. This means it would NOT become easier to scale up the number of verifiable transactions per second. Oh, but we are not there yet, you're thinking? Or maybe you think that we'll all be rich if we reach the scalability limits, so it doesn't really matter? WRONG. The limit is the number of signature verifications the most expensive CPU on the planet can do. Most blockchains use the secp256k1 curve, including Bitcoin, Ethereum, Steem and now Avalon. It was originally chosen for Bitcoin by Satoshi Nakamoto, probably because it's decently quick at verifying signatures and seems to be backdoor-proof (or else someone is playing a very patient game). Maybe some other curves exist with faster signature verification speed, but it won't be improved many-fold, and will likely require much research, auditing, and time to get adopted considering the security implications.
In 2015 Graphene was created, and Bitshares was completely rewritten. This was able to achieve 100,000 transaction per second on a single machine, and decentralized global stress testing achieved 18,000 transactions per second on a distributed network.
So BitShares/STEEM and other DPOS graphene chains in production can validate at most 18,000 txs/sec, so about 1.5 billion transactions per day. EOS, Tendermint, Avalon, LIBRA or any other DPOS blockchain can achieve similar speeds, because there's no planet-killing proof-of-work, and thanks to the leader-based/democratic system that reduces the number of nodes taking part in the consensus. As a comparison, there are about 4 billion likes per day on instagram, so you can probably double that with the actual uploads, stories, comments, password changes, etc. The load is also likely uneven through the day; some hours will probably see twice the average. You wouldn't be able to fit Instagram in a blockchain, ever, even with the most scalable blockchain tech on the world's best hardware. You'd need like a dozen of those chains. And instagram is still a growing platform, not as big as Facebook, or YouTube. So, splitting this limit between many popular apps? Madness! Maybe it's still working right now, but when many different apps reach millions of daily active users plus bots, it won't fit anymore. Serious projects with a big user base will need to rethink the shared blockchain models like Ethereum, EOS, TRX, etc because the fees in gas or necessary stake required to transact will skyrocket, and the victims will be the hordes of minnows at the bottom of the distribution spectrum. If we can't run a full instagram on a DPOS blockchain, there is absolutely no point trying to run medium+reddit+insta+fb+yt+wechat+vk+tinder on one. Being able to run half an instagram is already pretty good and probably enough to actually onboard a fair share of the planet. But if we multiply the load by the number of different app concepts available, then it's never gonna scale. DTube chain is meant for the DTube UI only. Please do not build something unrelated to video connecting to our chain, we would actively do what we can to prevent you from growing.
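The arithmetic behind those claims, for anyone who wants to check it:

```javascript
// Back-of-envelope check: 18,000 verified tx/s sustained for a full day,
// versus an Instagram-scale write load.
const TX_PER_SEC = 18000;           // graphene stress-test figure quoted above
const perDay = TX_PER_SEC * 86400;  // 86,400 seconds in a day
console.log(perDay);                // 1555200000 -- about 1.5 billion tx/day

// ~4B likes/day, doubled to account for uploads, stories, comments, etc.
const instagramActionsPerDay = 8e9;
console.log(Math.ceil(instagramActionsPerDay / perDay));     // 6 chains at average load
console.log(Math.ceil(2 * instagramActionsPerDay / perDay)); // 11 at 2x peak hours
```

At peak hours the estimate lands around eleven dedicated chains for Instagram alone, which is where the "you'd need like a dozen of those chains" figure comes from.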
We want this chain to be for video content only, and the JSON format of the content should always follow the one used by d.tube. If you are interested in avalon tech but your project isn't about video, it's strongly suggested to fork the blockchain code and run your own avalon chain with a different origin id, instead of trying to connect your project to dtube's mainnet. If you still want to do it, chain leaders would be forced to actively combat your project, as we would consider it useless noise inside our dedicated blockchain.
Another issue with sharing a blockchain is governance. Tons of features enabled by avalon would be controversial to develop on STEEM, because they'd only benefit DTube, and might even hurt/break some other projects. At best they'd be put at the bottom of a todo list somewhere. Having a blockchain dedicated to a single project enables it to quickly push updates focused on a single product, not dozens of totally different projects. Many blockchain projects are trying to make decentralized governance a reality, but this is absolutely not what I am interested in for DTube. Instead, in avalon the 'init' account, or 'master' account, has very strong permissions. In the DTC case, @dtube:

* will earn 10% fees from all the inflation
* will not have to burn DTCs to create accounts
* will be able to do certain types of transactions when others can't:
  * account creation (during the steem exclusivity period)
  * transfers (during the IEO period)
  * transferring voting power and bandwidth resources (used for easier onboarding)

For example, for our IEO we will set up a mainnet where only @dtube is allowed to transfer funds or vote until the IEO completes and the airdrop happens. This is also what enabled us to create a 'steem-only' registration period on the public testnet for the first month. Only @dtube can create accounts; this way we can enforce a 1-month period where users can port their username for free, without impostors having a chance to steal usernames. Through the hard-forking mechanism, we can enable/disable these limitations and easily evolve the rules and permissions of the blockchain, for example opening monetary transfers at the end of our IEO, or opening account creation once the steem exclusivity ends. Luckily, avalon is decentralized, and all these parameters (like the @dtube fees and @dtube permissions) are easily hardforkable by the leaders.
@dtube will however be a very strong leader in the chain, as we plan to use our vote to at least keep the #1 producing node for as long as we can. We reserve the right to 'not follow' a hardfork. For example, it's obvious we wouldn't follow something like reducing our fees to 0%, as it would financially endanger the project; we would rather just continue our official fork on our own and plug the d.tube domain and mobile app into it. On the other end of the spectrum, if other leaders think @dtube is being tyrannical one way or another, leaders will always have the option of declining new hardforks and putting the system on hold; then @dtube will have an issue and will need to compromise or betray the trust of 1/3 of the stakeholders, which could prove costly. The goal is to have harmonious, enterprise-level decision making among the top leaders. We expect these leaders to be financially and emotionally connected with the project and to act for its good. @dtube is expected to be the main good actor for the chain, and any permission given to it should be granted with the goal of increasing the DTC marketcap, and nothing else. Leaders and @dtube should be able to keep cooperation high enough to keep the hard-forks focused on the actual issues, and flowing faster than in other blockchain projects striving for a totally decentralized governance, a goal they are unlikely to ever achieve.
A lot of hard-forking
Avalon is easily hard-forkable, and will get hard-forked often, on purpose. No replays will be needed for leaders/exchanges during these hard-forks; just pull the new hardfork code and restart the node before the planned hard-fork time to stay on the main fork. Why is this so crucial? It's about game theory. I have no formal proof for this, but I assume a social and financial game akin to the one played on steem since 2016 is impossible to perfectly balance, even with a thorough, methodical process. It's probably because of some psychological reason, or maybe just the fact that humans are naturally greedy. Or maybe it's just the sheer number of players. They can gang up together, try to counter each other, and find all sorts of creative ideas to earn more and exploit each other. In the end, the slightest change in the rules can cause drastic gameplay changes. It's a real problem, but luckily it's been faced by other people in the past. Similarly to what popular and successful massively multiplayer games have achieved, I plan to patch or suggest hard-forks for avalon's mainnet every two weeks. The goal of this perfect-imbalance concept is to force players to re-discover their best strategy often. By introducing regular, small, and semi-controlled changes into this chaos, we can fake balance. This will require players to be more adaptive and aware of the changes. It prevents the game from becoming stale and boring for players, while staying fair.
Death to bots
Automators, on the other hand, will need to re-think their bots and go through the development and testing phases again on every new hard-fork. It will be an unfair cat-and-mouse game. Making small and semi-random changes in frequent hard-forks will be an easy task for the dtube leaders, compared to the workload generated for maintaining the bots. In the end, I hope their return on investment will be much lower than that of the bid-bots, to the point where there is no automation at all. Imagine how different things would have been if SteemIt Inc had acted strongly against bid-bots or other forms of automation when they started appearing. Imagine if hard-forks had been frequent and they had promised to fight bid-bots and their ilk. Who would have been crazy enough to make a bid-bot then, apart from @berniesanders? I don't want you to earn DTCs unless you are human. The way you are going to prove you are human is not by sending a selfie of you with your passport to a 3rd-party private company located on the other side of the world. You will just need to adapt to the new rules published every two weeks, and your human brain will do it subconsciously by just playing the voting game and seeing the rewards come in. All these concepts are aimed at directly improving d.tube, making it more resilient, and helping it scale both technologically and economically. Having control over the full tech stack required to power our dapp will prevent issues like the one we had with the search engine, where we relied too heavily on a 3rd-party tool, and that created a six-month-long bug that basically broke 1/3 of the UI. While d.tube's UI can now run totally independently of any other entity, we kept everything we could working with STEEM, and the user is now able to transparently publish/vote/comment videos on 2 different chains with one click.
This way we can keep on leveraging the generalist features STEEM does well and our new chain doesn't focus on, such as the dollar-pegged token, the author rewards/donation mechanism, the tribes/communities tokens, and simply the extra exposure d.tube users can get from other websites (steemit.com, busy.org, partiko, steempeak, etc.), whose audience is larger than the number of people using d.tube directly. The public testnet has been running pretty well for 3 weeks now, with 6000+ accounts registered and already a dozen independent nodes popping up and running for leader. The majority of the videos are cross-posted on both chains, and the daily video volume has slightly increased since the update, despite the added friction of the new 'double login' system and several UI bugs. If you've read this article, I'm hoping to get some reactions from you in the comments section! Some even more focused articles about avalon are going to pop up on my blog in the following weeks, such as how to get a node running and how to run for leader, so feel free to follow me to get more news and help me reach 10K followers ;)
Stay away from the SegwitGold pump and dump scheme. Devs get coins just for creating it is a clear sign that it's bad news.
Led by anonymous lead developer h4x3rotab, it's basically intended to make a few no-namers rich. They get a bunch of coins just for being developers, hooray for them! Sounds decentralized to me /s. It's ironic that their tagline is "make Bitcoin decentralized again". It has a proof-of-work change to Equihash, a GPU-based algorithm that is prone to botnet attacks. If you want a legit Equihash altcoin, check out Zcash, which implemented zk-SNARKs from the start and has private transactions - an improvement on Bitcoin rather than a stale copy. I think BitcoinCash/Bitcoin and Ethereum are both looking to include zk-SNARK proofs at some point in the future without changing the POW. Oh, and check out their site: "Error establishing a database connection". Definitely trustworthy developers here! Use this post to shit-talk SegwitGold, make general jokes about how trashy Segwit is, or to point out that no exchanges will support it. Don't waste time with these people trying to capitalize on stupidity.
Refutation to savingprivatedash's Proposal to Demote Ryan Taylor
Recently a proposal was submitted to the Dash masternode network requesting the demotion of Ryan Taylor, the CEO of Dash Core Group (DCG). DCG is the core development team hired by the Dash DAO. The proposal’s author, savingprivatedash, provided 7 points to support his argument. I am going to discuss each of these points directly. This is the link to the proposal - https://www.dashcentral.org/p/demote-ryan-taylor-to-an-advisory-role (1) “Ryan destroyed the market's confidence in Dash by repeatedly breaking promises and missing deadlines. Dash was once valued at 0.09BTC and it is now 0.02, in spite of millions of dollars available to him. Vault accounts, usernames, friends lists, easy to use mobile wallets, marketplace. None of the 2016 promises were kept. Even Amanda Johnson, once Dash's biggest fan and now nowhere to be seen, said publicly she would give DCG until Dec 31 2018 to deliver on Evolution. Unfortunately, she is in for yet another disappointment, since we are in August 2018 and there isn't even a roadmap yet. If Ryan were to present one during this quarterly call, there is no reason he should be believed.”
Ryan has not broken a single promise or missed a deadline by any unreasonable amount. The original roadmap for Evolution was developed before Ryan was CEO and before any real work on Evolution had started. It was overly optimistic, and Ryan said so when he took the position of DCG CEO. The timeline was readjusted to Evolution being released in Q4 2018. Dash is still on track to meet that goal.
All coins have suffered significant losses during this bear market. Even though Dash has fallen significantly from its all-time high, it's not as bad as savingprivatedash makes it sound. Dash had a higher peak relative to BTC during the late 2017 bull run, but beyond that short and exceptional period, the Dash price has moved pretty much in lockstep with BTC. Savingprivatedash is cherry-picking his data to paint a false picture.
Usernames, friend lists, and easy-to-use mobile wallets are all part of Evolution, which is scheduled to be released later this year. User vaults, by which I assume he means masternode shares, are not scheduled to be included with the initial release of Evolution. There is no defined timeline for this feature, but it's likely going to be at least a year from now.
Amanda is still actively involved in Dash, just not in the public. While it's certainly possible, I have never heard her give an ultimatum to DCG about Evolution nor has savingprivatedash given any evidence to support this claim. Regardless, it doesn't really matter. While Amanda will always hold a special place in Dash, Dash is not dependent upon what Amanda thinks or does.
The new roadmap was released with the Q3 quarterly call as planned. It was known to be scheduled to be released at that time and before savingprivatedash made his proposal. He is being disingenuous in this claim.
(2) “Ryan has grown his company irresponsibly. There are 6,176 DASH available in the budget and DCG has about $500,000 in monthly expenses. Dash is now below $200, and $500,000 / 6176 = $80.95. That means if the DASH price goes below $80, not only there won't be funds for any other community projects, but also not enough to pay the salaries of DCG employees. The threshold for complete chaos is probably around $150-$160, because there are other financial obligations that they need to meet besides salaries. I wonder how much confidence the employees have in Ryan's leadership knowing their salaries are at risk.”
Ryan has grown and restructured DCG rather significantly but he has not done so irresponsibly. Dash is much bigger than just a blockchain and needs a full sized team to meet these demands. What Ryan has put together is a professional team with loads of experience. Ryan has built a foundation that is able to take Dash far into the future. At this point, there is still plenty of money in the monthly recurring treasury to pay for DCG. If the price were to drop to $80, the worst case scenario is they tighten their belts. Several Core members have promised to forgo their salary if treasury were not able to sustain them. There are also other contingency plans but I don't know what they are specifically. Lastly, with significant real world adoption and all the new partnerships and integrations, Dash is not likely to ever drop as low as $80. Also, this entire argument falls apart as soon as the price rises.
The treasury was created to fund development, nothing more. The treasury funds that other projects get are just gravy, and most projects realize this. It certainly has been difficult for many of them though, I won't deny that. However, other means of funding like DashBoost and DashDonate have come on the scene to fill in the gaps left by the treasury. Also, many masternode owners have personally donated significant sums to help out projects.
That being said, during the last quarterly call, DCG made a pledge to never ask for more than 60% of the treasury in normal circumstances and never more than 80% in exceptional circumstances.
The threshold for complete chaos is demonstrably not $150 as he claims, as the price dropped lower than that and there was virtually no discord in the community. This is nothing but speculation designed to sow dissent. The fact that this proposal has the most negative votes of any proposal in Dash's history is proof of the community's support for Ryan.
(3) “Ryan had access to more than $30,000,000 USD in funding and didn't create a safety net for DCG. Because of his unforgivable mistake, other important community projects are either already defunded or in serious risk of being defunded. Ryan jeopardized the financial stability of his entire company, and many other community projects, in spite of the ludicrous amounts of money that were available to him.”
There is and was a safety net but it was not as large as they wanted due to tax reasons. DCG had already started to develop a legal framework to get around this tax burden but the bear market hit before they were done. Keep in mind that Dash is navigating uncharted legal waters. Dash is the first DAO to ever be legally recognized and it takes time to develop such a legal framework. It is true that other projects have not been able to be funded during the bear market but as I already said, the treasury is to fund DCG first and foremost.
savingprivatedash is playing both sides of the coin here. Sure, Ryan may have had access to $30,000,000 but that was during the ATH. DCG keeps its funds in Dash. If they exchanged them to USD then savingprivatedash would have accused DCG of not truly believing in the project and selling out. Damned if you do, damned if you don't.
(4) “Technology. Big promises were made and we expected reasonable results in reasonable times. Users, merchants, investors and everyone else in the ecosystem had high expectations but didn't see meaningful releases in the past 3 years. We still don't have features promised in Evolution, Private Send still takes way too long (it took me almost 2 days to mix 5 DASH), Dash.org and the Dash Core Wallet are still the same they were 2 years ago, and so on. There are thousands of other cryptocurrencies being actively developed and timing is essential. People cared about logins and passwords in 2016, but won't in 2019-2020 if and when this is released. Perhaps we would do better by breaking Dash Core into individual teams, where each apply for their own funding. Instead of 100 DASH all going to DCG, the Marketing Team applies for 30, Evolution Team for 60 and Business Development for 10.”
I already covered how the initial timeline was developed before Ryan took over as CEO and how Dash is on track since the roadmap was revised.
PrivateSend does not take days to mix 5 Dash, this is an outright lie. I regularly churn 5 Dash and it only takes a few hours. I believe it's going to get even faster with v12.4 and then even faster (maybe even instant, I'm not sure about that) with Evolution.
He is correct in that Dash.org has not changed and people are starting to complain. We are told there is a full rework in progress and it's supposed to be ready with the release of Evolution. I haven't seen anything to back up this claim but nor have I seen anything to doubt it. If the website is not ready for the launch of Evolution, I would expect/request the contracts of those responsible to not be renewed.
He is lying when he says that the dash core wallet has not changed in 2 years. There have been several updates, only the interface has not changed. The changes have all been related to building a foundation for Evolution, not making it look pretty. That would be a waste of time and resources.
There are demos of the features of Evolution (usernames, dsdk, dapi, etc). All of the hard work is done, it's just a matter of putting it all together.
Sure, there are thousands of cryptocurrencies being developed but most of them are crap and won’t survive. None have the four year history of continual innovation that Dash has.
By asking that DCG be separated into several groups, I believe that savingprivatedash is actually trying to break up the team that Ryan has put together. The goal of DCG is to create a base for other projects to build from. To do this, they need a singular focus that can only come from a cohesive team.
(5) “Marketing. Ryan made the mistake of promoting Fernando Gutierrez to CMO (Chief Marketing Officer) back in Jan 17 2018. As a lawyer with no experience, creativity, or talent for marketing, Fernando has an impressive track record of zero results in 8 months. He had at his disposal millions of dollars and still have nothing to show for. The Dash brand is in dire need of professional tender love and care. He is doing the best he can with the limited resources he has (talent, experience, creativity), and it is Ryan's fault for misallocating human resources. The new CEO should move Fernando to a different position and instruct HR to hire a new CMO.”
While I know very little about Fernando, savingprivatedash’s claims are pure speculation. That being said, I don’t entirely disagree with his critique of the results. So far, I'm not impressed with the marketing efforts that DCG has initiated. We have had better success with non-DCG efforts. However, my knowledge of DCG’s marketing is superficial and there very well could be more going on behind the scenes I’m not aware of.
(6) “Business Development. Ryan made the mistake of hiring Bradley Zastrow on Dec 15 2017. For the past 8 months, the guy has been bullshitting his way with meaningless updates and also zero results. Things like "30 conversations focusing on 9 integrations", and "30+ conversations focusing on 6 integrations" are his way of saying he is working, but not delivering. Imagine a sales person that does not make a single sale. Ever. His list of accomplishments includes things like "Attended Consensus" and "Attended Alt36 conference". If Bradley were a community project he would have been defunded after just two months. He is allowed to underperform and underdeliver without consequences, in spite of the disproportional salary he receives.”
This point is a load of bull. Bradley has brought in several important integrations and partnerships in Q3 alone including General Bytes, Rewards.com, bitgo, Paycent, and Alogateway. He has already brought in more integrations this quarter.
With the exception of Bitcoin, I don’t know of any other projects who have more partners and integrations than Dash. Certainly not Bitcoin Cash, Litecoin, or Monero, Dash’s closest competitors.
(7) “Ryan is not a leader. Since Evan Duffield left, Dash Core Group has been a stale and boring company that does not innovate! Ryan failed to create a sense of urgency and a culture of results. His company has taken millions of dollars from the budget and still does not have any meaningful achievements on Marketing, Business Development, and most importantly, on Technology. No other entity in the Dash ecosystem consumes so much resources and delivers so little. Even small community projects with modest budgets have far more to show for than DCG's bloated and fully funded departments. We need a dependable, energetic, and passionate CEO. One that would care deeply about our brand, that would be involved in important community projects, that would have a say on important proposals, that would DELIVER and KEEP HIS PROMISES.”
The Dash community and masternodes seem to strongly disagree with this point, thus proving that Ryan is a leader. While DCG has changed significantly under Ryan's leadership, it most definitely is not stale or boring. DCG is now the most professional and best-organized development team in crypto, with maybe the exception of Ripple. DCG has been blazing a very exciting trail in many areas beyond the basic blockchain. Dash Ventures, for example, is a mind-blowing game changer.
As I said at the start of my refutation, Ryan has not missed a goal or broken a single promise. The only delays have been typical of every software project ever.
Ultimately, even though it was presented as concern for the Dash network, this proposal is nothing more than an insidious attempt to create dissent in the Dash community and tarnish the reputation of Dash among those who don't follow the project closely. It failed. It showed not only how open and decentralized the Dash system is (someone can submit a proposal for a personnel change and have the network vote on it), but also, by being the least successful proposal in Dash's history, that the Dash community is more united than ever. **Edited to fix formatting**
So you have an army of ants living inside your computer, capable of apparently changing life on the planet as we know it. And as a friendly gesture, you managed to get the cereal and sugar water they asked for earlier. Because let's face it, the impossible has already apparently happened. The least you can do before considering a can of insecticide is to at least see just how far this all can go. And while you aren't the most childish person around, it is fun to watch ants move each piece of cereal into their home little by little. The downside is: you now have a colony of ants and most of a balanced breakfast inside your computer. There's no mess however, and if anything the lights inside the rig give off a rather ethereal glow among the whole thing. Another text message. "Thank you for the assistance. We will negotiate shortly." "Shortly as in?" "Let's say an hour." An hour came and went. You didn't leave, because truthfully who would? This whole thing is rather peculiar, slightly alarming, and questionable to say the least. Who's to say what would happen if you left the room? You're usually not one to judge people by first glance, but again, you're the one talking to a colony of ants. Not the weirdest thing anyone's ever done. Well, as far as strange occurrences go... Eh, best not to think further on that now. You're getting ahead of yourself. After checking the clock, you figure now is a good time to try to see what exactly the ants have in mind involving the whole cryptocurrency thing. Bitcoins are huge right now after all, and it made you wonder exactly how you'd even remotely get into that world considering your lack of expertise. Say what you want about trying something new. A lot of them don't involve dumping money you don't exactly have into more money you may not even get a chance to collect. "Alright Human, let's get to work. We have our Queen's approval on the matter." "I thought you were the Ant King though." "I am. But happy wife, happy life.
Besides, the Queen runs things. Ant kingdoms are usually a matriarchy on a biological level. I'm just the ambassador at this point." "Today I learned." "Indeed." "So, about that whole money and data thing?" "Yes, of course. Using your credit cards, we were able to purchase you several of the more up-and-coming proof-of-work currencies. It helps all of us greatly that you've managed to keep a decent credit score." Wait, what? While you were all for this sort of thing at the beginning, you couldn't help but panic as you went across the room to retrieve your laptop. However, the ants already had everything pulled up. You couldn't believe the things they had found. Also, they had somehow managed to stream Ant Man in the interval. "You used my credit cards?!" You furiously texted. "It's a small investment for a large return. Besides, we heard this movie is rather good." Another stream of ants had tipped over the cereal box. Instead of having to clean it, you watched several of them maneuver some stale marshmallows into the computer. "You're just having a good time in there, aren't you?" "Look at the bright side. We're not roaches."
New release! You can get the latest official binaries here: https://github.com/riecointeam/riecoin/releases Please upgrade as soon as possible. To upgrade, no further action has to be taken other than replacing the wallet binaries and restarting the node. That said, always make sure that you have backups of your private keys. A lot of effort has gone into modernizing the Riecoin wallet from 0.10.2 to 0.16.3. To complete the upgrade and move forward (start listing on other exchanges, improve the proof of work and beat new records, etc.), some softforks have to be enabled. However, the activation process is getting stale. Indeed, 95% of the blocks must indicate support for these forks during a 12 h window, which is basically impossible because of a couple of miners. Neither has ever appeared publicly to explain why they are not upgrading. As it is clear that the majority of the Riecoin community wants to upgrade, and as we should not be held back by these non-cooperating miners, we decided to lower the threshold to 77.5% (80% might still be too much, 75% a bit low as we reached 76.5% recently). The window has been increased from 12 h to 1 week to ensure a safe activation. We also want to give everyone enough time to upgrade to 0.16.3.1. New miners are welcome, and additional mining power from the current ones as well, in order to reach this 77.5% threshold and definitively make 0.10.2 a thing of the past. We can do it! New developers are more than welcome too. Bitcoin just released 0.18.0, and Riecoin shall never be outdated again like it was in the 0.10.2 era. Join our Discord server to get even more involved! https://discordapp.com/invite/2sJEayC With some new rules in the server, invite links can now be freely shared.
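For illustration, the activation rule boils down to a simple share check over the signalling window (a sketch only; the real window is one week of Riecoin blocks, so the block counts below are placeholders):

```python
def fork_activates(signalling: int, window: int, threshold: float = 0.775) -> bool:
    # The softfork locks in once the share of blocks signalling support
    # within the measurement window reaches the threshold (77.5% here,
    # lowered from the original 95%).
    return signalling / window >= threshold

# Illustrative numbers only, not real window sizes.
print(fork_activates(765, 1000))  # 76.5% support, like the recent peak: False
print(fork_activates(780, 1000))  # 78.0% support: True, fork activates
```

With the old 95% threshold, a couple of holdout miners with more than 5% of the hashrate could block activation indefinitely; at 77.5%, they no longer can.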
Ars, of course, "Tesla nearly triples Model 3 production from the previous quarter" "Tesla says that it will enjoy positive cash flow and positive net income in the third quarter of 2018. "
The full article is here, but read the comments too.
I filed the story at around 11:15am, at which point the up 3 percent figure was accurate (it was up 2.7 percent, according to Google). It's true that in the 45 minutes since then the stock price has been falling. Regardless, I would say that the first hour or so after an announcement is the best window for judging the market's reaction to a news event. Tesla's announcement hit my inbox at 9:12am, so the market had plenty of time to digest the numbers at the opening bell at 9:30. The more recent decline might reflect second thoughts about those numbers, but it could also reflect any number of other developments since the market opened.
The market had plenty of time (9:12 - 9:30 = 18 minutes) to digest the numbers, but the decline of the share price that started at 10 "could reflect any number of other developments." Yeah, so many things could happen in those 30 minutes. Apparently that's more likely than the pump simply not working and people not buying those numbers. I can't believe Ars would write something so ridiculous. If the price goes up, it's proof that Tesla is doing well. If the price goes down literally half an hour later, it's something else, not the 5K2K FactoryGate. Is this how it really works? Perhaps the heat wave and Brazil's weak start against Mexico in the World Cup brought down TSLA at 10:00. It could be any number of things. When Ars reports on stories with fluctuating prices, they post "Update: The price is now so and so" at the bottom, like here, with an [Updated] addition to the headline. The change of bitcoin's value from $7500 to $7000 doesn't change the story, but they still reported it. In the case of TSLA, however, it's not an issue of quantity but quality. The stock went down, but they reported it as going up. No need to update the story or even add a tiny note at the end? Tesla reported the best news they could ever report, filled with corporate jargon, production manipulation, media manipulation, etc., and the share price still went down. Ars just quickly wrote an article to capture the bump when the price went up and published it before the price went down. I really wonder how these blogs and sites would report a Tesla bankruptcy. Edit: Still no update on the Ars article. When Ars wrote their article, the share price had gone from 342.95 to 359.81. They considered this newsworthy. At 11:35 the Ars article was published, but in less than 25 minutes, this gain was gone. It would've been nice to mention that in the article as an update. I'm not expecting a constant ticker on the site, but now we've gone from 342.95 to 310.86. That's not enough to warrant an updated line?
No. Up = Tesla is doing well. Down = market has its ups and downs. Of course. Long story short: Ars wrote a celebratory article on Tesla that stayed fresh for 15 - 25 minutes. They did not update the article when the facts changed (which they do with Bitcoin, shooters, etc.) and we are now a day later, the article is on page one of Ars, and the day old article is not just stale, but completely out of touch with reality. There is a 13% discrepancy, but who cares?
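For what it's worth, the percentages work out like this from the prices quoted above (342.95 open, 359.81 at the time of the article, 310.86 a day later):

```python
open_price, reported_peak, later = 342.95, 359.81, 310.86

def pct_change(a: float, b: float) -> float:
    # simple percentage change from a to b
    return (b - a) / a * 100

print(round(pct_change(open_price, reported_peak), 1))  # 4.9  (the gain Ars reported)
print(round(pct_change(reported_peak, later), 1))       # -13.6 (the swing since then)
```

That swing from the reported peak is where the "13% discrepancy" comes from.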
DPOS System Architecture: Elisia will launch its mainnet with 51 block producers authorized to process transactions. For consensus building, the block producers are elected into a round of 51; each producer gets one block per round and is rewarded for validating incoming transactions and producing its block of transactions. A block released by one producer is validated by the next, and the next, and so forth; if not validated, it is not built upon. A block accepted by a quorum of producers is declared immutable, and the chain of immutable blocks becomes in effect a checkpoint. As in proof of work, producers can censor (ignore) messages, or they can front-run by introducing their own transactions using their superior knowledge of the future. https://preview.redd.it/yztj0ap1dg421.png?width=852&format=png&auto=webp&s=66719f22d7c1a4ab4591e9ce1f3e14e2fe8ba571
GOVERNANCE IN ARCHITECTURE
To provide transparency in block producer selection and governance over bad acts by producers, each round of producers is continuously elected by the community using proof of stake (PoS). Block Production: Bitcoin has a time between blocks of roughly 10 minutes, but with natural variance this can on occasion lead to fairly long waits before the next block is mined. Newer ledger designs such as Ethereum have improved upon this and benefit from much shorter block times (15 seconds) without loss of security or miner centralization from high rates of orphan/stale blocks. Elisia will have a block time of 5 seconds.
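The one-block-per-producer-per-round rotation described above can be sketched like this (an illustrative model, not Elisia's actual code; producer names and the scheduling function are made up for the example, only the 51-producer round and 5-second block time come from the text):

```python
PRODUCERS = [f"producer_{i:02d}" for i in range(51)]  # 51 elected block producers
BLOCK_INTERVAL = 5                                    # seconds per block (Elisia)

def schedule(start_slot: int, n_blocks: int):
    # One block per producer per 51-block round, in a fixed rotation:
    # the slot number alone determines whose turn it is.
    return [(slot * BLOCK_INTERVAL, PRODUCERS[slot % len(PRODUCERS)])
            for slot in range(start_slot, start_slot + n_blocks)]

for t, name in schedule(49, 4):
    print(t, name)  # slots 49 and 50 finish the round, then producer_00 is up again
```

In a real implementation the producer list would be re-shuffled each round from the PoS election results; here it is fixed to keep the rotation visible.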
Hello everyone! I'm happy to share what I've been working on with you today. This is the latest version of COSMiC for ERC-918 tokens. This update includes new features, under-the-hood improvements and (of course) optimizations to the CUDA core for greater efficiency. I call this build "beta", although it appears to be very stable. With miners' feedback, I'll make any desired improvements for 4.1.3. :) Supports Mining: 0xBitcoin, KiwiToken, 0xLTC (merged mining proof of concept token), Caelum(CLM), S.E.D.O. (standalone or merged mining with 0xBTC), CatEther(0xCATE), Bitcoin Classic Token (on the Ethereum Classic network), and many other ERC-918 token varieties. Requires: nVidia (CUDA) GPUs, any 64-bit CPU (>2 threads recommended), any 64-bit Windows version (Windows 7 should work.) Important Note: Now built against CUDA v10.0 - It is strongly recommended that you update your nVidia graphics drivers. It is NOT necessary to install the CUDA toolkit. :) FEATURES:
Mining on multiple CUDA devices in one instance, auto-detected
Mines on CUDA devices and sends results to pools (TokenPool etc.)
Fully-integrated Graphical User Interface for Windows 64-bit (Intended to make Token Mining as easy as possible for newcomers)
All settings can be configured in the GUI (no manual config editing required!)
Integrated Hardware Monitoring and Safety Features ('Watchqat')
Faster and more efficient, with lower CPU/RAM use than COSMiC V3.4 and the classic 0xBitcoin-miner
Aims for very few stale Shares at the Pool level and verifies GPU-found solutions on the CPU
Improved Network code for stability/lower CPU use
Multi-Threaded for enhanced performance on CPUs with >2 threads
CHANGES THIS VERSION: - Integrated Hardware Monitoring ('Watchqat') with safety features: individual GPUs will automatically pause if GPU temperature or Fan/Pump speed is outside the user-defined range
Now built against CUDA v10.0 (please update your nVidia graphics drivers! The CUDA toolkit is NOT required.)
Various internal improvements for stability, CUDA engine tweaks and optimizations (specifically to enhance performance on Pascal and newer architecture GPUs)
Increased hashrate observed on Maxwell Gen2 (GTX 9xx) and Pascal (GTX 10x0) architectures
Keyboard hotkeys (see Help menu or README.txt) for quick Intensity adjustment (more will be added.)
Network code improvements and bug-fixes
Optimizations to further reduce CPU usage
UI Reworking/Improvements (Feel free to let me know what you think of the design/UX!)
The past thirty days have seen a prolonged debate on an important matter: Ethereum's issuance. But be it an important matter or not, few of us seem to understand what's going on. To be fair, this issue has complexities that have brought indecision to even the best minds in this space. So it's not surprising that many of us have decided to gloss over the issue. However, as a community, we owe it to ourselves to understand at least the basics. In this post, I will break down the Ethereum issuance debate as simply as possible. This will be an easy read, and by the end of the post you will have a firm understanding of what is going on.
Ethereum Issuance & Inflation Rate
Ethereum is "inflationary". You hear it all the time. But many don't seem to understand how the inflation is caused. It's rather simple. Ethereum miners get rewarded for mining new blocks. These miners get rewarded/paid in Ether, but this isn't "existing" Ether; it is freshly minted/created. Essentially, miners are rewarded by issuing freshly minted Ether into the system, which "inflates" the existing supply on the market. Hence the term "inflation rate".
Ethereum Inflation Rate vs Issuance Rate
The Ethereum inflation rate and issuance rate are pretty much the same thing – for the most part. There’s a tiny ‘difference’ that is worth discussing. Let’s think about this for a second. There are two factors that will affect Ethereum’s inflation rate:
The speed at which fresh Ether is given out
The AMOUNT of Ether given out each time
Analogy: I can give you one piece of candy every minute; OR give you ten pieces of candy every ten minutes. Either way, over time I inflate your candy supply at the same rate. You’ll have 100 candy pieces in 100 minutes.
Ethereum Inflation Factors: Block Time & Block Reward
Speed of Ether Issuance: Currently, the speed at which Ether is issued is pretty stable. Ether is issued to miners as a reward each time a new block is created/validated. As things stand, the time taken to create a block is relatively stable at ~14 seconds. However, if Ethereum increases the difficulty of block creation, it will take longer to create each block. This is what people are referring to when they mention the "difficulty bomb". If it takes longer to create blocks, then less Ether will be rewarded over a given period. Analogy: I stop giving you 10 pieces of candy every 10 minutes, and instead give you 10 pieces of candy every 15 minutes. After 100 minutes, you'll have only 66 pieces of candy (instead of 100).
Ethereum Block Time vs Ethereum Block Reward
Amount of Ether Issuance: The amount of Ether issued with each reward is the next driving factor for Ethereum's inflation rate, and it is the most debated factor at the moment. Ethereum currently issues roughly 5.5 Ether per block (as rewards). If Ethereum decides to reduce the amount of Ether given out per reward, the inflation rate will drop regardless of the difficulty bomb. Analogy: I keep giving you candy every 10 minutes, but only 6 pieces each time instead of 10. You'll have only 60 pieces of candy after 100 minutes.
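Both analogies reduce to the same formula: the amount issued over a period is the reward per interval times the number of intervals elapsed. A quick sketch:

```python
# The two candy analogies above in one formula: issuance over a period is
# reward * (elapsed / interval). Stretching the interval (difficulty bomb)
# or shrinking the reward (issuance reduction) both lower the rate.

def issued(reward: float, interval: float, elapsed: float) -> float:
    return reward * elapsed / interval

print(issued(10, 10, 100))  # 100.0 -> baseline: 10 candies every 10 min
print(issued(10, 15, 100))  # ~66.7 -> longer interval ("difficulty bomb")
print(issued(6, 10, 100))   # 60.0  -> smaller reward (issuance reduction)
```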
The Problem: Reducing the Ether issued will cut into miner profits. But not reducing issuance will anger the rest of the community (more on why later).
What is the Debate About?
The current inflation rate is around 7.3% annually. The Ethereum community was promised somewhere around 2% - 4% with the release of Casper. (In fact, Vitalik once quoted ~0.5% as a feasible number – leading to even more expectations)
So the community has been patiently waiting for a reduction in Ethereum's inflation rate. This was supposed to happen with the release of Ethereum's Proof Of Stake: Casper. If you've been keeping up, you know about the delay of the Casper release. Casper was also supposed to come with a "difficulty bomb" that would increase the time it takes to find a block. This would decrease the Ethereum inflation rate.
However, since Casper has been delayed, the community wants the matter of issuance addressed right away. If Ethereum has to delay the difficulty bomb, then the other course of action is to reduce the amount of Ether issued per block. Many community members are advocating for a reduction in issuance that would bring the inflation rate to ~2%, aligning it with what it would have been had Proof Of Stake not been delayed.
However, the Ethereum miners don't like that idea, since it would cut directly into their profits. It's important to note that Ethereum still uses Proof Of Work, which consumes far more power per block than Proof Of Stake would. Many miners claim they would be forced off the network because the rewards would no longer cover their costs.
Why do we care about Miners?
Miners do more than just process/validate our transactions: each miner contributes to the security of the network via their hashpower. If overall hashpower drops, the network is easier to attack. (I touch on this in a YouTube video on the 1% Shard Attack.) Essentially, the more miners we have, the more security we have. If miners drop off the network, security will begin to drop, and we become more vulnerable to attacks. As you may be noticing, this issue does not have an easy solution. But we can get a better idea of which direction to take. First, let's quickly go over where we currently stand.
Ethereum Issuance: Blocks & Uncles
What is the Ethereum issuance currently? Roughly 5.5 Ether is issued per block. It's important to note that, unlike Bitcoin, the reward issuance is not straightforward. Here is a simplified breakdown of the rewards distributed:
Block Reward: 3 Ether
Uncle Rewards: ~2.4 Ether
Total Ether issued per block: ~5.4 Ether (issuance reduction will decrease this)
No. of blocks per day: ~6,000 blocks (the difficulty bomb will decrease this)
Current annual increase: ~7.3% (issuance reduction and/or the difficulty bomb will reduce this)
Uncle Rewards...What the..?
(If you already know what Uncle Rewards are, you can skip this section.) Unlike Bitcoin, Ethereum rewards miners that find blocks that don't make it into the longest chain. In Bitcoin these blocks are considered "stale" and are orphaned; in Ethereum they are called Uncles, and their miners are rewarded for their work. This is primarily because Ethereum has a much lower block time (the average time required to find a block). This small window could result in smaller miners unfairly losing out on potential rewards due to network latency etc. As such, miners are rewarded for their work. Of course, we cannot predict the exact number of Uncles, but the estimate is that around 2.4 Ether will be given out to Uncles on average. Uncle Rewards are important because they: a) incentivise decentralization (small miners are less likely to join pools), and b) increase the security of the chain (more on this in another post).
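For the curious, Ethereum's yellow paper specifies how an individual uncle reward is computed: an uncle included at distance d (1 to 6 generations back) earns (8 - d)/8 of the base block reward, and the including miner gets a small bonus per uncle. A small sketch:

```python
# Ethereum uncle reward formula (yellow paper): an uncle at distance d
# (1..6 blocks behind the including block) earns (8 - d)/8 of the base
# block reward; the including ("nephew") miner earns base_reward/32 extra
# per uncle it references.

def uncle_reward(base_reward: float, distance: int) -> float:
    if not 1 <= distance <= 6:
        raise ValueError("uncles must be 1-6 generations behind")
    return (8 - distance) * base_reward / 8

def nephew_bonus(base_reward: float, n_uncles: int) -> float:
    return n_uncles * base_reward / 32

print(uncle_reward(3.0, 1))   # 2.625 ETH for an uncle one block back
print(nephew_bonus(3.0, 2))   # 0.1875 ETH for including two uncles
```

This decay with distance is why the ~2.4 ETH figure above is only an estimate: it depends on how many uncles occur and how far back they are included.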
Cool.. So What is Being Proposed?
Alright, now the fun part. There are three proposals – Ethereum Improvement Proposals (EIPs) to be specific. Here’s a list and summary of each of them:
EIP-858: Reduce block reward to 1 ETH
This would be a significant decrease, from 3 ETH to 1 ETH. It would probably push several small/mid-size miners into negative profitability, and many may drop off the network. However, it would probably benefit miners with access to cheap electricity, since they would be able to accrue more of the rewards for themselves. Larger miners would probably benefit for the same reasons.
EIP-1294: Keep block reward at 3 ETH. Reduce Uncle Rewards to ~0.56 ETH.
This would be a significant reduction of the Uncle Rewards, from ~2.4 ETH to ~0.56 ETH. Ouch. It would affect small miners the most, since they rely on Uncle Rewards; furthermore, it reduces their incentive to remain independent and may push more small miners into pools. Larger miners, who aren't affected by network latency issues, are not hurt by this reduction; they benefit the most, since the block reward issuance is maintained.
EIP-1234 Reduce block reward to 2 ETH
This EIP seems to be receiving the most favour from the community, and it serves as a good middle ground. It will still eat into miners' profits, but not as significantly as the reduction EIP-858 proposes. EIP-1234 does have an air of "compromise": it offers developers enough time to develop and release Casper while keeping both sides of the community at bay. Is compromise the best way to go? -shrug-
So..What Has Been Decided?
A vote took place, lasting 30 days, where the community voted on their preference. Although the results leaned significantly toward one side, they are not binding; the vote was more to gauge community sentiment. Quite frankly, this issue is far too important to be decided by a vote like this. It requires serious consideration and research. There was a core-dev meeting last week where the matter was discussed, and the developers seem to be leaning towards EIP-1234, the reduction to 2 ETH. However, I'm not certain this has been confirmed. If so, EIP-1234 will be included in the upcoming hard fork scheduled for mid-October. I'll keep you guys posted, and I will be updating this post regularly as news develops.
Abstract: So far, the topic of merged mining has mainly been considered in a security context, covering issues such as mining power centralization or cross-chain attack scenarios. In this work we show that key information for determining blockchain metrics such as the fork rate can be recovered through data extracted from merge-mined cryptocurrencies. Specifically, we reconstruct a long-ranging view of forks and stale blocks in Bitcoin from its merge-mined child chains, and compare our results to previous findings that were derived from live measurements. Thereby, we show that live monitoring alone is not sufficient to capture a large majority of these events, as we are able to identify a non-negligible portion of stale blocks that were previously unaccounted for. Their authenticity is ensured by cryptographic evidence regarding both their position in the respective blockchain and the Proof-of-Work difficulty. Furthermore, by applying this new technique to Litecoin and its child cryptocurrencies, we are able to provide the first extensive view and lower bound on the stale block and fork rate in the Litecoin network. Finally, we outline that a recovery of other important metrics and blockchain characteristics through merged mining may also be possible.
To arms, Bitcoin community! Help us complete this mining installation for the Zürich MoneyMuseum. We are not asking for funds; only your expertise is needed! A $20 tip if you give us the relevant clue to solve or mitigate our main problem. Nice pictures of the exhibition inside as well…
Edit: A big thank you to all the people who helped us. We can now mine true PPS with diff1! The people in this thread who helped most have been awarded. I also want to mention the operator of btcmp.com, denis2342, and Luke-Jr. Actually looking at the miner screen in the Linux terminal helped a lot ;-). The pool kept switching us to stratum with variable difficulty; getwork with long polling seems to be the default after disabling stratum... We will probably post again when there is a video of the installation in action... Again, many thanks. Learned a lot. Edit: Thank you for all the answers so far! We will try different things now and report back. The tip bounty will be distributed as soon as we find out what finally does the trick; this could take a few days. The offered tip will be distributed, and very likely a few others as well. First of all, let me tell you that the Bitcoin exhibition at the Zürich MoneyMuseum is most likely the biggest and most diverse of its kind. Please read more about the museum and the exhibition below. Help us solve the following problem we are experiencing with our "Muscle Powered Proof of Work" installation: A friend and I have invested a lot of time building an installation for the museum. It is basically a 10 GHash/s miner and a Raspberry Pi powered by a hand generator (Maxon DC motor with planetary gear). Here are some pictures of the installation, although it is not entirely put together yet; some changes are still planned. https://www.dropbox.com/sh/0qcvl3wu4romhnt/AAAYF08lnVAy6W6KEepE7e2Ua?dl=0 Now let's get to the core of our problem: We are mining at the getwork diff1 pool btcmp.com, as it is a true PPS pool with getwork diff1. Visitors in the museum can power the generator for 2-3 minutes and directly see how many satoshis the "network" (actually the pool, but we don't want to confuse the visitors too much at that point) has given the museum for their work.
This all works well so far, but one problem remains: sometimes the pool does not get a share from us for more than 40 seconds, or in some cases more than 60. I have calculated that at 8.4 GHash/s we should find a share about every 0.5 seconds on average (diff1). I think when the pool gets a share it gets all the hashes, as it then accounts for several satoshis. Statistically we get per minute what we should get in theory. However, we would very much like to lower the time between shares accepted by the pool; this would make the overall experience much smoother for the visitors. Please look at this screenshot from MinePeon and answer some questions: https://www.dropbox.com/s/lb1jei4trc9kqe5/MinePeonScreenshot.png?dl=0 We see that we get a lot of diff1 hashes; however, only 11 shares/packages have been accepted. Is there a possibility to set the miner software so it submits to the pool as soon as a share is found? It seems to send them in packages, sometimes 4-5 seconds apart but sometimes as much as 80 seconds. I would like to submit packages of hashes much more often. How can this be influenced? What exactly are the Getworks (GW)? What exactly are the Accepted ones (Acc)? This is where the tip bounty is: help us get a better Acc/diff1 ratio; best would be 1:1. What exactly are the rejected ones (Rej)? The discarded ones (Disc)? The difficulty-one hashes (diff1)? Some of these questions seem very basic, but it is important for us to understand what these are and how we can influence them. We have a 1:1 correlation between Acc and the pool-side acknowledgement of shares/packages: whenever MinePeon shows one more for this value, the pool's "last submitted share" goes to "moments ago". Does the miner software have a setting for how many diff1 hashes are collected before a package is sent to the pool? If not, do you have another idea why so few are sent?
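As a sanity check on the 0.5-second figure above: at difficulty 1, one share is expected per 2^32 hashes, so the mean time between shares is simply 2^32 divided by the hashrate.

```python
# Expected mean time between shares at a given difficulty: a diff-D share
# is expected once per D * 2^32 hashes.

def mean_share_interval(hashrate_hs: float, difficulty: float = 1.0) -> float:
    return difficulty * 2**32 / hashrate_hs

print(mean_share_interval(8.4e9))  # ~0.51 s at 8.4 GH/s, matching the post
```

This is why a pool that bumps the miner to a higher variable difficulty makes accepted shares arrive far less often, which is exactly what the installation wants to avoid.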
Ideally we would have the diff1 hashes sent every 5 seconds or so, probably even more often. Is stratum with fixed diff1 possible? If so, would it be better to use stratum? Are there critical settings we should know of? (We have tried --request-diff and --no-submit-stale.) We are using BFGMiner on MinePeon, if that matters; we could switch to CGMiner if that would help. Any help is very much appreciated. The museum is doing a great job explaining Bitcoin basics. We put special focus on interactive learning and have several things to underline this. I hope to hear back from you so we can improve our installation. Please don't hesitate to ask if you have further questions; we are both not mining experts. Thanks for reading, and AMA. SimonBelmond Current features of the Bitcoin exhibition at the Zürich MoneyMuseum:
Live screen with various stats/charts/parameters/transactions…
Muscle-powered PoW: hand generator with 5 V and 3.5-5 A output, Raspberry Pi, MinePeon, 5x Antminer U2+, plus a screen to show the hash rate at the pool and/or in the MinePeon web interface. This screen will not be hand powered. This installation will complement their coining die (go to 1:27 to see what I mean).
The Bitcoin mining evolution (CPU, GPU, FPGA, ASIC)
A few short (2-3 minutes) interviews.
Other wallets, Trezor, PiperWallet
ATM Prototype, functional
PiperWallet to use.
Casascius and other physical Bitcoins, Wallets (also some commemorative coins), Paper wallet like one out of the first Bitcoin (A)TM ever
12 Picture tours
Bitcoin for beginners
Debunking 13 Bitcoin myths
What you definitely have to know
The history of Bitcoin
Bitcoin and traditional forms of money
Alternatives to Bitcoin
Citations about Bitcoin
How do I open an account?
How do I get Bitcoin?
Bitcoin community and economy
Bitcoin as a platform
I see this as a good opportunity for Bitcoin, so let's embrace it. I am especially excited to compare the traditional forms of money which used proof of work to the new money which also uses proof of work. I think in that context it will be much easier for the visitors to appreciate this concept. A lot of schools and other groups book guided tours at the museum. It is open every Friday from December 05 on. Entry is free of charge. Edit: Markdown, typos
Stale block rate: 6.8% vs 0.41%. Litecoin would require 28, and Dogecoin 47, block confirmations respectively to match the security of 6 Bitcoin confirmations. Matching block confirmations against a 30% adversary: 37 vs 6 (12.4 minutes vs 60 minutes). Impact of the block size on the median block propagation time (tMBP) in seconds, the stale block rate rs, vd and rrel, given the current Bitcoin block generation interval and an adversary with ...
Otherwise, they will be working on competing or stale chains (i.e., forks of the true longest chain) and not contributing to the network security. ... As such, Bitcoin/proof-of-work is not ...
The proof-of-work (PoW) consensus mechanism is the most widely deployed consensus mechanism in existing blockchains. PoW was introduced by Bitcoin and assumes that each peer votes with his "computing power" by solving proof-of-work instances and constructing the appropriate blocks. Bitcoin, for example, employs
We will cover how Blockchain and Bitcoin really work under the hood, what Proof of Work and Proof of Stake are, how forks work, and much much more. I usually teach this course in seminars and lectures to the highest management of industrial companies; however, I think it's important for everyone to understand these concepts and ideas.
Proof of Work vs Proof of Stake - Clearly Explained
In this Bitcoin protocol tutorial video, the notion of Proof of Work is discussed. The idea of proof of work, normally, is to protect against spam and DoS attacks. Cryptocurrencies use a ton of electricity because of mining. In recent years, people have started working on a different technique called Proof of Stake. Not only ... What is the difference between proof-of-work (PoW), proof-of-stake (PoS), and delegated proof-of-stake (DPoS)? This is part of a talk which took place on Jul... #blockchain #ProofOfWork #PoW If multiple governments collaborated, could they launch a 51% attack on Bitcoin? If all it takes to attack a proof-of-work (PoW) network is enough electricity, wouldn't you want the game theory of ...