This Researcher Says Bitcoin’s Elliptic Curve Could Have a

[new estimate] Quantum computers require at most 2330 qubits and 129 billion gates to crack Bitcoin's secp256k1 elliptic curve (we are at 17 qubits today).

submitted by blk0 to Bitcoin [link] [comments]

ECDSA In Bitcoin

Digital signatures are considered the foundation of online sovereignty. The advent of public-key cryptography in 1976 paved the way for the creation of a global communications tool – the Internet, and a completely new form of money – Bitcoin. Although the fundamental properties of public-key cryptography have not changed much since then, dozens of different open-source digital signature schemes are now available to cryptographers.

How ECDSA was incorporated into Bitcoin

When Satoshi Nakamoto, the mysterious founder of the first cryptocurrency, started working on Bitcoin, one of the key decisions was selecting a signature scheme for an open, public financial system. The requirements were clear: the algorithm had to be widely used, well understood, sufficiently secure, easy to work with and, most importantly, open source.
Of all the options available at the time, he chose the one that met these criteria: the Elliptic Curve Digital Signature Algorithm, or ECDSA.
At that time, native support for ECDSA was provided in OpenSSL, an open set of encryption tools developed by experienced cryptographers to improve the confidentiality of online communications. Compared to other popular schemes, ECDSA offered the same security with significantly smaller keys and cheaper computation, which are extremely useful features for digital money. For example, a 256-bit ECDSA key provides the same level of security as a 3072-bit RSA key (Rivest, Shamir and Adleman) at a fraction of the size.

Basic principles of ECDSA

ECDSA is a process that uses elliptic curves and finite fields to “sign” data in such a way that third parties can easily verify the authenticity of the signature, while only the signer can create signatures. In the case of Bitcoin, the “data” that is signed is a transaction that transfers ownership of bitcoins.
ECDSA has two separate procedures for signing and verifying. Each procedure is an algorithm consisting of several arithmetic operations. The signature algorithm uses the private key, and the verification algorithm uses only the public key.
To use ECDSA, a protocol such as Bitcoin must fix a set of parameters for the elliptic curve and its finite field, so that all users of the protocol know and apply the same parameters. Otherwise, everyone would be solving their own equations, the results would never converge, and users could never agree on anything.
For all these parameters, Bitcoin uses very, very large (well, awesomely incredibly huge) numbers. This matters: all practical applications of ECDSA use huge numbers, because the security of the algorithm relies on these values being far too large to find a key by simple brute force. A 384-bit ECDSA key is considered secure enough for the most secretive service of the US government, the NSA.
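To make the two procedures concrete, below is a compact, dependency-free Python 3 sketch of ECDSA key generation, signing, and verification over secp256k1 (Bitcoin's fixed parameters). It is a teaching toy under simplifying assumptions: the point at infinity and other edge cases are ignored, nonce handling is naive, and real applications should use a vetted library such as libsecp256k1.

    # Toy ECDSA over secp256k1 in pure Python 3.8+; for illustration only.
    import hashlib, secrets

    p = 2**256 - 2**32 - 977  # field prime
    n = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141  # group order
    G = (0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798,
         0x483ADA7726A3C4655DA4FBFC0E1108A8FD17B448A68554199C47D08FFB10D4B8)

    def inv(a, m):  # modular inverse (requires Python 3.8+)
        return pow(a, -1, m)

    def add(P, Q):  # affine point addition/doubling; no infinity handling (toy!)
        if P == Q:
            lam = 3 * P[0] * P[0] * inv(2 * P[1], p) % p
        else:
            lam = (Q[1] - P[1]) * inv(Q[0] - P[0], p) % p
        x = (lam * lam - P[0] - Q[0]) % p
        return (x, (lam * (P[0] - x) - P[1]) % p)

    def mul(k, P):  # double-and-add scalar multiplication
        R = None
        while k:
            if k & 1:
                R = P if R is None else add(R, P)
            P = add(P, P)
            k >>= 1
        return R

    def sign(d, msg):
        z = int.from_bytes(hashlib.sha256(msg).digest(), "big") % n
        while True:
            k = secrets.randbelow(n - 1) + 1  # nonce: must be unique per signature!
            r = mul(k, G)[0] % n
            s = inv(k, n) * (z + r * d) % n
            if r and s:
                return (r, s)

    def verify(Q, msg, sig):
        r, s = sig
        z = int.from_bytes(hashlib.sha256(msg).digest(), "big") % n
        u1, u2 = z * inv(s, n) % n, r * inv(s, n) % n
        return add(mul(u1, G), mul(u2, Q))[0] % n == r

    d = secrets.randbelow(n - 1) + 1  # private key: a huge random number
    Q = mul(d, G)                     # public key: a point on the curve
    sig = sign(d, b"a bitcoin transaction")
    print(verify(Q, b"a bitcoin transaction", sig))  # True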

Replacement of ECDSA

Thanks to the hard work done by Pieter Wuille (a renowned cryptography specialist) and his colleagues on libsecp256k1, a heavily optimized implementation for the elliptic curve secp256k1, Bitcoin's ECDSA has become even faster and more efficient. However, ECDSA still has some shortcomings, which may serve as sufficient grounds for its complete replacement. After several years of research and experimentation, a new signature scheme was settled on to increase the confidentiality and efficiency of Bitcoin transactions: the Schnorr signature scheme.
Schnorr signatures take the use of “keys” to a new level. A Schnorr signature occupies only 64 bytes in a block, which reduces the space taken up by transactions by about 4%. And since all Schnorr signatures are the same size, the total size of the part of a block that contains such signatures can be calculated in advance. Being able to pre-calculate block size is the key to increasing it safely in the future.
Keep up with the news of the crypto world at CoinJoy.io. Follow us on Twitter and Medium. Subscribe to our YouTube channel. Join our Telegram channel. For any inquiries mail us at [email protected].
submitted by CoinjoyAssistant to btc [link] [comments]

[ANN] RustCrypto: `p256` and `k256` v0.2.0: pure Rust NIST P-256 and secp256k1 curve arithmetic

Announcing the v0.2.0 releases of the following RustCrypto elliptic curve crates: p256 (pure Rust NIST P-256) and k256 (pure Rust secp256k1).
Both of these releases now implement curve/field arithmetic, namely the complete Weierstrass formulas, and initially target correctness over performance. They are suitable for environments that require small code sizes (e.g. embedded) and are designed from the ground up to work in no_std environments.
These are the first releases of these crates with an arithmetic feature. The code is brand new and has not been thoroughly reviewed, though we believe it is of high quality. Some of the field arithmetic implementations have been proptested against the ones in fiat-rust, and we will continue to investigate ways to ensure the implementations are correct.
All of that said, USE AT YOUR OWN RISK!
submitted by bascule to rust [link] [comments]

Implementing support for Gleec coin

I'm trying to make a custom Trezor implementation that supports Gleec coin (BTC clone).
I have done the following steps:
  1. Inside trezor-firmware, in "common/defs/bitcoin", I included the "gleecBtc.json" configuration:
After building the firmware.bin file, I found out via a hex editor that it doesn't contain the configuration. Then I included a new configuration inside "core/src/apps/common/coininfo.py":
    ....
    if not utils.BITCOIN_ONLY:
        if False:
            pass
        elif name == "GleecBTC":
            return CoinInfo(
                coin_name=name,
                coin_shortcut="GLEEC",
                decimals=8,
                address_type=35,
                address_type_p2sh=38,
                maxfee_kb=2000000,
                signed_message_header="GleecBTC Signed Message:\n",
                xpub_magic=0x0488b21e,
                xpub_magic_segwit_p2sh=0x049d7cb2,
                xpub_magic_segwit_native=0x04b24746,
                bech32_prefix="fg",
                cashaddr_prefix=None,
                slip44=460,
                segwit=True,
                fork_id=None,
                force_bip143=False,
                decred=False,
                negative_fee=False,
                curve_name='secp256k1',
                extra_data=False,
                timestamp=False,
                overwintered=False,
                confidential_assets=None,
            )
    ....
Finally the gleec configuration was inside the firmware.bin file. And I flashed it on the hardware.
  2. Inside trezor-connect, in "src/data/coins.json", I included:
    ....
    {
      "address_type": 35,
      "address_type_p2sh": 38,
      "bech32_prefix": "fg",
      "blockbook": ["https://blockbook.gleechain.com/"],
      "blocktime_seconds": 300,
      "cashaddr_prefix": null,
      "coin_label": "GleecBTC",
      "coin_name": "GleecBTC",
      "coin_shortcut": "GLEEC",
      "consensus_branch_id": null,
      "curve_name": "secp256k1",
      "decimals": 8,
      "decred": false,
      "default_fee_b": { "Economy": 70, "High": 200, "Low": 10, "Normal": 140 },
      "dust_limit": 546,
      "extra_data": false,
      "force_bip143": false,
      "fork_id": null,
      "hash_genesis_block": "000000000019d6689c085ae165831e934ff763ae46a2a6c172b3f1b60a8ce26f",
      "max_address_length": 34,
      "maxfee_kb": 2000000,
      "min_address_length": 27,
      "minfee_kb": 1000,
      "name": "GleecBTC",
      "segwit": true,
      "shortcut": "GLEEC",
      "signed_message_header": "GleecBTC Signed Message:\n",
      "slip44": 460,
      "support": {},
      "timestamp": false,
      "xprv_magic": 76066276,
      "xpub_magic": 76067358,
      "xpub_magic_segwit_native": 78792518,
      "xpub_magic_segwit_p2sh": 77429938
    },
    ....
  3. So after running:
await TrezorConnect.getAddress({ path: "m/44'/460'/0'/0/0", coin: 'GLEEC' }); 
I get the following error:
Invalid parameters: Coin not found. 
I don't even see communication or data reaching the Trezor device after the call.
Do I need to add the coin to another configuration file? Are configurations enough to support the coin, or do I need to implement some features as well?
submitted by Slav-0 to TREZOR [link] [comments]

Reddcoin (RDD) 02/20 Progress Report - Core Wallet v3.1 Evolution & PoSV v2 - Commits & More Commits to v3.1! (Bitcoin Core 0.10, MacOS Catalina, QT Enhanced Speed and Security and more!)

Reddcoin (RDD) Core Dev Team Informal Progress Report, Feb 2020 - As any blockchain or software expert will confirm, the hardest part of making successful progress in blockchain and crypto is invisible to most users. As developers, the Reddcoin Core team relies on internal experts like John Nash, contributors offering their own code improvements to our repos (which we would love to see more of!) and especially upstream commits from experts working on open source projects like Bitcoin itself. We'd like to thank each and every one whose hard work has contributed to this progress.
As part of Reddcoin's evolution, and in order to include required security fixes and long-overdue speed improvements, the team has so far incorporated the code commits listed below since our last v3.0.1 public release. In attempting to solve the relatively minor font display issue with MacOS Catalina, we uncovered a complicated interweaving of updates between Reddcoin Core, QT software, the MacOS SDK, Bitcoin Core and related libraries and dependencies. This mandated a holistic approach: solve the Catalina display problem and, in doing so, prepare a more streamlined overall build and test system that lets the team roll out more frequent and more secure updates in the future, while also including some badly needed fixes in the current version of Core, which we have tentatively labeled Reddcoin Core Wallet v3.1.
Note: As indicated below, v3.1 is NOT YET AVAILABLE FOR DOWNLOAD BY THE PUBLIC. We will advise when it is.
The new v3.1 version should be ready for internal QA and build testing by the end of this week, with luck, and will be turned over to the public shortly thereafter once testing has shown no unexpected issues have been introduced. We know the delay has been a bit extended for our ReddHead MacOS Catalina stakers, and we hope to have them all aboard soon. We have moved with all possible speed while attempting to incorporate all the required work and testing, and to ensure security and safety for our ReddHeads.
Which leads us to: PoSV v2 activation. The supermajority on Mainnet at the time of this writing has reached 5625/9000 blocks, or 62.5%. We have progressed quite well and without any reported user issues since release, but we need all of the community to participate! This activation, much like the funding mechanisms currently being debated by BCH and others, and employed by DASH, will be not only a catalyst for Reddcoin but will ensure its future by providing funding for the dev team. As a personal plea from the team, please help us support the PoSV v2 activation by staking your RDD, no matter how large or small your stake.
Every block and every RDD counts, and if you don't know how, we'll teach you! Live chat is fun as well as providing tech support you can trust from devs and community ReddHead members. Join us today in staking and online and collect some RDD "rain" from users and devs alike!
If you're holding Reddcoin and not staking, or you haven't upgraded your v2.x wallet to v3.0.1 (current release), we need you to help achieve consensus and activate PoSV v2! For details, see the pinned message here or our website or medium channel. Upgrade is simple and takes moments; if you're nervous or unsure, we're here to help live in Telegram or Discord, as well as other chat programs. See our website for links.
Look for more updates shortly as our long-anticipated Reddcoin Payment Gateway and Merchant Services API come online with point-of-sale support, as we announce the cross-crypto-project Aussie firefighter fundraiser program, as well as a comprehensive update to our development roadmap and more.
Work has restarted on ReddID and multiple initiatives are underway to begin educating and sharing information about ReddID, what it is, and how to use it, as we approach a releasable ReddID product. We enthusiastically encourage anyone interested in working to bring these efforts to life, whether writers, UX/UI experts, big data analysts, graphic artists, coders, front-end, back-end, AI, DevOps, the Reddcoin Core dev team is growing, and there's more opportunity and work than ever!
Bring your talents to a community and dev team that truly appreciates it, and share the Reddcoin Love!
And now, lots of commits. As v3.1 is not yet quite ready for public release, these commits have not been pushed publicly, but in the interests of sharing progress transparently, and including our ReddHead community in the process, see below for mind-numbing technical detail of work accomplished.
* e5c143404 - 2014-08-07 - Ross Nicoll - Changed LevelDB cursors to use scoped pointers to ensure destruction when going out of scope.
* 99a7dba2e - 2014-08-15 - Cory Fields - tests: fix test-runner for osx. Closes #4708
* 8c667f1be - 2014-08-15 - Cory Fields - build: add funcs.mk to the list of meta-depends
* bcc1b2b2f - 2014-08-15 - Cory Fields - depends: fix shasum on osx < 10.9
* 54dac77d1 - 2014-08-18 - Cory Fields - build: add option for reducing exports (v2)
* 6fb9611c0 - 2014-08-16 - randy-waterhouse - build: fix CPPFLAGS for libbitcoin_cli
* 9958cc923 - 2014-08-16 - randy-waterhouse - build: Add --with-utils (bitcoin-cli and bitcoin-tx, default=yes). Help string consistency tweaks. Target sanity check fix.
* 342aa98ea - 2014-08-07 - Cory Fields - build: fix automake warnings about the use of INCLUDES
* 46db8ad51 - 2020-02-18 - John Nash - build: add build.h to the correct target
* a24de1e4c - 2014-11-26 - Pavel Janík - Use complete path to include bitcoin-config.h.
* fd8f506e5 - 2014-08-04 - Wladimir J. van der Laan - qt: Demote ReportInvalidCertificate message to qDebug
* f12aaf3b1 - 2020-02-17 - John Nash - build: QT5 compiled with fPIC require fPIC to be enabled, fPIE is not enough
* 7a991b37e - 2014-08-12 - Wladimir J. van der Laan - build: check for sys/prctl.h in the proper way
* 2cfa63a48 - 2014-08-11 - Wladimir J. van der Laan - build: Add mention of --disable-wallet to bdb48 error messages
* 9aa580f04 - 2014-07-23 - Cory Fields - depends: add shared dependency builder
* 8853d4645 - 2014-08-08 - Philip Kaufmann - [Qt] move SubstituteFonts() above ToolTipToRichTextFilter
* 0c98e21db - 2014-08-02 - Ross Nicoll - URLs containing a / after the address no longer cause parsing errors.
* 7baa77731 - 2014-08-07 - ntrgn - Fixes ignored qt 4.8 codecs path on windows when configuring with --with-qt-libdir
* 2a3df4617 - 2014-08-06 - Cory Fields - qt: fix unicode character display on osx when building with 10.7 sdk
* 71a36303d - 2014-08-04 - Cory Fields - build: fix race in 'make deploy' for windows
* 077295498 - 2014-08-04 - Cory Fields - build: Fix 'make deploy' when binaries haven't been built yet
* ffdcc4d7d - 2014-08-04 - Cory Fields - build: hook up qt translations for static osx packaging
* 25a7e9c90 - 2014-08-04 - Cory Fields - build: add --with-qt-translationdir to configure for use with static qt
* 11cfcef37 - 2014-08-04 - Cory Fields - build: teach macdeploy the -translations-dir argument, for use with static qt
* 4c4ae35b1 - 2014-07-23 - Cory Fields - build: Find the proper xcb/pcre dependencies
* 942e77dd2 - 2014-08-06 - Cory Fields - build: silence mingw fpic warning spew
* e73e2b834 - 2014-06-27 - Huang Le - Use async name resolving to improve net thread responsiveness
* c88e76e8e - 2014-07-23 - Cory Fields - build: don't let libtool insert rpath into binaries
* 18e14e11c - 2014-08-05 - ntrgn - build: Fix windows configure when using --with-qt-libdir
* bb92d65c4 - 2014-07-31 - Cory Fields - test: don't let the port number exceed the legal range
* 62b95290a - 2014-06-18 - Cory Fields - test: redirect comparison tool output to stdout
* cefe447e9 - 2014-07-22 - Cory Fields - gitian: remove unneeded option after last commit
* 9347402ca - 2014-07-21 - Cory Fields - build: fix broken boost chrono check on some platforms
* c9ed039cf - 2014-06-03 - Cory Fields - build: fix whitespace in pkg-config variable
* 3bcc5ad37 - 2014-06-03 - Cory Fields - build: allow linux and osx to build against static qt5
* 01a44ba90 - 2014-07-17 - Cory Fields - build: silence false errors during make clean
* d1fbf7ba2 - 2014-07-08 - Cory Fields - build: fix win32 static linking after libtool merge
* 005ae2fa4 - 2014-07-08 - Cory Fields - build: re-add AM_LDFLAGS where it's overridden
* 37043076d - 2014-07-02 - Wladimir J. van der Laan - Fix the Qt5 build after d95ba75
* f3b4bbf40 - 2014-07-01 - Wladimir J. van der Laan - qt: Change serious messages from qDebug to qWarning
* f4706f753 - 2014-07-01 - Wladimir J. van der Laan - qt: Log messages with type>QtDebugMsg as non-debug
* 98e85fa1f - 2014-06-06 - Pieter Wuille - libsecp256k1 integration
* 5f1f2e226 - 2020-02-17 - John Nash - Merge branch 'switch_verification_code' into Build
* 1f30416c9 - 2014-02-07 - Pieter Wuille - Also switch the (unused) verification code to low-s instead of even-s.
* 1c093d55e - 2014-06-06 - Cory Fields - secp256k1: Add build-side changes for libsecp256k1
* 7f3114484 - 2014-06-06 - Cory Fields - secp256k1: add libtool as a dependency
* 2531f9299 - 2020-02-17 - John Nash - Move network-time related functions to timedata.cpp/h
* d003e4c57 - 2020-02-16 - John Nash - build: fix build weirdness after 54372482.
* 7035f5034 - 2020-02-16 - John Nash - Add ::OUTPUT_SIZE
* 2a864c4d8 - 2014-06-09 - Cory Fields - crypto: create a separate lib for crypto functions
* 03a4e4c70 - 2014-06-09 - Cory Fields - crypto: explicitly check for byte read/write functions
* a78462a2a - 2014-06-09 - Cory Fields - build: move bitcoin-config.h to its own directory
* a885721c4 - 2014-05-31 - Pieter Wuille - Extend and move all crypto tests to crypto_tests.cpp
* 5f308f528 - 2014-05-03 - Pieter Wuille - Move {Read,Write}{LE,BE}{32,64} to common.h and use builtins if possible
* 0161cc426 - 2014-05-01 - Pieter Wuille - Add built-in RIPEMD-160 implementation
* deefc27c0 - 2014-04-28 - Pieter Wuille - Move crypto implementations to src/crypto/
* d6a12182b - 2014-04-28 - Pieter Wuille - Add built-in SHA-1 implementation.
* c3c4f9f2e - 2014-04-27 - Pieter Wuille - Switch miner.cpp to use sha2 instead of OpenSSL.
* b6ed6def9 - 2014-04-28 - Pieter Wuille - Remove getwork() RPC call
* 0a09c1c60 - 2014-04-26 - Pieter Wuille - Switch script.cpp and hash.cpp to use sha2.cpp instead of OpenSSL.
* 8ed091692 - 2014-04-20 - Pieter Wuille - Add a built-in SHA256/SHA512 implementation.
* 0c4c99b3f - 2014-06-21 - Philip Kaufmann - small cleanup in src/compat .h and .cpp
* ab1369745 - 2014-06-13 - Cory Fields - sanity: hook up sanity checks
* f598c67e0 - 2014-06-13 - Cory Fields - sanity: add libc/stdlib sanity checks
* b241b3e13 - 2014-06-13 - Cory Fields - sanity: autoconf check for sys/select.h
* cad980a4f - 2019-07-03 - John Nash - build: Add a top-level forwarding target for src/ objects
* f4533ee1c - 2019-07-03 - John Nash - build: qt: split locale resources. Fixes non-deterministic distcheck
* 4a0e46e76 - 2019-06-29 - John Nash - build: fix version dependency
* 2f61699d9 - 2019-06-29 - John Nash - build: quit abusing AMCPPFLAGS
* 99b60ba49 - 2019-06-29 - John Nash - build: avoid the use of top and abs_ dir paths
* c8f673d5d - 2019-06-29 - John Nash - build: Tidy up file generation output
* 5318bce57 - 2019-06-29 - John Nash - build: nuke Makefile.include from orbit
* 672a25349 - 2019-06-29 - John Nash - build: add stub makefiles for easier subdir builds
* 562b7c5a6 - 2020-02-08 - John Nash - build: delete old Makefile.am's
* 066120079 - 2020-02-08 - John Nash - build: Switch to non-recursive make
Whew! No wonder it's taken the dev team a while! :)
TL;DR: Trying to fix MacOS Catalina font display led to requiring all kinds of work to migrate and evolve the Reddcoin Core software with Apple, Bitcoin and QT components. Lots of work done, v3.1 public release soon. Also other exciting things and ReddID back under active dev effort.
submitted by TechAdept to reddCoin [link] [comments]

Threshold Signature Explained— Bringing Exciting Applications with TSS

— A deep dive into threshold signature without mathematics by ARPA’s cryptographer Dr. Alex Su

Threshold signature is a distributed multi-party signature protocol that includes distributed key generation, signature, and verification algorithms.
In recent years, with the rapid development of blockchain technology, signature algorithms have gained widespread attention in both academic research and real-world applications. Properties such as security, practicality, scalability, and decentralization of signatures have been studied in depth.
Due to the fact that blockchain and signature are closely connected, the development of signature algorithms and the introduction of new signature paradigms will directly affect the characteristics and efficiency of blockchain networks.
In addition, the institutional and personal key management requirements stimulated by distributed ledgers have spawned many wallet applications, and this change has also affected traditional enterprises. Whether in blockchain networks or traditional financial institutions, threshold signature schemes can bring security and privacy improvements in a variety of scenarios. As an emerging technology, threshold signatures are still the subject of academic research and discussion, and unverified security risks and practical problems remain.
This article starts from the technical rationale and discusses cryptography and blockchain. We then compare multi-party computation with threshold signatures before weighing the pros and cons of different signature paradigms, and we end with a list of threshold signature use cases, so that the reader can quickly get up to speed on threshold signatures.
I. Cryptography in Daily Life
Before introducing threshold signatures, let's get a general understanding of cryptography. How does cryptography protect digital information? How is an identity created in the digital world? At the very beginning, people wanted secure storage and transmission. After one creates a key, one can use symmetric encryption to store secrets, and if two people share the same key, they can communicate securely: the king encrypts a command, and the general decrypts it with the corresponding key.
But when two people have no secure channel to use, how can they establish a shared key? For this, the key exchange protocol came into being. Analogously, if the king issues an order to all the people in the digital world, how can everyone prove that the order originated from the king? For that, the digital signature protocol was invented. Both protocols are based on public-key cryptography, also known as asymmetric cryptographic algorithms.


“Tiger Rune” is a troop deployment token used by ancient Chinese emperors, made of bronze or gold in the shape of a tiger and split in half: one half was given to the general, the other kept by the emperor. Only when the two tiger tallies were combined did the holder gain the right to dispatch troops.
Symmetric and asymmetric encryption constitute the main components of modern cryptography. Both have three fixed parts: key generation, encryption, and decryption. Here we focus on digital signature protocols. The key generation process produces a pair of associated keys: the public key and the private key. The public key is open to everyone, while the private key represents the identity and is revealed only to its owner; whoever owns the private key has the identity it represents. The encryption algorithm, or signature algorithm, takes the private key as input and generates a signature on a piece of information. The decryption algorithm, or signature verification algorithm, uses the public key to verify the validity of the signature and the correctness of the information.
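As a concrete sketch of those three parts, here is a minimal example using the third-party python-ecdsa package (an assumption of this illustration; any ECDSA library would do, and the message string is just an example):

    # Minimal keygen/sign/verify sketch with the python-ecdsa package
    # (pip install ecdsa). Illustrates the three fixed parts named above.
    from ecdsa import SigningKey, SECP256k1, BadSignatureError

    sk = SigningKey.generate(curve=SECP256k1)  # private key: represents identity
    vk = sk.get_verifying_key()                # public key: open to everyone

    signature = sk.sign(b"transfer 1 BTC to Alice")  # sign with the private key

    try:
        vk.verify(signature, b"transfer 1 BTC to Alice")  # verify with the public key
        print("signature valid")
    except BadSignatureError:
        print("signature invalid")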
II. Signature in the Blockchain
Looking back at blockchain: it uses a consensus algorithm to construct a distributed ledger, and signatures provide identity information for the blockchain. All transaction information on the blockchain is identified by the signature of the transaction initiator, and the blockchain can verify the signature according to specific rules to check a transaction's validity, all thanks to the immutability and verifiability of signatures.
For cryptography, the blockchain is more than a user of the signature protocol, or of the hash function inside Proof-of-Work consensus. Blockchain builds an infrastructure layer for consensus and transactions, and on top of it novel cryptographic protocols such as secure multi-party computation, zero-knowledge proofs, and homomorphic encryption thrive. For example, secure multi-party computation, which is naturally adapted to distributed networks, can build secure data transfer and machine learning platforms on the blockchain, while the special nature of zero-knowledge proofs makes verifiable anonymous transactions feasible. The combination of these cutting-edge cryptographic protocols and blockchain technology will drive the development of the digital world in the next decade, leading to secure data sharing, privacy protection, and applications not yet imaginable.
III. Secure Multi-party Computation and Threshold Signature
Having introduced how the digital signature protocol affects our lives and how it helps the blockchain establish identities and record transactions, we now turn to secure multi-party computation (MPC), from which we can see how threshold signatures achieve decentralization. For more about MPC, please refer to our previous posts, which detail the technical background and application scenarios.
MPC, by definition, is a computation that several participants execute jointly and securely. Security here means that in a single computation all participants provide their own private inputs and can obtain the result, but no party can learn any private information entered by the others. In 1982, when Prof. Yao proposed the concept of MPC, he gave an example called the “Millionaires' Problem”: two millionaires want to know who is richer without telling each other the true amount of their assets. Specifically, secure multi-party computation cares about the following properties:
  • Privacy: Any participant cannot obtain any private input of other participants, except for information that can be inferred from the computation results.
  • Correctness and verifiability: The computation should ensure correct execution, and the legitimacy and correctness of this process should be verifiable by participants or third parties.
  • Fairness or robustness: all parties involved in the computation should either obtain the results at the same time or not obtain them at all, unless agreed otherwise in advance.
Supposing we use secure multi-party computation to make a digital signature in a general sense, we will proceed as follows:
  • Key generation phase: all future participants are involved and jointly do two things: 1) each party generates its own secret private key share; 2) the joint public key is computed from the set of private key shares.
  • Signature phase: the participants in a given signature use their own private key shares as private inputs, and the message to be signed as a public input, to perform a joint signing operation and obtain a signature. In this process, the privacy of secure multi-party computation ensures the security of the key shares, while correctness and robustness guarantee the unforgeability of the signature and that every party obtains it.
  • Verification phase: the public key corresponding to the transaction is used to verify the signature exactly as in the traditional algorithm. There is no secret input during verification, meaning verification can be performed without multi-party computation, which becomes an advantage of MPC-style distributed signatures (see the toy sketch after this list).
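To tie the three phases together, here is a toy 3-party sketch in Python. It is Schnorr-style rather than the threshold-ECDSA constructions this article alludes to, because the Schnorr case is the simplest to show; it uses the third-party `ecdsa` package for curve arithmetic and deliberately omits the nonce-commitment rounds a real protocol (e.g. MuSig or GG18-style threshold ECDSA) needs to be secure.

    # Toy n-of-n Schnorr-style run of the keygen/sign/verify phases above,
    # using the `ecdsa` package (pip install ecdsa). A teaching aid only.
    import hashlib, secrets
    from ecdsa import SECP256k1

    G, n = SECP256k1.generator, SECP256k1.order

    def challenge(R, X, msg):  # Fiat-Shamir hash of nonce point, key point, message
        h = hashlib.sha256(str(R.x()).encode() + str(X.x()).encode() + msg)
        return int.from_bytes(h.digest(), "big") % n

    # Key generation: each party keeps a private share; no one sees the full key.
    shares = [secrets.randbelow(n - 1) + 1 for _ in range(3)]
    X = G * (sum(shares) % n)                  # joint public key

    # Signing: each party contributes a nonce share and a partial signature.
    msg = b"a transaction"
    nonces = [secrets.randbelow(n - 1) + 1 for _ in range(3)]
    R = G * (sum(nonces) % n)                  # aggregate nonce point
    e = challenge(R, X, msg)
    s = sum(k + e * x for k, x in zip(nonces, shares)) % n

    # Verification: an ordinary single-key Schnorr check; the verifier cannot
    # even tell that several parties produced the signature.
    assert G * s == R + X * e
    print("joint signature verifies against the single public key")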
The signature protocol constructed on the idea of secure multi-party computation is the threshold signature. It should be noted that we have omitted some details, because secure multi-party computation is really a collective name for a class of cryptographic protocols: different security assumptions and threshold settings yield different constructions. Threshold signatures under different settings therefore have distinctive properties; this article will not explain each setting, but a comparison with other signature schemes follows in the next section.
IV. Single Signature, Multi-Signature and Threshold Signature
Besides the threshold signature, what other methods can we choose?
In the beginning, Bitcoin used single signatures, allocating each account one private key; a message signed by this key is considered legitimate. Later, in order to avoid a single point of failure, or to introduce account management by multiple people, Bitcoin added a multi-signature function. Multi-signature can be simply understood as each account owner signing in turn and posting all signatures to the chain, where they are verified in order; when certain conditions are met, the transaction is legitimate. This achieves control by multiple private keys.
So, what’s the difference between multi-signature and threshold signature?
Several constraints of multi-signature are:
  1. The access structure is not flexible. Once an account's access structure is given, that is, which private keys can complete a legal signature, this structure cannot be adjusted at a later stage, for example when a participant withdraws or a new party joins. If you must change it, you need to repeat the initial setup process, which changes the public key and the account address as well.
  2. Lower efficiency. First, on-chain verification consumes the resources of all nodes and therefore requires a processing fee, and verifying a multi-signature is equivalent to verifying multiple single signatures. Second, performance: verification obviously takes more time.
  3. It requires smart contract support and algorithm adaptation that varies from chain to chain, because multi-sig is not natively supported everywhere. Due to possible vulnerabilities in smart contracts, this support is considered risky.
  4. No anonymity, which is not trivially a disadvantage or an advantage, because anonymity is only required in specific situations. Anonymity here means that multi-signature directly exposes all participating signers of a transaction.
Correspondingly, the threshold signature has the following features:
  1. The access structure is flexible. Through an additional multi-party computation, the existing set of key shares can be extended to assign shares to new participants. This process exposes neither the old nor the newly generated shares, and changes neither the public key nor the account address.
  2. It is more efficient. On chain, a signature generated by a threshold scheme is indistinguishable from a single signature, which means: a) verification is the same as for a single signature and needs no additional fee; b) the identities of the signers are invisible, because other nodes verify the signature against the same single public key; c) no smart contract on chain is needed to provide additional support.
In addition to the above, there is a distributed signature scheme supported by Shamir secret sharing. Secret sharing has a long history; it is used to split information for storage and to perform error correction, from the underlying algorithms of secure computation to the error-correcting codes on discs. This technology has always played an important role, but the main problem is that, when used in a signature protocol, Shamir secret sharing must reconstruct the master private key.
With multi-signatures or threshold signatures, by contrast, the master private key is never reconstructed, not even in memory or a cache. For vital accounts, even such short-term reconstruction is not tolerable.
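A minimal Shamir sketch (3-of-5, in Python, over the secp256k1 group order) makes the drawback visible: reconstruct() assembles the whole secret in one place.

    # Toy 3-of-5 Shamir secret sharing; for illustration only (Python 3.8+).
    import secrets

    q = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141  # prime modulus

    def split(secret, t, m):  # evaluate a random degree-(t-1) polynomial at x = 1..m
        coeffs = [secret] + [secrets.randbelow(q) for _ in range(t - 1)]
        return [(x, sum(c * pow(x, i, q) for i, c in enumerate(coeffs)) % q)
                for x in range(1, m + 1)]

    def reconstruct(shares):  # Lagrange interpolation at x = 0
        secret = 0
        for x_j, y_j in shares:
            num = den = 1
            for x_k, _ in shares:
                if x_k != x_j:
                    num = num * (-x_k) % q
                    den = den * (x_j - x_k) % q
            secret = (secret + y_j * num * pow(den, -1, q)) % q
        return secret  # the full key exists here, in one place

    key = secrets.randbelow(q)
    shares = split(key, 3, 5)
    assert reconstruct(shares[:3]) == key  # any 3 of the 5 shares recover the key
    print("reconstructed OK")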
V. Limitations
Just like other secure multi-party computation protocols, the introduction of additional participants makes the security model different from traditional point-to-point encrypted transmission. Collusion and malicious participants were not taken into account in earlier algorithms; the behavior of physical entities cannot be restricted, and wrongdoers may appear inside the participating group.
Therefore, multi-party cryptographic protocols cannot obtain the same security strength as before. Effort is still needed to develop threshold signature applications, integrate them with existing infrastructure, and test the true strength of threshold signature schemes.
VI. Scenarios
1. Key Management
The use of threshold signatures in a key management system allows more flexible administration, such as ARPA's enterprise key management API. One can use the access structure to design authorization patterns for users with different privileges. In addition, when new entities join, threshold signatures can quickly refresh the key shares; this operation can also be performed periodically, to raise the difficulty of hacking multiple key shares at the same time. Finally, for the verifier a threshold signature is no different from a traditional signature, so it is compatible with legacy equipment and reduces upgrade costs. ARPA's enterprise key management modules already support signatures over the secp256k1 and ed25519 curve parameters, and will be compatible with more parameters in the future.

2. Crypto Wallet
Wallets based on threshold signatures are more secure because the private key never needs to be rebuilt. Also, since not all signatures are posted publicly, anonymity can be achieved. Compared with multi-signature, threshold signatures require lower transaction fees. As with key management applications, the administration of digital asset accounts can be more flexible. Furthermore, a threshold signature wallet can support various blockchains that do not natively support multi-signature, which reduces the risk of smart contract bugs.

Conclusion

This article describes why people need the threshold signature and what inspiring properties it brings. One can see that threshold signatures offer higher security, more flexible control, and a more efficient verification process. In fact, different signature technologies suit different application scenarios, such as aggregate signatures (not covered in this article) and BLS-based multi-signatures. At the same time, readers are also welcome to read more about secure multi-party computation. Secure computation is the holy grail of cryptographic protocols; it can accomplish much more than threshold signatures alone. In the near future, secure computation will solve more concrete application problems in the digital world.

About Author

Dr. Alex Su works for ARPA as a cryptography researcher. He received his Bachelor's degree in Electronic Engineering and his Ph.D. in Cryptography from Tsinghua University. Dr. Su's research interests include multi-party computation as well as post-quantum cryptography implementation and acceleration.

About ARPA

ARPA is committed to providing secure data transfer solutions based on cryptographic operations for businesses and individuals.
The ARPA secure multi-party computing network can be used as a protocol layer to implement privacy computing capabilities for public chains, and it enables developers to build efficient, secure, and data-protected business applications on private smart contracts. Enterprise and personal data can, therefore, be analyzed securely on the ARPA computing network without fear of exposing the data to any third party.
ARPA's multi-party computation technology supports secure data markets, precision marketing, credit score calculations, and even the secure monetization of personal data.
ARPA's core team is international, with PhDs in cryptography from Tsinghua University, experienced systems engineers from Google, Uber, Amazon, Huawei and Mitsubishi, and blockchain experts from the University of Tokyo, AIG, and the World Bank. We have also hired data scientists from CircleUp, as well as financial and data professionals from Fosun and Fidelity Investments.
For more information about ARPA, or to join our team, please contact us at [email protected].
Learn about ARPA’s recent official news:
Telegram (English): https://t.me/arpa_community
Telegram (Việt Nam): https://t.me/ARPAVietnam
Telegram (Russian): https://t.me/arpa_community_ru
Telegram (Indonesian): https://t.me/Arpa_Indonesia
Telegram (Thai): https://t.me/Arpa_Thai
Telegram (Philippines):https://t.me/ARPA_Philippines
Telegram (Turkish): https://t.me/Arpa_Turkey
Korean Chats: https://open.kakao.com/o/giExbhmb (Kakao) & https://t.me/arpakoreanofficial (Telegram, new)
Medium: https://medium.com/@arpa
Twitter: u/arpaofficial
Reddit: https://www.reddit.com/arpachain/
Facebook: https://www.facebook.com/ARPA-317434982266680/54
submitted by arpaofficial to u/arpaofficial [link] [comments]

Help me code it!

Hi everyone, I am learning Python and it's quite hard for me. I want to calculate a public key from a private key with ECC. I took the code from GitHub, converted it to Python 3, and it does not work:
    # Super simple Elliptic Curve Presentation. No imported libraries, wrappers, nothing.
    # For educational purposes only. Originally written for Python 2; the fixes
    # below (integer division in modinv, hex formatting) make it run on Python 3.
    # Below are the public specs for Bitcoin's curve - the secp256k1

    Pcurve = 2**256 - 2**32 - 2**9 - 2**8 - 2**7 - 2**6 - 2**4 - 1  # The proven prime
    N = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141  # Number of points in the field
    Acurve = 0; Bcurve = 7  # These two define the elliptic curve. y^2 = x^3 + Acurve * x + Bcurve
    Gx = 55066263022277343669578718895168534326250603453777594175500187360389116729240
    Gy = 32670510020758816978083085130507043184471273380659243275938904335757337482424
    GPoint = (Gx, Gy)  # This is our generator point. Trillions of dif ones possible

    # Individual Transaction/Personal Information
    privKey = 0xA0DC65FFCA799873CBEA0AC274015B9526505DAAAED385155425F7337704883E  # replace with any private key

    def modinv(a, n=Pcurve):  # Extended Euclidean Algorithm/'division' in elliptic curves
        lm, hm = 1, 0
        low, high = a % n, n
        while low > 1:
            ratio = high // low  # Python 3 fix: integer division ("/" returns a float here)
            nm, new = hm - lm * ratio, high - low * ratio
            lm, low, hm, high = nm, new, lm, low
        return lm % n

    def ECadd(a, b):  # Not true addition, invented for EC. Could have been called anything.
        LamAdd = ((b[1] - a[1]) * modinv(b[0] - a[0], Pcurve)) % Pcurve
        x = (LamAdd * LamAdd - a[0] - b[0]) % Pcurve
        y = (LamAdd * (a[0] - x) - a[1]) % Pcurve
        return (x, y)

    def ECdouble(a):  # This is called point doubling, also invented for EC.
        Lam = ((3 * a[0] * a[0] + Acurve) * modinv((2 * a[1]), Pcurve)) % Pcurve
        x = (Lam * Lam - 2 * a[0]) % Pcurve
        y = (Lam * (a[0] - x) - a[1]) % Pcurve
        return (x, y)

    def EccMultiply(GenPoint, ScalarHex):  # Double & add. Not true multiplication
        if ScalarHex == 0 or ScalarHex >= N:
            raise Exception("Invalid Scalar/Private Key")
        ScalarBin = str(bin(ScalarHex))[2:]
        Q = GenPoint
        for i in range(1, len(ScalarBin)):  # This is invented EC multiplication.
            Q = ECdouble(Q)  # add print("DUB", Q[0]) here to trace each doubling
            if ScalarBin[i] == "1":
                Q = ECadd(Q, GenPoint)  # add print("ADD", Q[0]) here to trace each addition
        return Q

    PublicKey = EccMultiply(GPoint, privKey)
    print(); print("******* Public Key Generation *********"); print()
    print("the private key:"); print(hex(privKey)); print()
    print("the uncompressed public key (not address):"); print(PublicKey); print()
    print("the uncompressed public key (HEX):")
    print("04" + "%064x" % PublicKey[0] + "%064x" % PublicKey[1]); print()
    print("the official Public Key - compressed:")
    if PublicKey[1] % 2 == 1:  # If the Y value for the Public Key is odd.
        print("03" + hex(PublicKey[0])[2:].zfill(64))  # Python 3 fix: no trailing 'L' to strip
    else:  # Or else, if the Y value is even.
        print("02" + hex(PublicKey[0])[2:].zfill(64))
submitted by Phuc_Jackson to Bitcoin [link] [comments]

Technical: Upcoming Improvements to Lightning Network

Price? Who gives a shit about price when Lightning Network development is a lot more interesting?????
One thing about LN is that because there's no need for consensus before implementing things, figuring out the status of things is quite a bit more difficult than on Bitcoin. On one hand it lets larger groups of people work on improving LN faster without having to coordinate so much. On the other hand it leads to some fragmentation of the LN space, with compatibility problems occasionally coming up.
The below is just a smattering sample of LN stuff I personally find interesting. There's a bunch of other stuff, like splice and dual-funding, that I won't cover --- post is long enough as-is, and besides, some of the below aren't as well-known.
Anyway.....

"eltoo" Decker-Russell-Osuntokun

Yeah the exciting new Lightning Network channel update protocol!

Advantages

Myths

Disadvantages

Multipart payments / AMP

Splitting up large payments into smaller parts!

Details

Advantages

Disadvantages

Payment points / scalars

Using the magic of elliptic curve homomorphism for fun and Lightning Network profits!
Basically, currently on Lightning an invoice has a payment hash, and the receiver reveals a payment preimage which, when fed into SHA256, returns the given payment hash.
Instead of using payment hashes and preimages, just replace them with payment points and scalars. An invoice will now contain a payment point, and the receiver reveals a payment scalar (private key) which, when multiplied with the standard generator point G on secp256k1, returns the given payment point.
This is basically Scriptless Script usage on Lightning, instead of HTLCs we have Scriptless Script Pointlocked Timelocked Contracts (PTLCs).
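A small sketch of the idea, using the third-party `ecdsa` package for secp256k1 arithmetic (the variable names are illustrative, not from any Lightning implementation):

    # Payment point / scalar sketch: the invoice commits to P = s*G, and
    # revealing the scalar s is the proof of payment (pip install ecdsa).
    import secrets
    from ecdsa import SECP256k1

    G, n = SECP256k1.generator, SECP256k1.order

    payment_scalar = secrets.randbelow(n - 1) + 1  # receiver's secret ("preimage")
    payment_point = G * payment_scalar             # goes into the invoice

    # ... payment routed; receiver reveals payment_scalar to claim the funds ...

    claimed = payment_scalar
    assert G * claimed == payment_point            # anyone can check the proof of payment
    print("scalar matches the invoice's payment point")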

Advantages

Disadvantages

Pay-for-data

Ensuring that payers cannot access data or other digital goods without proof of having paid the provider.
In a nutshell: the payment preimage used as a proof-of-payment is the decryption key of the data. The provider gives the encrypted data, and issues an invoice. The buyer of the data then has to pay over Lightning in order to learn the decryption key, with the decryption key being the payment preimage.
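A toy sketch of that flow (the XOR keystream below is only to keep the example dependency-free; a real implementation would use an authenticated cipher such as AES-GCM or ChaCha20-Poly1305):

    # Toy pay-for-data: the decryption key doubles as the payment preimage.
    import hashlib, secrets

    def keystream_xor(key, data):  # toy SHA256-counter stream cipher, NOT secure
        out, counter = b"", 0
        while len(out) < len(data):
            out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
            counter += 1
        return bytes(a ^ b for a, b in zip(data, out))

    preimage = secrets.token_bytes(32)                 # decryption key = payment preimage
    payment_hash = hashlib.sha256(preimage).digest()   # goes into the invoice
    ciphertext = keystream_xor(preimage, b"the digital goods")

    # Buyer pays the invoice; settling the HTLC reveals `preimage` to the buyer.
    assert hashlib.sha256(preimage).digest() == payment_hash
    print(keystream_xor(preimage, ciphertext))         # b'the digital goods'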

Advantages

Disadvantages

Stuckless payments

No more payments getting stuck somewhere in the Lightning network without knowing whether the payee will ever get paid!
(that's actually a bit of an overclaim; payments can still get stuck, but what "stuckless" really enables is that we can safely run other parallel payment attempts until any one of the attempts gets through).
Basically, by using the ability to add points together, the payer can enforce that the payee can only claim the funds if it knows two pieces of information:
  1. The payment scalar corresponding to the payment point in the invoice signed by the payee.
  2. An "acknowledgment" scalar provided by the payer to the payee via another communication path.
This allows the payer to make multiple payment attempts in parallel, unlike the current situation where we must wait for an attempt to fail before trying another route. The payer only needs to ensure it generates different acknowledgment scalars for each payment attempt.
Then, if at least one of the payment attempts reaches the payee, the payee can then acquire the acknowledgment scalar from the payer. Then the payee can acquire the payment. If the payee attempts to acquire multiple acknowledgment scalars for the same payment, the payer just gives out one and then tells the payee "LOL don't try to scam me", so the payee can only acquire a single acknowledgment scalar, meaning it can only claim a payment once; it can't claim multiple parallel payments.
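A sketch of the arithmetic, again with the third-party `ecdsa` package and illustrative names: each attempt is locked to P + Z_i, so the payee needs both its own invoice scalar and the payer's per-attempt acknowledgment scalar to claim.

    # Stuckless sketch: claiming attempt i requires s + z_i (pip install ecdsa).
    import secrets
    from ecdsa import SECP256k1

    G, n = SECP256k1.generator, SECP256k1.order

    s = secrets.randbelow(n - 1) + 1     # payee's invoice scalar
    P = G * s                            # payment point in the invoice

    z1 = secrets.randbelow(n - 1) + 1    # payer's ack scalar for attempt #1
    lock_point = P + G * z1              # what attempt #1 is locked to

    # Attempt #1 reaches the payee; the payer hands over z1; the payee can claim:
    claim_scalar = (s + z1) % n
    assert G * claim_scalar == lock_point
    print("attempt #1 claimable; attempts with other z_i stay unclaimable")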

Advantages

Disadvantages

Non-custodial escrow over Lightning

The "acknowledgment" scalar used in stuckless can be reused here.
The acknowledgment scalar is derived as an ECDH shared secret between the payer and the escrow service. On arrival of payment to the payee, the payee queries the escrow to determine if the acknowledgment point is from a scalar that the escrow can derive using ECDH with the payer, plus a hash of the contract terms of the trade (for example, to transfer some goods in exchange for Lightning payment). Once the payee gets confirmation from the escrow that the acknowledgment scalar is known by the escrow, the payee performs the trade, then asks the payer to provide the acknowledgment scalar once the trade completes.
If the payer refuses to give the acknowledgment scalar even though the payee has given over the goods to be traded, then the payee contacts the escrow again, reveals the contract terms text, and requests to be paid. If the escrow finds in favor of the payee (i.e. it determines the goods have arrived at the payer as per the contract text) then it gives the acknowledgment scalar to the payee.
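A sketch of how the payer and the escrow can independently derive the same acknowledgment scalar via ECDH plus the contract hash (illustrative names; the third-party `ecdsa` package is assumed):

    # Escrow sketch: ack scalar = hash(ECDH(payer, escrow) || contract terms).
    import hashlib, secrets
    from ecdsa import SECP256k1

    G, n = SECP256k1.generator, SECP256k1.order

    payer_sk = secrets.randbelow(n - 1) + 1;  payer_pk = G * payer_sk
    escrow_sk = secrets.randbelow(n - 1) + 1; escrow_pk = G * escrow_sk
    contract = hashlib.sha256(b"goods X in exchange for Lightning payment").digest()

    def ack_scalar(shared_point, contract_hash):
        h = hashlib.sha256(str(shared_point.x()).encode() + contract_hash)
        return int.from_bytes(h.digest(), "big") % n

    # Both sides compute the same ECDH point, hence the same acknowledgment scalar.
    z_payer = ack_scalar(escrow_pk * payer_sk, contract)
    z_escrow = ack_scalar(payer_pk * escrow_sk, contract)
    assert z_payer == z_escrow
    print("escrow can reproduce the acknowledgment scalar on dispute")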

Advantages

Disadvantages

Payment decorrelation

Because elliptic curve points can be added (unlike hashes), we can add a "blinding" point / scalar at every forwarding node. This prevents multiple forwarding nodes from discovering that they have been on the same payment route, unlike the current payment hash + preimage, where the same hash is used along the entire route.
In fact, the acknowledgment scalar we use in stuckless and escrow can simply be the sum of each blinding scalar used at each forwarding node.
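A sketch of per-hop blinding (illustrative names; the third-party `ecdsa` package is assumed): each hop's lock point is offset by a fresh scalar, so no two hops see the same point.

    # Decorrelation sketch: hop i sees P + (b_1+...+b_i)*G (pip install ecdsa).
    import secrets
    from ecdsa import SECP256k1

    G, n = SECP256k1.generator, SECP256k1.order

    s = secrets.randbelow(n - 1) + 1                   # payee's secret, P = s*G
    P = G * s
    blinds = [secrets.randbelow(n - 1) + 1 for _ in range(3)]  # one scalar per hop

    locks, acc = [], 0
    for b in blinds:                                   # lock point seen at each hop
        acc = (acc + b) % n
        locks.append(P + G * acc)

    # The payee claims the final lock with s plus the sum of all blinding
    # scalars (which can double as the "acknowledgment" scalar above).
    assert G * ((s + acc) % n) == locks[-1]
    assert len({(q.x(), q.y()) for q in locks}) == len(locks)  # all locks distinct
    print("each hop sees a different, unlinkable lock point")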

Advantages

Disadvantages

submitted by almkglor to Bitcoin [link] [comments]

A reminder of who Craig Wright is and the benefits to BCH now he has gone.

This needs to be repeated every so often on this subreddit so new people can understand the history of the fork of Bitcoin Cash into BCH and BSV.
From Jonald Fyookball's article
https://medium.com/@jonaldfyookball/bitcoin-cash-is-finally-free-of-faketoshi-great-days-lie-ahead-bb0c833e4c5d
Craig S. Wright (CSW) leaving the Bitcoin Cash community is a wonderful thing. This self-described “tyrant” has been expunged, and now we can get back to our mission of bringing peer-to-peer electronic cash to the world.
The markets will rebound when they see the chaos is over, but regardless of the price, we will keep building. Nothing will stop the sound money movement. Calling Out Bad Behavior
As Rick Falkvinge recently explained, there is a difference between small-minded gossiping about personalities and legitimately calling out bad behavior.
CSW’s bad behavior must be called out, because he has done tremendous damage to Bitcoin Cash (and possibly even the entire cryptocurrency sector).
The brief history is that he gained his reputation by claiming to be Bitcoin’s creator (Satoshi Nakamoto). He said he would provide “extraordinary proof” but he has never done so.
Supposedly, he did some “private signings” to a few people, and this allowed him to gain influence in the BCH community. The destruction he has been causing was not widely recognized until after a huge mess had been made.
Thanks to u/Contrarian__ for the following compilation of CSW's misdeeds:
Some background on Craig’s claim of being Satoshi, for the uninitiated:
He faked blog posts
He faked PGP keys
He faked contracts and emails
He faked threats
He faked a public key signing
He has a well-documented history of fabricating things bitcoin and non-bitcoin related
He faked a bitcoin trust to get free money from the Australian government but was caught and fined over a million dollars.
And specifically concerning his claim to be Satoshi:
He has provided no independently verifiable evidence
He is not technically competent in the subject matter
His writing style is nothing like Satoshi's
He called bitcoin "Bit Coin" in 2011 when Satoshi never used a space
He actively bought and traded coins from Mt. Gox in 2013 and 2014
He was paid millions for 'coming out' as Satoshi as part of the deal to sell his patents to nTrust — for those who claim he was 'outed' or had no motive
Caught Red Handed Plagiarizing
No respectable academic, scientist, or professional needs to stoop so low as to steal and take credit for the work of others — least of all Satoshi. Yet, CSW has already been caught at least 3 times plagiarizing.
His paper on selfish mining has full sections copied almost verbatim from a paper written by Liu & Wang.
His "Beyond Godel" paper, which purports to show that Bitcoin script is Turing complete, is heavily plagiarized.
A paper on block propagation was blatantly and intentionally plagiarized.
Can’t Even Steal Code Correctly
CSW was also caught attempting to plagiarize a “hello world” program (the simplest of all computer programs).
He apparently does not understand base58 or how Bitcoin address checksums work (both of these are common knowledge to experienced Bitcoiners), and has made other embarrassing errors. So How Did Such an Obvious Fraud Gain So Much Power and Influence?
There are no easy answers here. It seems that as humans, we are very susceptible to manipulation and misinformation. The greatest weapon against sinister forces is a well-educated populace. This is something that can only improve over the long run.
The “Satoshi factor” is a powerful one and appeals to the glamorization of a mythical figure. Even people such as myself, who are technically astute, gave CSW all benefit of the doubt until the evidence staring us in the face could no longer be denied.
The seduction of the BCH community was also facilitated by CSW becoming a strong advocate for the on-chain/big-block scaling movement at a time when the community was dying to hear it. This message, delivered with a brazen, in-your-face style, was a sharp contrast to anything seen before.
In addition, CSW was able to find obscure topics (“2pda”), network topology, etc, that seemed to establish him as an expert with esoteric knowledge above and beyond anyone else. Basically, he was using technobabble, but it wasn’t immediately obvious except to very technical people… who were then attacked and discredited.
Eventually, as more and more of the community began to realize his technical claims were bogus, CSW banned those people from his twitter feed and slack channel, leaving only a group of untechnical “believers”, which the larger BCH community referred to as “the church” AKA the Cult-of-Craig.
Finally, if some believed that CSW possesed Satoshis’s stash of 1M BTC, then they may have been gnawing to get a piece of it. But it may turn out that these are the coins that never were. Broken Promises
If this article so far seems like an “attack piece” on CSW, remember it is important to get all the facts out in the open. We’ll get to the silver lining and bright future in a moment… but let’s continue here to “get it all out”.
One of the biggest ways that CSW has damaged the community is to make an endless series of broken promises. This caused others to wait, to waste time on his unproven ideas and solutions, and to postpone or drop their own ideas and initiatives.
He said he was building a mining pool to "stop SegWit"
He said he was bringing big companies to use the BCH chain
He said that he was providing a fungibility solution based on blind threshold signatures
He said he was providing novel technology based on oblivious transfers
He said he was providing a method where people could do atomic swaps without using timelocks
He said he was going to show everyone how we can do bilinear pairings using secp256k1
He said he was going to release source code for nakasendo
He said he was releasing some information that would "kill the lightning network"
He said he was going to show everyone how the selfish mining theory is wrong
He said he was going to show everyone how we can tokenize everything in the universe squared
He said a few times "big things are coming in 2 months"
How CSW Has Damaged the BCH Community
In addition to the broken promises, the BCH community was wounded due to:
The division of the community (with classic divide and conquer tactics)
Loss of focus. Huge amounts of drama and distraction from building and adoption
Investor confidence has been shaken due to uncertainty and chaos.
BCH is a laughing stock to outsiders due to CSW's antics
Gemini deployment of BCH and other rollouts paused
Loss of developer talent due to toxic and abrasive personality
Various patent and legal threats
The Hash War Event and Split into BitcoinSV
Every 6 months, BCH has a scheduled network upgrade. This is technically a “hard fork” but a non-contentious fork does not result in a split of the chain — it is simply new network rules being activated.
Bitcoin Cash has multiple independent developer groups including Bitcoin ABC, Bitcoin Unlimited, Bitcoin XT, Bitprim, BCHD, bcash, parity, Flowee, and others.
The nChain group, led by CSW, introduced an alternate set of changes a week before the agreed cut-off date, intentionally causing a huge controversy. These changes were incompatible with the changes being discussed between the other groups.
nChain objected to the changes being proposed (canonical transaction ordering) despite specifically agreeing to it almost a year earlier. The last-minute objections were, in my opinion, an attempt at sabotage.
An emergency meeting was held in Bangkok to attempt to resolve the differences between the nChain group and the rest of the community. Not only did CSW refuse to listen to the other presentations, he walked out of the meeting after his own speech had been given. The other nChain people refused to discuss the technical issues.
After this, nChain built their own software (“BitcoinSV”) to attempt to compete for the Bitcoin Cash network. But rather than split off to follow their own set of rules, they threatened to attack Bitcoin Cash.
Their attitude was “you follow our rules or we burn it all down”.
The CSW sycophants adopted a strange interpretation of the Bitcoin whitepaper and proselytized the idea that if nChain could “out hash” everyone else, the market should be obliged to follow them.
This faulty thinking was eloquently debunked by u/CatatonicAdenosine. As it turns out, nChain was unable in any case to win at their own game.
But Here’s the Obviously Good News…
CSW is gone. It’s over.
He can do whatever he wants on the BitcoinSV chain. He will never be allowed to influence Bitcoin Cash again. And all the negative things and negative people that were a consequence of his involvement in Bitcoin Cash are gone with him.
As a community, we will redouble our efforts and get back to our mission of peer-to-peer electronic cash. We will learn to work together better than ever, and we will learn to detect and punish bad behavior sooner.
The attempted attacks with hashpower also sparked innovation and a focus on the problem of how to stop such attacks in the future. This is only making Bitcoin Cash (BCH) and the entire class of Proof-of-Work coins stronger.
Nothing will stop us.
The reason why millions of dollars were spent to attack and also to defend Bitcoin Cash is because it’s something truly worth fighting over.
It’s sound money.
It’s permissionless.
It’s what Satoshi Nakamoto wrote about in 2008. It’s Bitcoin, a Peer-to-Peer Electronic Cash System.
submitted by stewbits22 to btc [link] [comments]

How to apply sensible submodule namespaces?

I've written a library Secp256k1 which contains several modules, https://github.com/q9f/secp256k1.cr
When someone imports my library, i.e., require secp256k1, they also get access to the modules not named Secp256k1, for example:
```crystal
# import secp256k1
require "secp256k1"

# everything starts with a random number
private_key = Secp256k1.new_private_key

# private keys in bitcoin's wallet import format
wif = Bitcoin.wif_from_private private_key, "80"
```
I am wondering, is there any sensible way to have namespaces for submodules?
For example: Secp256k1::Bitcoin.wif_from_private priv instead of referencing Bitcoin directly? I'm asking because maybe other libraries also reference such a namespace and I want clarity for developers using this library.
I'm currently refactoring the library and wondering what the best practice is for arranging the modules exposed by the library.
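One possible approach, sketched below using the method names from the snippet above (the bodies are stubbed out, so treat this as an illustration of the namespacing rather than the library's actual implementation): nest the coin-specific helpers inside the top-level `Secp256k1` module, so the shard exposes a single constant and `Secp256k1::Bitcoin.wif_from_private` works as hoped.
```crystal
require "big"

# Nesting keeps the shard's public surface to one top-level constant.
module Secp256k1
  # generic curve operations stay at the top level
  def self.new_private_key : BigInt
    BigInt.new(1) # key generation elided in this sketch
  end

  # coin-specific helpers become submodules
  module Bitcoin
    def self.wif_from_private(key : BigInt, version : String) : String
      "5HueCGU..." # WIF encoding elided in this sketch
    end
  end
end

priv = Secp256k1.new_private_key
wif  = Secp256k1::Bitcoin.wif_from_private(priv, "80")
```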
submitted by q9fm to crystal_programming [link] [comments]

Upcoming Updates to Bitcoin Consensus

Price and Libra posts are shit boring, so let's focus on a technical topic for a change.
Let me start by presenting a few of the upcoming Bitcoin consensus changes.
(as these are consensus changes and not P2P changes it does not include erlay or dandelion)
Let's hope the community strongly supports these upcoming updates!

Schnorr

The sexy new signing algo.

Advantages

Disadvantages

MuSig

A provably-secure way for a group of n participants to form an aggregate pubkey and signature. Creating their group pubkey does not require their coordination other than getting individual pubkeys from each participant, but creating their signature does require all participants to be online near-simultaneously.
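As a rough sketch of the key-aggregation step (hashing only; the elliptic-curve multiplications that turn these coefficients into the aggregate key P = Σ aᵢ·Pᵢ are elided, and the hash layout here is illustrative rather than the normative MuSig encoding):

```crystal
require "big"
require "digest/sha256"

# toy compressed pubkeys (33-byte hex strings in a real setting)
pubkeys = ["02aa...", "03bb...", "02cc..."]

# L = H(P1 || P2 || ... || Pn) commits to the whole signer set
l_commit = Digest::SHA256.hexdigest(pubkeys.join)

# each signer's coefficient a_i = H(L || P_i), read as a big integer;
# the coefficients prevent rogue-key attacks on naive key addition
coefficients = pubkeys.map do |pk|
  BigInt.new(Digest::SHA256.hexdigest(l_commit + pk), base: 16)
end

# the aggregate key would be P = sum(a_i * P_i) over the curve (elided)
coefficients.each_with_index { |a, i| puts "a_#{i + 1} = #{a}" }
```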

Advantages

Disadvantages

Taproot

Hiding a Bitcoin SCRIPT inside a pubkey, letting you sign with the pubkey without revealing the SCRIPT, or reveal the SCRIPT without signing with the pubkey.
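The trick is a pay-to-contract tweak: the output key commits to both an internal key and the hidden SCRIPT. A sketch of just the commitment arithmetic (the point addition Q = P + t·G is elided, and the real construction uses tagged hashes and x-only keys rather than plain SHA256):

```crystal
require "big"
require "digest/sha256"

internal_key = "02aa..."        # compressed internal pubkey (hex, illustrative)
script       = "OP_CHECKSIG..." # the SCRIPT being hidden

# tweak scalar t = H(P || script) commits the output key to the script
t = BigInt.new(Digest::SHA256.hexdigest(internal_key + script), base: 16)
puts "tweak = #{t}"

# key-path spend: sign with the tweaked secret (p + t), never revealing the script
# script-path spend: reveal P and the script; verifiers recompute t and check
# that the output key Q really equals P + t*G, proving the script was committed
```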

Advantages

Disadvantages

MAST

Encode each possible branch of a Bitcoin contract separately, and only require revelation of the exact branch taken, without revealing any of the other branches. One of the Taproot script versions will be used to denote a MAST construction. If the contract has only one branch then MAST does not add more overhead.
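Revealing one branch works like any Merkle proof: hash the revealed leaf up the tree using the sibling hashes and compare against the committed root. A minimal sketch (plain SHA256, with sorted-pair hashing instead of the positional or tagged hashing a production implementation would use):

```crystal
require "digest/sha256"

# recompute the Merkle root from one revealed leaf (the taken branch)
# plus its authentication path of sibling hashes
def merkle_root(leaf : String, path : Array(String)) : String
  node = Digest::SHA256.hexdigest(leaf)
  path.each do |sibling|
    # sort the pair so this toy version is position-independent
    pair = node < sibling ? node + sibling : sibling + node
    node = Digest::SHA256.hexdigest(pair)
  end
  node
end

branches = ["branch A", "branch B", "branch C", "branch D"]
hashes   = branches.map { |b| Digest::SHA256.hexdigest(b) }

# to spend via "branch A", reveal only that branch, its sibling hash,
# and the hash of the (C, D) pair; B, C and D themselves stay private
h_cd = Digest::SHA256.hexdigest(hashes[2] < hashes[3] ? hashes[2] + hashes[3] : hashes[3] + hashes[2])
puts merkle_root("branch A", [hashes[1], h_cd])
```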

Advantages

Disadvantages

submitted by almkglor to Bitcoin [link] [comments]

A good reminder of what we (BCH) gained from the hardfork, a review of all the lies and broken promises by CSW

From Jonald Fyookball's Medium Article
https://medium.com/@jonaldfyookball/bitcoin-cash-is-finally-free-of-faketoshi-great-days-lie-ahead-bb0c833e4c5d
Some background on Craig’s claim of being Satoshi, for the uninitiated:
  1. He faked blog posts
  2. He faked PGP keys
  3. He faked contracts and emails
  4. He faked threats
  5. He faked a public key signing
  6. He has a well-documented history of fabricating things bitcoin and non-bitcoin related
  7. He faked a bitcoin trust to get free money from the Australian government but was caught and fined over a million dollars.
And specifically concerning his claim to be Satoshi:
  1. He has provided no independently verifiable evidence
  2. He is not technically competent in the subject matter
  3. His writing style is nothing like Satoshi’s
  4. He called bitcoin “Bit Coin” in 2011 when Satoshi never used a space
  5. He actively bought and traded coins from Mt. Gox in 2013 and 2014
  6. He was paid millions for ‘coming out’ as Satoshi as part of the deal to sell his patents to nTrust — for those who claim he was ‘outed’ or had no motive

Caught Red Handed Plagiarizing

No respectable academic, scientist, or professional needs to stoop so low as to steal and take credit for the work of others — least of all Satoshi. Yet, CSW has already been caught at least 3 times plagiarizing.
  1. His paper on selfish mining has full sections copied almost verbatim from a paper written by Liu & Wang.
  2. His “Beyond Godel” paper, which purports to show that Bitcoin script is Turing complete, is heavily plagiarized.
  3. A paper on block propagation was blatantly and intentionally plagiarized.

Can’t Even Steal Code Correctly

CSW was also caught attempting to plagiarize a “hello world” program (the simplest of all computer programs).
He apparently does not understand base58 or how Bitcoin address checksums work (both of these are common knowledge to experienced Bitcoiners), and has made other embarrassing errors.
Broken Promises
He said he was building a mining pool to “stop SegWit”
submitted by stewbits22 to btc [link] [comments]

Google’s Quantum Computing Breakthrough Brings Blockchain Resistance Into the Spotlight Again



News by Forbes: Darryn Pollock
Quantum computing has been on the tech radar for some time now, but it has also been lurking in the background of the blockchain ecosystem for very different reasons. The new advancement of computing allows for complex equations and problems to be solved exponentially quicker than is currently available.
However, it has always been a predominantly futuristic, almost science-fiction-like pursuit; for blockchain that has been just fine, because we have been warned that quantum computation could render existing encryption standards obsolete, threatening the security of every significant blockchain.
This week, news has emerged that Google has made a recent quantum computing breakthrough, achieving quantum supremacy. It is being reported that Google, using a quantum computer, managed to perform a calculation in just over three minutes that would take the world’s most powerful supercomputer 10,000 years.
This could mean panic stations for blockchain as all that has been achieved thus far could be wiped out, and without the right provisions, all the promise and potential could be eliminated overnight.
However, the term quantum supremacy refers to the moment when a quantum computer outperforms the world’s best classical computer in a specific test. This is just the first step, but it is a rather large step that means the spotlight is once again on blockchain to try and resist this kind of technology which can unravel its cryptographic algorithms in minutes.
Google’s first steps
Google has described the recent achievement as a “milestone towards full-scale quantum computing.” They have also said this milestone puts a marker in the ground from which they can start rapidly progressing towards full quantum computing — another concerning statement for blockchains.
Details are a little scarce on what Google has achieved, and how they have done it, but previous proposals essentially involve the quantum computer racing a classical computer simulating a random quantum circuit.
According to Gizmodo, it has been long known that Google has been testing a 72-qubit device called Bristlecone with which it hoped to achieve quantum supremacy and the initial report from the Financial Times says that the supremacy experiment was instead performed with a 53-qubit processor codenamed Sycamore.
However, it would be a little early to start abandoning all hope for Bitcoin, blockchain, and the emerging technology, as it is a bit more complicated than that. More so, there are already technologies and projects in place that have been trying to prepare for an age of quantum computing where blockchain is resistant.
Are blockchains ready to resist?
So, if quantum computing is making significant breakthroughs, is there any evidence of blockchains being prepared for this new age, and a new threat? There has been news of blockchain builders putting out quantum-resistant chains, such as E-cash inventor David Chaum and his latest cryptocurrency, Praxxis.

David Chaum, Elixxir, on the Moneyconf Stage during day two of Web Summit 2018 (Photo by Eoin Noonan/Web Summit via Getty Images)
QAN is another project that says it is ready for the quantum computing age, and it reacted quickly to the news of Google’s breakthrough, with Johann Polecsak, CTO of QAN, telling Bitcoin.com: “The notion of Google achieving a quantum breakthrough sounds very dramatic, but in reality, it’s hard to gauge the significance at this time. How can we be sure that Google’s quantum computer is more powerful than D-wave’s, for example, which surpassed 1,000 qubits four years ago?”
I also reached out to Polecsak to find out more about the threat of quantum computing when, and if, it reaches its pinnacle.
“We should definitely be worried,” he told me, “Many IT professionals and CTOs, including the earlier me, are neglecting and denying quantum computing threats with the simple reasoning that once it’s seriously coming, we’ll have to redesign almost everything from scratch, and that must surely be a long time ahead.”
“The truth is that one can already rent quantum computers for experimenting with possible attack algorithms and testing theoretical approaches. The maths behind breaking currently used public key cryptography — EC and RSA — were proven, we just need more qubits.”
“In cryptography, it’s best to prepare for the worst, and one can observe in recent literature that past skeptics now instantiate their crypto protocols in a post-quantum setting — just in case. Users shouldn’t worry now, but experts should prepare before it’s too late.”
QAN CTO Johann Polecsak speaking about the threat of quantum computing at a conference in Seoul, South Korea.
What it means to be quantum-resistant
Of course, the technological aspect of the race between quantum computing and blockchain quantum resistance is immense, and it is also quite nuanced. It is not as if quantum computing will, like a light switch, be available and all blockchains will suddenly be vulnerable — but it is still important to be prepared. As it stands, there probably is not enough preparation and planning in place, according to Polecsak.
“Blockchains won’t be ready for such a breakthrough. Since transaction history is the backbone of blockchains, such an improvement in quantum computing could be catastrophic for the whole transaction history,” added the CTO. “There is an extra layer of protection with Bitcoin’s double hashing but assuming a quantum computer is capable of Shor on secp256k1 it’s safe to assume it’s also capable of Grover256. Also, we don’t know bounds for SHA regarding quantum circuits.”
“As for QAN blockchain platform, it is not a linear comparison or a race where we need to keep up side-by-side with increasing qubits. Being Quantum-safe does not mean that we are just increasing bits in currently used algorithms, but that we take a totally different approach which resists the known Quantum attacks by design.”
Prepare to resist
As science-fictiony as it sounds, quantum computing is a threat that needs to be taken seriously in the world of blockchains. It may not be the kill switch that everyone imagines because of media hype, but it is certainly something that should be on the radar for anyone involved in the ecosystem.
It is not only because of what has been accomplished in blockchain thus far but also because of what is being built and promised in the space. Blockchain is a major technology revolution on the horizon, and as it permeates deeper into enterprises and governments it would be catastrophic for all that has been done to be undone, and all that has been promised to be eliminated.
submitted by GTE_IO to u/GTE_IO [link] [comments]

Bitcoin Verde v1.1.0 : Schnorr Signatures

For those unacquainted:
Bitcoin Verde is an indexing full-node, written from the ground up in Java. Bitcoin Verde comes with a block explorer, and a stratum mining pool. While we consider v1.1.0 still a beta release, our public node (https://bitcoinverde.org) has been stable since January.
Bitcoin Verde is the first to write a Schnorr signature verification algorithm for Java (using Pieter Wuille’s specification); if other implementations or wallets need to use our implementation as a reference, it can be located here. Verde’s Schnorr implementation has been tested against the same suite of tests as ABC’s (test file located here). We intend to submit a pull request to Bouncy Castle sometime in the future.
Things that are new this release:
Similar to our previous release, Bitcoin Verde is very likely incompatible with Windows. Furthermore, it's an indexing node, and because of that will have more system requirements than a traditional node due to database indexes and the inherent underlying database structure. Our fully synced bitcoinverde.org node is currently using 615G disk space, and 21G of memory. Bitcoin Verde can be configured to run with far less memory, with a minimum around 2G. (Disable UTXO caching, disable TX Bloom Filter, set max database memory to ~1.5G). We highly recommend running BV on an SSD or M2. Traditional HDD drives are awful at random reads, and the last attempted initial-block-download we performed on an HDD took about 2+ weeks to complete.
If you have any problems with your node, please reach out to us at [email protected]. We'd be happy to help and troubleshoot problems. Alternatively, you can report bugs or issues to our github.
We've been diligently working on a mobile wallet based on the Bitcoin Verde codebase. We hope to provide a more modern replacement for bitcoinj, while also supporting SLP tokens. We are also ensuring all Bitcoin Verde SPV code can be transpiled to Objective-C (with Swift Bindings) for use on iOS. Finally, Bitcoin Verde is experimenting with a new message to improve the initial synchronization of SPV wallets called addrblocks, which will greatly improve the ability to validate SLP tokens. Additionally/alternatively, we have been looking at supporting BIP-157 for a more privacy-oriented way to achieve near-instant SPV synchronization; we look forward to sharing these thoughts in the near future.
Again, thank you to the XT team for your support. Another thank you for the ABC team for welcoming us into your conversations, and for helping us understand some of the nuanced aspects of this HF.
To install/upgrade your nodes, clone/pull master at https://github.com/softwareverde/bitcoin-verde, and reference the README under Installing/Upgrading.
submitted by FerriestaPatronum to btc [link] [comments]

The core concepts of DTube's new blockchain

Dear Reddit community,
Following our announcement for DTube v0.9, I have received countless questions about the new blockchain part, avalon. First I want to make it clear, that it would have been utterly impossible to build this on STEEM, even with the centralized SCOT/Tribes that weren't available when I started working on this. This will become much clearer as you read through the whole wall of text and understand the novelties.
SteemPeak says this is a 25 minutes read, but if you are truly interested in the concept of a social blockchain, and you believe in its power, I think it will be worth the time!

MOVING FORWARD

I'm a long time member of STEEM, with tens of thousands of staked STEEM for 2 years+. I understand the instinctive fear from the other members of the community when they see a new crypto project coming out. We've had two recent examples with the VOICE and LIBRA announcements, which were either hated or ignored. When you are invested morally and financially, and you see competitors popping up, it's normal to be afraid.
But we should remember competition is healthy, and learn from what these projects are doing and how it will influence us. Instead, by reacting the way STEEM reacts, we are putting our heads in the sand and failing to adapt. I currently see STEEM like the "North Korea of blockchains", trying to do everything better than other blockchains, while being #80 on coinmarketcap and slowly but surely losing positions over the months.
When DLive left and revealed their own blockchain, it really got me thinking about why they did it. The way they did it was really scummy and flawed, but I concluded that in the end it was a good choice for them to try to develop their activity, while others waited for SMTs. Sadly, when I tried their new product, I was disappointed, they had botched it. It's purely a donation system, no proof of brain... And the ultra-majority of the existing supply is controlled by them, alongside many other 'anti-decentralization' features. It's like they had learnt nothing from their STEEM experience at all...
STEEM was still the only blockchain able to distribute crypto-currency via social interactions (and no, 'donations' are not social interactions, they are monetary transfers; bitcoin can do it too). It is the killer feature we need. Years of negligence or greed from the witnesses/developers about the economic balance of STEEM is what broke this killer feature. Even when proposing economical changes (which are actually getting through finally in HF21), the discussions have always been centered around modifying the existing model (changing the curve, changing the split, etc), instead of developing a new one.
You never change things by fighting the existing reality.
To change something, build a new model that makes the existing model obsolete.
What if I built a new model for proof of brain distribution from the ground up? I first tried playing with STEEM clones, I played with EOS contracts too. Both systems couldn't do the concepts I wanted to integrate for DTube, unless I did a major refactor of tens of thousands of lines of code I had never worked with before. Making a new blockchain felt like a lighter task, and more fun too.
Before even starting, I had a good idea of the concepts I'd love to implement. Most of these bullet points stemmed from observations of what happened here on STEEM in the past, and what I considered weaknesses for d.tube's growth.

NO POWER-UP

The first concept I wanted to implement deep down in the core of how a DPOS chain works is that I didn't want the token to be staked, at all (i.e. no 'powering up'). The cons of staking for a decentralized social platform are obvious:
* complexity for the users with the double token system
* difficulty onboarding people, as they need to freeze their money, akin to a pyramid scheme
The only good thing about staking is how it can fill your bandwidth and your voting power when you power-up, so you don't need to wait for it to grow to start transacting. In a fully-liquid system, your account resources start at 0% and new users will need to wait for them to grow before they can start transacting. I don't think that's a big issue.
That meant that witness elections had to be run out of the liquid stake. Could it be done? Was it safe for the network? Can we update the cumulative votes for witnesses without rounding issues? Even when the money flows between accounts freely?
Well I now believe it is entirely possible and safe, under certain conditions. The incentive for top witnesses to keep on running the chain is still present even if the stake is liquid. With a bit of discrete mathematics, it's easy to have a perfectly deterministic algorithm to run a decentralized election based off liquid stake, it's just going to be more dynamic as the funds and the witness votes can move around much faster.

NO EARLY USER ADVANTAGE

STEEM has had multiple events that influenced the distribution in a bad way. The most obvious one is the inflation settings. One day it was hella-inflationary, then suddently hard fork 16 it wasn't anymore. Another major one, is the non-linear rewards that ran for a long time, which created a huge early-user advantage that we can still feel today.
I liked linear rewards, it's what gives minnows their best chance while staying sybil-resistant. I just needed Avalon's inflation to be smart, not hyper-inflationary. The key metric to consider for this issue is the number of tokens distributed per user per day. If this metric goes down, then the incentive for staying on the network and playing the game goes down every day. You feel like you're making less and less from your efforts. If this metric goes up, the number of printed tokens goes up, the token is hyper-inflationary, and holding it feels really bad if you aren't actively earning from the inflation by playing the game.
Avalon ensures that the number of printed tokens is proportional to the number of users with active stake. If more users come in, avalon prints more tokens, if users cash-out and stop transacting, the inflation goes down. This ensures that earning 1 DTC will be about as hard today, tomorrow, next month or next year, no matter how many people have registered or left d.tube, and no matter what happens on the markets.
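A sketch of that emission rule (the per-user rate here is an assumed placeholder, not Avalon's actual parameter):

```crystal
# Tokens printed per day scale linearly with the number of active
# stakeholders, so per-user earning difficulty stays roughly constant.
PER_USER_DAILY_EMISSION = 0.5 # DTC per user per day, illustrative only

def daily_emission(active_stakers : Int32) : Float64
  active_stakers * PER_USER_DAILY_EMISSION
end

puts daily_emission(10_000) # => 5000.0 DTC minted that day
puts daily_emission(2_500)  # emission shrinks if users cash out and leave
```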

NO LIMIT TO MY VOTING POWER

Another big issue that most steemians don't really know about, but that is really detrimental to STEEM, is how the voting power mana bar works. I guess having to manage a 2M SP delegation for @dtube really convinced me of this one.
When your mana bar is full at 100%, you lose out on the potential power generation, and the rewards coming from it. And it only takes 5 days to go from 0% to 100%. A lot of people have very valid reasons to be offline for 5 days+, and they shouldn't be punished so hard. This is why almost all big stakeholders make sure to always spend some of their voting power on a daily basis. And this is why minnows or smaller holders miss out on tons of curation rewards, unless they delegate to a bidbot or join some curation guild… meh. I guess a lot of people would rather just cash out and avoid the trouble of having to optimize their stake.
So why is it even a mana bar? Why can't it grow forever? Well, everything in a computer has to have a limit, but why is this limit proportional to my stake? While I totally understand the purpose of making the bandwidth limited and forcing big stakeholders to waste it, I think it's totally unneeded and ill-suited for the voting power. As long as the growth of the VP is proportional to the stake, the system stays sybil-resistant, and there could technically be no limit at all if it wasn't for the fact that this is run on a computer where numbers have a limited number of bits.
On Avalon, I made it so that your voting power grows virtually indefinitely, or at least I don't think anyone will ever reach the current limit of Number.MAX_SAFE_INTEGER: 9007199254740991 or about 9 Peta VP. If you go inactive for 6 months on an account with some DTCs, when you come back you will have 6 months worth of power generation to spend, turning you into a whale, at least for a few votes.
Another awkward limit on STEEM is how a 100% vote spends only 2% of your power. Not only STEEM forces you to be active on a daily basis, you also need to do a minimum of 10 votes / day to optimize your earnings. On Avalon, you can use 100% of your stored voting power in a single mega-vote if you wish, it's up to you.

A NEW PROOF-OF-BRAIN

No Author rewards

People should vote with the intent of getting a reward from it. If 75% of the value forcibly goes to the author, it's hard to expect a good return from curation. Steem is currently basically a complex donation platform. No one wants to donate when they vote, no matter what they will say, and no matter how much vote-trading, self-voting or bid-botting happens.
So in order to keep a system where money is printed when votes happen, if we cannot use the username of the author to distribute rewards, the only possibility left is to use the list of previous voters, aka “curation rewards”: the 25% interesting part of STEEM that has been totally overshadowed by the author rewards for too long.

Downvote rewards

STEEM has always suffered from the issue that the downvote button is unused, or when it's used, it's mostly for evil. This comes from the fact that in STEEM's model, downvotes are not eligible for any rewards. Even if they were, your downvote would be lowering the final payout of the content, and your own curation rewards...
I wanted Avalon's downvotes to be completely symmetric to the upvotes. That means if we revert all the votes (upvotes become downvotes and vice versa), the content should still distribute the same amount of tokens to the same people, at the same time.

No payment windows

Steem has a system of payment windows. When you publish content, it opens a payment window where people can freely upvote or downvote to influence the payout happening 7 days later. This is convenient when you want a system where downvotes lower rewards. Waiting 7 days to collect rewards is also another friction point for new users; some of them might never come back 7 days later to convince themselves that 'it works'. On avalon, when you are part of the winners of curation after a vote, you earn it instantly in your account, 100% liquid and transferable.

Unlimited monetization in time

Indeed, the 7-day monetization limit has been our biggest issue for our video platform since day 8. This incentivized our users to create more frequent, but lesser quality content, as they knew that they weren't going to earn anything over the 'long haul'. Monetization had to be unlimited on DTube, so that even a 2-year-old video could be dug up and generate rewards in the far future.
Infinite monetization is possible, but as removing tokens from a balance is impossible, the downvotes cannot remove money from the payout like they do on STEEM. Instead, downvotes print money in the same way upvotes do, downvotes still lower the popularity in the hot and trending and should only rewards other people who downvoted the same content earlier.

New curation rewards algorithm

STEEM's curation algorithm isn't stupid, but I believe it lacks some elegance. The 15 minutes 'band-aid' necessary to prevent curation bots (bots who auto-vote as fast as possible on contents of popular authors) proves it. The way it distributes the reward also feels very flat and boring. The rewards for my votes are very predictable, especially if I'm the biggest voter / stake holder for the content. My own vote is paying for my own curation rewards, how stupid is that? If no one else votes after my big vote despite a popularity boost, it probably means I deserve 0 rewards, no?
I had to try different attempts to find an algorithm yielding interesting results, with infinite monetization, and without obvious ways to exploit it. The final distribution algorithm is more complex than STEEM's curation, but it's still pretty simple. When a vote is cast, we calculate the 'popularity' at the time of the vote. The first vote is given a popularity of 0; the next votes are defined by (total_vp_upvotes - total_vp_downvotes) / time_since_1st_vote. Then we look into the list of previous votes, and we remove all votes in the opposite direction (up/down). Then we remove all the votes with a higher popularity if it's an upvote, or the ones with a lower popularity if it's a downvote. The remaining votes in the list are the 'winners'. Finally, akin to STEEM, the amount of tokens generated by the vote will be split between winners proportionally to the voting power spent by each (linear rewards - no advantages for whales) and distributed instantly. Instead of purely using the order of the votes, Avalon distribution is based on when the votes are cast, and each second that passes reduces the popularity of a content, potentially increasing the long-term ROI of the next vote cast on it.
Graph: It's possible to chart the popularity that influences the DTC monetary distribution directly in the d.tube UI.
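A toy sketch of the winner selection just described (the record fields and numbers are mine for illustration, not Avalon's actual source):

```crystal
record Vote, voter : String, vp : Int64, up : Bool, popularity : Float64

# Winners for a new vote: previous votes in the SAME direction that were
# cast at a LOWER popularity (for an upvote) or a HIGHER one (for a downvote).
def winners(previous : Array(Vote), new_vote : Vote) : Array(Vote)
  same_side = previous.select { |v| v.up == new_vote.up }
  if new_vote.up
    same_side.select { |v| v.popularity < new_vote.popularity }
  else
    same_side.select { |v| v.popularity > new_vote.popularity }
  end
end

# Tokens minted by the new vote are split among winners pro rata to VP spent.
def payouts(winners : Array(Vote), minted : Float64) : Hash(String, Float64)
  result = {} of String => Float64
  return result if winners.empty?
  total = winners.sum(&.vp)
  winners.each { |v| result[v.voter] = minted * v.vp / total }
  result
end

history = [
  Vote.new("author", 100_i64, true, 0.0), # first vote is pinned to popularity 0
  Vote.new("early", 50_i64, true, 2.5),
  Vote.new("late", 80_i64, true, 9.0),
]
whale = Vote.new("whale", 500_i64, true, 6.0) # popularity 6.0 when cast

puts payouts(winners(history, whale), 10.0) # author & early split 10 DTC; late gets 0
```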
This algorithm ensures there are always losers. The last upvoter never earns anything, and neither does the person who upvoted at the highest popularity, nor the one who downvoted at the lowest popularity. All the other ones in the middle may or may not receive anything, depending on how the voting and popularity evolved in time. The one with an obvious advantage is the first voter, who is always counted as 0 popularity. As long as the content stays at a positive popularity, every upvote will earn him rewards. Similarly, being the first downvoter on an overly-popular content could easily earn you 100% rewards on the next downvote, which could be from a whale, earning you a fat bonus.
While Avalon doesn't technically have author rewards, the first-voter advantage is strong, and the author has the advantage of always being the first voter, so the author can still earn from his potentially original creations, he just needs to commit some voting power on his own contents to be able to publish.

ONE CHAIN <==> ONE APP

More scalable than shared blockchains

Another issue with generalistic blockchains like ETH/STEEM/EOS/TRX, which are currently hosting dozens of semi-popular web/mobile apps, is the reduced scalability of such shared models. Again, everything in a computer has a limit. For DPOS blockchains, 99%+ of the CPU load of a producing node will be to verify the signatures of the many transactions coming in every 3 seconds. And sadly this fact will not change with time. Even if we had a huge breakthrough on CPU speeds today, we would need to update the cryptographic standards for blockchains to keep them secure. This means it would NOT become easier to scale up the number of verifiable transactions per seconds.
Oh, but we are not there yet you're thinking? Or maybe you think that we'll all be rich if we reach the scalability limits so it doesn't really matter? WRONG
The limit is the number of signature verifications the most expensive CPU on the planet can do. Most blockchains use the secp256k1 curve, including Bitcoin, Ethereum, Steem and now Avalon. It was originally chosen for Bitcoin by Satoshi Nakamoto probably because it's decently quick at verifying signatures, and seems to be backdoor-proof (or else someone is playing a very patient game). Maybe some other curves exist with faster signature verification speed, but it won't be improved many-fold, and will likely require much research, auditing, and time to get adopted considering the security implications.
In 2015 Graphene was created, and Bitshares was completely rewritten. This was able to achieve 100,000 transaction per second on a single machine, and decentralized global stress testing achieved 18,000 transactions per second on a distributed network.
So BitShares/STEEM and other DPOS graphene chains in production can validate at most 18000 txs/sec, so about 1.5 billion transactions per day. EOS, Tendermint, Avalon, LIBRA or any other DPOS blockchain can achieve similar speeds, because there's no planet-killing proof-of-works, and thanks to the leader-based/democratic system that reduces the number of nodes taking part in the consensus.
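The arithmetic behind that daily figure, as a quick sanity check:

```crystal
tps = 18_000_i64           # measured DPOS throughput ceiling
txs_per_day = tps * 86_400 # 60 * 60 * 24 seconds in a day
puts txs_per_day           # => 1555200000, roughly 1.5 billion tx/day
```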
As a comparison, there are about 4 billion likes per day on instagram, so you can probably double that with the actual uploads, stories and comments, password changes, etc. The load is also likely unstable through the day, probably some hours will go twice as fast as the average. You wouldn't be able to fit Instagram in a blockchain, ever, even with the most scalable blockchain tech on the world's best hardware. You'd need like a dozen of those chains. And instagram is still a growing platform, not as big as Facebook, or YouTube.
So, splitting this limit between many popular apps? Madness! Maybe it's still working right now, but when many different apps reach millions of daily active users plus bots, it won't fit anymore.
Serious projects with a big user base will need to rethink the shared blockchain models like Ethereum, EOS, TRX, etc because the fees in gas or necessary stake required to transact will skyrocket, and the victims will be the hordes of minnows at the bottom of the distribution spectrum.
If we can't run a full instagram on a DPOS blockchain, there is absolutely no point trying to run medium+reddit+insta+fb+yt+wechat+vk+tinder on one. Being able to run half an instagram is already pretty good and probably enough to actually onboard a fair share of the planet. But if we multiply the load by the number of different app concepts available, then it's never gonna scale.
DTube chain is meant for the DTube UI only. Please do not build something unrelated to video connecting to our chain, we would actively do what we can to prevent you from growing. We want this chain to be for video contents only, and the JSON format of the contents should always follow the one used by d.tube.
If you are interested in avalon tech but your project isn't about video, it's strongly suggested to fork the blockchain code and run your own avalon chain with a different origin id, instead of trying to connect your project to dtube's mainnet. If you still want to do it, chain leaders would be forced to actively combat your project, as we would consider it useless noise inside our dedicated blockchain.

Focused governance

Another issue of sharing a blockchain, is the issues coming up with the governance of it. Tons of features enabled by avalon would be controversial to develop on STEEM, because they'd only benefit DTube, and maybe even hurt/break some other projects. At best they'd be put at the bottom of a todo list somewhere. Having a blockchain dedicated to a single project enables it to quickly push updates that are focused on a single product, not dozens of totally different projects.
Many blockchain projects are trying to make decentralized governance true, but this is absolutely not what I am interested in for DTube. Instead, in avalon the 'init' account, or 'master' account, has very strong permissions. In the DTC case, @dtube:
* will earn 10% fees from all the inflation
* will not have to burn DTCs to create accounts
* will be able to do certain types of transactions when others can't:
  * account creation (during the steem exclusivity period)
  * transfers (during the IEO period)
  * transferring voting power and bandwidth resources (used for easier onboarding)
For example, for our IEO we will setup a mainnet where only @dtube is allowed to transfer funds or vote until the IEO completes and the airdrop happens. This is also what enabled us to create a 'steem-only' registration period on the public testnet for the first month. Only @dtube can create accounts, this way we can enforce a 1 month period where users can port their username for free, without imposters having a chance to steal usernames. Through the hard-forking mechanism, we can enable/disable these limitations and easily evolve the rules and permissions of the blockchain, for example opening monetary transfers at the end of our IEO, or opening account creation once the steem exclusivity ends.
Luckily, avalon is decentralized, and all these parameters (like the @dtube fees, and @dtube permissions) are easily hardforkable by the leaders. @dtube will however be a very strong leader in the chain, as we plan to use our vote to at least keep the #1 producing node for as long as we can.
We reserve the right to 'not follow' an hardfork. For example, it's obvious we wouldn't follow something like reducing our fees to 0% as it would financially endanger the project, and we would rather just continue our official fork on our own and plug d.tube domain and mobile app to it.
On the other end of the spectrum, if other leaders think @dtube is being tyranical one way or another, leaders will always have the option of declining the new hardforks and putting the system on hold, then @dtube will have an issue and will need to compromise or betray the trust of 1/3 of the stake holders, which could reveal costly.
The goal is to have a harmonious, enterprise-level decision making process within the top leaders. We expect these leaders to be financially and emotionally connected with the project and act for good. @dtube is expected to be the main good actor for the chain, and any permission given to it should be granted with the goal of increasing the DTC marketcap, and nothing else. Leaders and @dtube should be able to keep cooperation high enough to keep the hard-forks focused on the actual issues, and flowing faster than other blockchain projects striving for a totally decentralized governance, a goal they are unlikely to ever achieve.

PERFECT IMBALANCE

A lot of hard-forking

Avalon is easily hard-forkable, and will get hard-forked often, on purpose. No replays will be needed for leaders/exchanges during these hard-forks, just pull the new hardfork code, and restart the node before the hard-fork planned time to stay on the main fork. Why is this so crucial? It's something about game theory.
I have no formal proof for this, but I assume a social and financial game akin to the one played on steem since 2016 to be impossible to perfectly balance, even with a thorough dichotomical process. It's probably because of some psychological reason, or maybe just the fact that humans are naturally greedy. Or maybe it's just because of the sheer number of players. They can gang up together, try to counter each other, and find all sorts of creative ideas to earn more and exploit each other. In the end, the slightest change in the rules can cause drastic gameplay changes. It's a real problem; luckily it's been faced by other people in the past.
Similarly to what popular and successful massively multiplayer games have achieved, I plan to patch or suggest hard-forks for avalon's mainnet on a bi-monthly basis. The goal of this perfect imbalance concept is to force players to re-discover their best strategy often. By introducing regular, small, and semi-controlled changes into this chaos, we can fake balance. This will require players to be more adaptive and aware of the changes. This prevents the game from becoming stale and boring for players, while staying fair.

Death to bots

Automators, on the other side, will need to re-think their bots and go through the development and testing phase again on every new hard-fork. It will be an unfair cat-and-mouse game. Doing small and semi-random changes in frequent hard-forks will be an easy task for the dtube leaders, compared to the workload generated to maintain the bots. In the end, I hope their return on investment will be much lower compared to the bid-bots, up to a point where there will be no automation.
Imagine how different things would have been if SteemIt Inc acted strongly against bid-bots or other forms of automation when they started appearing? Imagine if hard-forks were frequent and they promised to fight bid-bots and their ilk? Who would be crazy enough to make a bid-bot apart from @berniesanders then?
I don't want you to earn DTCs unless you are human. The way you are going to prove you are human, is not by sending a selfie of you with your passport to a 3rd party private company located on the other side of the world. You will just need to adapt to the new rules published every two weeks, and your human brain will do it subconsciously by just playing the voting game and seeing the rewards coming.
All these concepts are aimed at directly improving d.tube, making it more resilient, and scale both technologically and economically. Having control over the full tech stack required to power our dapp will prevent issues like the one we had with the search engine, where we relied too heavily on a 3rd party tool, and that created a 6-months long bug that basically broke 1/3 of the UI.
While d.tube's UI can now totally run independently from any other entity, we kept everything we could working with STEEM, and the user is now able to transparently publish/vote/comment videos on 2 different chains with one click. This way we can keep on leveraging the generalistic good features of STEEM that our new chain doesn't focuses on doing, such as the dollar-pegged token, the author rewards/donation mechanism, the tribes/communities tokens, and simply the extra exposure d.tube users can get from other website (steemit.com, busy.org, partiko, steempeak, etc), which is larger than the number of people using d.tube directly.
The public testnet has been running pretty well for 3 weeks now, with 6000+ accounts registered, and already a dozen of independant nodes popping up and running for leaders. The majority of the videos are cross-posted on both chains and the daily video volume has slightly increased since the update, despite the added friction of the new 'double login' system and several UI bugs.
If you've read this article, I'm hoping to get some reactions from you in the comments section!
Some even more focused articles about avalon are going to pop up on my blog in the following weeks, such as how to get a node running and how to run for leader/witness, so feel free to follow me to get more news and help me reach 10K followers ;)
submitted by nannal to dtube [link] [comments]

In response to ProofOfResearch's misleading article on NEO.

Yesterday, I was made aware of an article published by ProofOfResearch almost entirely based on a Reddit post that I had written a few months ago. About a month ago I was contacted by Randomshortdude (supposedly ProofOfResearch himself) asking for a permission to use the excerpts from the aforementioned post in his write-up about NEO. As an avid proponent of inclusivity and transparency, I gave a permission to use the contents of my post (the screenshots of the entire conversation will be added below), providing him with links to the Github repos and updating him on the fixes and improvements that have happened since the post had been published. Unknowingly, I continued to work on my projects while my post was being molded into a foundation for an entirely misleading and unfathomably unscientific article.
This post is going to consist of a list of excerpts from the article and the corresponding refutal for each of the listed excerpts.
"This is a semantic issue (example: $BTC having a 1 MB block size + 10 min block time limits TPS; no way around that) meaning that this is immutable"
Bitcoin doesn't have a 10 minute block time limit coded into the platform. The 10 minute average block production time is obtained via a difficulty adjustment formula that retargets the underlying HashCash PoW algorithm every 2016 blocks, based on how long the previous window took to produce (due to an off-by-one bug that was never fixed, the window actually measures only 2015 block intervals).
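A sketch of that retargeting rule (constants from Bitcoin's consensus parameters; the clamp limiting any single adjustment to a factor of 4 is part of the real rule too):

```crystal
require "big"

TARGET_SPACING = 600_i64  # 10 minutes per block
INTERVAL       = 2016_i64 # blocks per retarget window
EXPECTED       = TARGET_SPACING * INTERVAL

# new_target = old_target * actual_timespan / expected_timespan
# (a larger target means lower difficulty, and vice versa)
def retarget(old_target : BigInt, actual_timespan : Int64) : BigInt
  span = actual_timespan.clamp(EXPECTED // 4, EXPECTED * 4)
  old_target * span // EXPECTED
end
```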
"I’m not sure it’s even possible to change the digital signature of a protocol without a major hard fork, and there isn’t an alternative digital signature (that I think of), that would make this any more secure."
This excerpt is written in a reference to Point 2 of my Reddit post that criticizes the use of multisigs as a proof of the fact that a quorum (at least 2f + 1) replicas had signed the block hash. The use of multisig instead of signature batching via Schnorr's signatures doesn't affect the security of the nodes or the cryptographic standards used, however, the security of the network as a whole can be compromised due to the decreased number of full/light nodes operating increasing the likelihood of a spam attack being able to degrade the performance of the platform. Apart from that, a digital signature algorithm of the platform can be easily changed by adjusting the versioning of the block and transaction structures.
"Therefore, the consensus algo itself would need to be changed to amend this issue."
The consensus protocol works independently from the cryptographic standards of the platform, so a switch to a different elliptic curve or digital signature algorithm will have zero impact on the consensus algorithm.
"This, in itself, might be what stops $NEO from ever being able to truly scale."
While digital signature algorithms can vary in signing and verification speeds, the difference in the performance of the most popular signature schemes is small enough (except for BLS) to be considered to have a negligible impact on the efficiency of the consensus. As long as the nodes are running on an efficient implementation, average network throughput is going to continue to be the main bottleneck of the platform.
"Digital signatures are somewhat complex, but not incomprehensible if you really take the time to sit down and understand it. Once again though, it’s going to rely on an understanding of blockchain tech as well to know how this impacts the signing feature of a TX itself as well as pub key creation"
Digital signature algorithms play no role in public key creation as a public key is created simply by multiplying a 256-bit entropy (private key) by a generator (G).
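For the curious, that keygen operation fits in a few dozen lines. A naive double-and-add sketch over secp256k1 (fine for illustration; far too slow and side-channel-unsafe for production use):

```crystal
require "big"

# secp256k1 domain parameters
P  = BigInt.new("fffffffffffffffffffffffffffffffffffffffffffffffffffffffefffffc2f", base: 16)
GX = BigInt.new("79be667ef9dcbbac55a06295ce870b07029bfcdb2dce28d959f2815b16f81798", base: 16)
GY = BigInt.new("483ada7726a3c4655da4fbfc0e1108a8fd17b448a68554199c47d08ffb10d4b8", base: 16)

# square-and-multiply modular exponentiation
def pow_mod(base : BigInt, exp : BigInt) : BigInt
  result = BigInt.new(1)
  base %= P
  while exp > 0
    result = result * base % P if exp % 2 == 1
    base = base * base % P
    exp >>= 1
  end
  result
end

# modular inverse via Fermat's little theorem (P is prime): a^(P-2) mod P
def inv(a : BigInt) : BigInt
  pow_mod(a % P, P - 2)
end

# affine point addition/doubling; nil stands for the point at infinity
def add(a : {BigInt, BigInt}?, b : {BigInt, BigInt}?) : {BigInt, BigInt}?
  return b if a.nil?
  return a if b.nil?
  if a[0] == b[0]
    return nil if (a[1] + b[1]) % P == 0        # inverse points cancel out
    m = 3 * a[0] * a[0] % P * inv(2 * a[1]) % P # tangent slope (doubling)
  else
    m = (b[1] - a[1]) * inv(b[0] - a[0]) % P    # chord slope
  end
  x = (m * m - a[0] - b[0]) % P
  {x, (m * (a[0] - x) - a[1]) % P}
end

# double-and-add scalar multiplication: pubkey = priv * G
def mul(k : BigInt, point : {BigInt, BigInt}) : {BigInt, BigInt}?
  acc : {BigInt, BigInt}? = nil
  addend : {BigInt, BigInt}? = point
  while k > 0
    acc = add(acc, addend) if k % 2 == 1
    addend = add(addend, addend)
    k >>= 1
  end
  acc
end

priv = BigInt.new("1234567890abcdef", base: 16) # toy private key
puts mul(priv, {GX, GY})                        # the corresponding public key point
```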


A screenshot of a tweet used in the article.
Baffling. ed25519 DSA does not impact the efficiency of BFT and "blockchain" (whatever the hell that means in this context) as a result. Please also note that NEO does not use ed25519. NEO uses secp256r1 (as opposed to secp256k1 used by Bitcoin, which is a Koblitz curve) which is a NIST-recommended elliptic curve.
"Regular PoW algos are already designed to be Byzantine fault-tolerant already"
While being technically correct, the author dismisses the fact that BFT algorithms offer Byzantine fault-tolerance under rigid mathematical assumptions, in contrast to PoW algorithms which offer Byzantine fault-tolerance under probabilistic assumptions.
"Byzantine Fault Tolerance is not an issue though. It’s actually really useful but for private blockchains."
A common misconception about the use of BFT algorithms in "public" (the author meant permissioned/permission-less) blockchains. BFT algorithms are only required to retain the permissioned status during the agreement phase (meaning that the new candidates will have to wait until the next consensus round to be able to participate in the consensus) and can have a round robin algorithm implemented to select the next pool of validators.
"Of course, in a decentralized protocol — something like that is very hard to achieve."
The research paper quoted in the article examines the efficiency of Castro and Liskov's PBFT (Practical Byzantine Fault Tolerance) algorithm, which is dissimilar from dBFT because PBFT doesn't require a primary change after every consensus round, which impacts the performance in a decentralized network.
“At the other extreme, Hyperledger uses the classic PBFT protocol, which is communication bound: O(N²) where N is the number of nodes. PBFT can tolerate fewer than N/3 failures, and works in three phases in which nodes broadcast messages to each other. First, the pre-prepare phase selects a leader which chooses a value to commit. Next, the prepare phase broadcasts the value to be validated. Finally, the commit phase waits for more than two third of the nodes to confirm before announcing that the value is committed. PBFT has been shown to achieve liveness and safety properties in a partially asynchronous model [11], thus, unlike PoW, once the block is appended it is confirmed immediately. It can tolerate more failures than PoW (which is vulnerable to 25% attacks [26]). However, PBFT assumes that node identities are known, therefore it can only work in the permissioned settings. Additionally, the protocol is unlikely to be able to scale to the network size of Ethereum because of its communication overhead.”
This statement will require a separate post to examine the real-world "permission-lessness" of PoW chains.
"NEO codebase is virtually abandoned."
neo-sharp? neo-go?
"This is purportedly in favor of $NEO 3.0, but there’s no GitHub for $NEO 3.0 (at least not any that I’ve found)"
https://github.com/neo-project/neo/pull/288.
"The idea of it being able to handle 1000 TPS has been thoroughly debunked and it is virtually impossible (probably entirely impossible) for $NEO to create a public blockchain based on DBFT (essentially POS+BFT semantically), that keeps the same encryption signatures (which are probably the only ones that will reliably serve the purpose of crypto where collision resistance must be all but a guarantee)."
dBFT cannot be equated to PoS + BFT as none of those are delegate-centered protocols. How was 1000 TPS thoroughly debunked? With the neo-sharp implementation and Akka being launched, I don't see a reason for dBFT to not be able to surpass 1,000 TPS during peak loads (not during sustained loads though). The excerpt about the collision resistance of "encryption signatures" (?) makes no sense to me.
Here are the promised screenshots of our conversation:
Screenshot 1

Screenshot 2

Screenshot 3
P.S. It is sad to see the so-called "researchers" attracting a mass following despite being clueless about the technology they are trying to review.

submitted by toghrulmaharramov to NEO [link] [comments]

You Can Now Prove a Whole Blockchain With One Math Problem — Really



Article by Coindesk: William Foxley
The Electric Coin Company (ECC) says it discovered a new way to scale blockchains with “recursive proof composition,” a proof to verify the entirety of a blockchain in one function. For the ECC and zcash, the new project, Halo, may hold the key to privacy at scale.
Zcash is a privacy coin based on zero-knowledge proofs referred to as zk-SNARKs, and its current underlying protocol relies on “trusted setups.” These mathematical parameters were used twice in zcash’s short history: upon its launch in 2016 and at its first large protocol change, Sapling, in 2018.
Zcash masks transactions through zk-SNARKs, but the creation of the initial parameters remains an issue. By not destroying a transaction’s mathematical foundation — the trusted setup — the holder can produce forged zcash.
Moreover, the elaborate ‘ceremonies‘ the zcash community undergoes to create the trusted setups are expensive and a weak point for the entire system. The reliance on trusted setups with zk-SNARKs was well known even before zcash’s debut in 2016. While other research failed to close the gap, recursive proofs make trusted setups a thing of the past, the ECC claims.

Bowe’s Halo

Speaking with CoinDesk, ECC engineer and Halo inventor Sean Bowe said recursive proof composition is the result of years of labor — by him and others — and months of personal frustration. In fact, he almost gave up three separate times.
Bowe began working for the ECC after his interest in zk-SNARKs was noticed by ECC CEO and zcash co-founder Zooko Wilcox in 2015. After helping launch zcash and its first significant protocol change with Sapling, Bowe moved to full-time research with the company.
Before Halo, Bowe worked on a different zk-SNARK variant, Sonic, requiring only one trusted setup.
For most cypherpunks, that’s one too many.
“People were also starting to think, as far back as 2008, that we should be able to have proofs that can verify other proofs, what we call recursive proof composition. This happened in 2014,” Bowe told CoinDesk.

Proofs, proofs and more proofs

In essence, Bowe and Co. discovered a new method of proving the validity of transactions, while masked, by compressing computational data to the bare minimum. As the ECC paper puts it, “proofs that are capable of verifying other instances of themselves.”
Blockchain transactions, such as bitcoin's and zcash's, are based on elliptic curves, with points on the curve serving as the basis for the public and private keys. The public address can be thought of as the curve: we know what the elliptic curve looks like in general. What we do not know is where the private addresses that reside on the curve are.
It is the function of zk-SNARKs to communicate about private addresses and transactions (whether an address exists and where it exists on the curve) anonymously.
The secp256k1 elliptic curve, used for bitcoin and ethereum via Hackernoon
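To make the curve talk concrete, here is a quick check that Bitcoin's generator point actually satisfies the secp256k1 equation y² = x³ + 7 over its prime field (the constants are the published secp256k1 parameters):

```crystal
require "big"

prime = BigInt.new("fffffffffffffffffffffffffffffffffffffffffffffffffffffffefffffc2f", base: 16)
gx    = BigInt.new("79be667ef9dcbbac55a06295ce870b07029bfcdb2dce28d959f2815b16f81798", base: 16)
gy    = BigInt.new("483ada7726a3c4655da4fbfc0e1108a8fd17b448a68554199c47d08ffb10d4b8", base: 16)

# secp256k1: y^2 = x^3 + 7 (mod p)
puts (gy ** 2) % prime == (gx ** 3 + 7) % prime # => true
```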
Bowe’s work is similar to bulletproofs, another zk-SNARK that requires no trusted setup. “What you should think of when you think of Halo is like recursive bulletproofs,” Bowe said.
From a technical standpoint, bulletproofs rely on the “inner product argument,” which relays certain information about the curves to one another. Unfortunately, the argument is both very expensive and time consuming compared to your typical zk-SNARK verification.
By proving multiple zk-SNARKs with one (a task thought impossible until Bowe’s research), computational energy is pruned to a fraction of the cost.
“People have been thinking of bulletproofs on top of bulletproofs. The problem the bulletproof verifier is extremely expensive because of the inner product argument,” Bowe said. “I don’t use bulletproofs exactly, I use a previous idea bulletproofs are built on.”
In fact, Bowe said recursive proofs mean you can prove the entirety of the bitcoin blockchain in less space than a bitcoin block header takes: 80 bytes of data.

The future of zcash

Writing on Twitter, Wilcox said his company is currently studying the Halo implementation as a Layer 1 solution on zcash.
Layer 1 solutions are implementations into the codebase constituting a blockchain. Most scaling solutions, like bitcoin’s Lightning Network, are Layer 2 solutions built on top of a blockchain’s state. The ECC’s interest in turning Halo into a Layer 1 solution speaks to the originality of the discovery as it will reside next to code copied from bitcoin’s creator himself, Satoshi Nakamoto.
ECC is exploring the use of Halo for Zcash to both eliminate trusted setup and to scale Zcash at Layer 1 using nested proof composition.
— zooko (@zooko) September 10, 2019
Since the early days of privacy coins, scaling has been a contentious issue: with so much data needed to mask transactions, how do you grow a global network?
Bowe and the ECC claim recursive proofs solve this dilemma: with only one proof needed to verify an entire blockchain, data concerns could be a thing of the past:
“Privacy and scalability are two different concepts, but they come together nicely here. About 5 years ago, academics were working on recursive snarks, a proof that could verify itself or another proof [and even] verify multiple proofs. So, what [recursive proof composition] means is you only need one proof to verify an entire blockchain.”
To be sure, this isn’t sophomore-level algebra: Bowe told CoinDesk the proof alone took close to nine months of gluing various pieces together.

A new way to node

A further implication of recursive proofs is the amount of data stored on the blockchain. Since the entire ledger can be verified in one function, onboarding new nodes will be easier than ever, Bowe said.
“You’re going to see blockchains that have much higher capacity because you don’t have to communicate the entire history in one go. The state chain still needs to be seen. But if you want to enter the network you don’t need to download the entire blockchain.”
While state chains still need to be monitored for basic transaction verification, syncing the entire history of a blockchain (over 400 GB for ethereum and 200 GB for bitcoin) becomes redundant.
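As a back-of-the-envelope illustration of that onboarding story, the toy model below uses an HMAC as a stand-in for SNARK soundness. The "prover" here is trusted, which is exactly what a real recursive SNARK avoids, and every name in the sketch is invented. What it captures is the structure: each proof attests to the previous proof plus one state change, so a joining node checks a single constant-size object instead of replaying the full history.

```python
# Toy model of syncing from a single recursive proof. NOT real cryptography:
# an HMAC under a trusted key stands in for SNARK soundness.
import hashlib
import hmac

KEY = b"stand-in for snark soundness"  # hypothetical; a real SNARK needs no secret

def verify(proof, state):
    """Constant-size check of the latest proof only."""
    return hmac.compare_digest(proof, hmac.new(KEY, state, hashlib.sha256).digest())

def prove(prev_proof, prev_state, new_state):
    """Produce a proof for new_state, first checking the previous proof."""
    if prev_proof is not None:
        assert verify(prev_proof, prev_state), "invalid history"
    return hmac.new(KEY, new_state, hashlib.sha256).digest()

# Build a chain of states; only the newest proof ever needs to be kept.
proof, state = None, None
for height in range(100):
    new_state = hashlib.sha256(f"state at height {height}".encode()).digest()
    proof = prove(proof, state, new_state)
    state = new_state

print("new node accepts the chain:", verify(proof, state))  # True
```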
For zcash, Halo means easier hard forks. Without trusted setups, ECC research claims, “proofs of state changes need only reference the latest proof, allowing old history to be discarded forever.”
When asked where his discovery ranks with other advancements, Bowe spoke on its practicality:
“Where does this stand in the grand scheme of things in cryptocurrency? It’s a cryptographic tool to compress computation… and scale protocols.”
submitted by GTE_IO to u/GTE_IO [link] [comments]

Craig Wright is a fraud and a hoaxer. A reminder for everyone

submitted by chek2fire to Bitcoin [link] [comments]
