
Parallels between two disruptive technologies: Internet & Blockchain – Part I

[The original article appeared on: https://block.co/blog/]
The emergence of a disruptive technology is always accompanied by the creation and development of new models, mostly resulting in new economic concepts and business structures. The rise of the internet over 30 years ago laid the foundation for the creation of new markets, for instance the bookstore that sells publications from all around the world: Amazon. It also enabled new concepts like ‘instant gratification’, which contributed to the introduction of business models like Netflix, finally allowing consumers to get instant access to films and series.
Blockchain is bound to create new models in all fields that shape our lives, from finance to the infrastructures facilitating interactions and transactions in ways that would not be possible without the internet. This is one of the reasons it is often referred to as the next generation of the internet, or Web3: the original WWW revolutionized information, Web2 facilitated interactions, and now Web3 has the potential to innovate agreement and value-exchange structures on the internet in a decentralized manner.
From a technological perspective, the similarities between the two are impressive.
We are widely believed to be in the equivalent of the internet's early stages, facing similar challenges around scalability, costs, and education that limit the development of breakthrough applications and mass adoption. The internet's challenges were resolved over time, so we should expect a similar progression for blockchain.

https://preview.redd.it/bvdjgzt9jg451.jpg?width=577&format=pjpg&auto=webp&s=a1e9b81b2d5bcbe629b7f414c4a19bafab9373d0
In 1996, the major internet service provider AOL could not manage the high volume of internet users and went down for nineteen hours. Gradually, the average internet speed in the US went from 50 Kbps in 1999 to 18.7 Mbps in 2017. Similarly, in 2017 the Ethereum blockchain failed to sustain the spike of on-chain transactions caused by the famous blockchain game CryptoKitties, and the platform suffered its most serious network clog to date. Such failures are part of a technology's development. Just as it was clear in 1996 that the internet faced a scalability problem, it has been evident for years that blockchain also has to find ways to deal with the same issue. Developers, programmers, experts, and academics are all working on improving the system, but there won't be a single definitive solution, just as there wasn't for the internet.
As popular Bitcoin expert and educator Andreas Antonopoulos put it in his book The Internet of Money, “Scale is not a goal to achieve; it is a definition of what you can do with the network today.” The internet scaled by building layers on top of the basic protocol, and this looks like the likely progression for blockchain too. The Lightning Network, a layer-2 (off-chain) protocol, is paving the way for fast, small Bitcoin payments, along with a major focus on providing full transaction privacy. Many believe that, in this respect, we are still at the internet's 1994 stage, when protocols such as TCP/IP, HTML, and FTP were in place but the successful business models represented by Facebook, Airbnb and Uber were still years away. In blockchain, breakthrough dapps have yet to appear and will emerge in the coming years.
User interface will help drive adoption in the same way it helped the internet. At the time of ARPANET, the technical foundation of the internet, the system was difficult to use for nontechnical and technical people alike. Finding anything relied on IP addresses, and navigating the internet meant typing a long string of numbers to reach what you were looking for. It was easy to get confused, mistype numbers, and so on. When IP addresses gave way to URLs, it became easier to navigate the web thanks to a more efficient and user-friendly experience overall.
Blockchain usage is still clunky. We still need a 12- or 24-word phrase to access a private cryptocurrency wallet, and we send transactions to a long string of characters (just as with the early internet addresses). Much of this friction will disappear once user interfaces are given the right attention, and mass adoption will likely benefit from an easier-to-use system. So far, blockchain development has clearly focused on making the technology more secure, reliable, and robust at the expense of the user experience. Once the strength of the network is secured, interest will shift toward developing the interface and a more user-friendly experience. Maybe that is the right order of evolution, and one that will produce a stronger technology structure overall.
Adoption is another important parallel we can highlight, not only with the internet but with all the disruptive technologies that preceded the World Wide Web. It took 46 years for electricity, 35 years for the telephone, 14 years for TV, and 7 years for the Web to reach 25% of global market penetration. We can expect a similar growth trajectory in the cryptocurrency/blockchain space, perhaps at a faster rate, since the world is more connected now thanks to existing internet infrastructure.
Stay tuned: in part two of this blog we will explore the parallels between the two technologies in education, capital, and start-ups, and the decentralization of the blockchain as the main aspect that will revolutionize the Internet as well.
In the meantime, blockchain as a distributed ledger for a secure network of transactions is finding wide adoption, as in the case of academic and ID credentials embraced by the University of Nicosia and Block.co. They can provide the technical expertise to follow the whole process, from creation to publication on the blockchain, where the document is safely stored for life and can be independently verified by any third party. They were the first to do this globally, as early as 2014.
For more info, contact Block.co directly or email [[email protected]](mailto:[email protected]).
Tel +357 70007828
Get the latest from Block.co, like and follow us on social media:
✔️Facebook
✔️LinkedIn
✔️Twitter
✔️YouTube
✔️Medium
✔️Instagram
✔️Telegram
✔️Reddit
✔️GitHub
submitted by BlockDotCo to u/BlockDotCo [link] [comments]

Looking for frontend developers and artists for building CryptoPandas during the SLP hackathon!

We’re looking for experts to help build CryptoPandas during the SLP hackathon.
CryptoPandas is basically an adaptation of CryptoKitties to the Bitcoin Cash blockchain, but with lower fees and an (arguably) cuter species.
CryptoPandas uses SLP NFT1 tokens to represent a panda's genome. New pandas can be birthed by anyone who has a male and a female panda. Only panda tokens that follow the birthing specification will be considered valid, similar to how SLP transactions have to follow the SLP rules: invalid token transactions may still exist on the blockchain, they are simply not treated as valid tokens.
For the user, the process looks as follows:
For the implementation during the hackathon, the following has to be implemented:
This would require the following experts:
- A Script/CashScript developer (that could be me).
- An SLP-savvy backend developer (could be me too).
- A general backend developer who sets up the APIs for the frontend and does the panda validation.
- An SLP-savvy frontend developer (assuming WebAssembly is an option, that would be pretty much the same as the above) for performing the birthing process.
- A panda-savvy artist who’s able to create different layers for different traits of pandas (definitely not me).
- A general frontend developer who’s good at creating a good UX (absolutely not me).
If any of the above seems attractive to you, please write to me on Telegram: @tobiassan
The technical process looks as follows (review SLP NFT1 specification for SLP details):
While this scheme is quite complex, it is fully non-interactive, i.e. the only thing the operators of CryptoPandas have to do is maintain a pool of anyone-can-spend parent tokens. Verification has the same properties as the Simple Ledger Protocol: to verify a panda token, a DAG check over its transaction ancestry has to be performed.
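The DAG check mentioned above can be sketched roughly like this. This is a toy model, not the actual SLP NFT1 validation rules: the transaction names and the `follows_birthing_spec` flag are invented for illustration, but the shape of the check — walk the token's ancestry back to the trusted mint and reject the whole lineage if any hop is invalid — matches the idea described.

```python
# Toy sketch of a DAG-style validity check for a panda token:
# a token is valid only if every transaction in its ancestry back to
# the genesis mint follows the (assumed) birthing rules.

def is_valid_panda(txid, txs, genesis):
    """Walk the token's transaction ancestry; every hop must be well-formed."""
    seen = set()
    stack = [txid]
    while stack:
        cur = stack.pop()
        if cur in seen:
            continue
        seen.add(cur)
        tx = txs.get(cur)
        if tx is None:
            return False          # unknown ancestor: validity cannot be proven
        if cur == genesis:
            continue              # reached the trusted mint, stop descending
        if not tx.get("follows_birthing_spec", False):
            return False          # one invalid hop poisons the whole lineage
        stack.extend(tx["parents"])
    return genesis in seen        # ancestry must actually reach the mint

# Hypothetical transaction graph: a cub born from two valid parents,
# plus a token whose birthing transaction broke the rules.
txs = {
    "mint": {"parents": []},
    "dad":  {"parents": ["mint"], "follows_birthing_spec": True},
    "mom":  {"parents": ["mint"], "follows_birthing_spec": True},
    "cub":  {"parents": ["dad", "mom"], "follows_birthing_spec": True},
    "fake": {"parents": ["dad"], "follows_birthing_spec": False},
}
print(is_valid_panda("cub", txs, "mint"))   # True
print(is_valid_panda("fake", txs, "mint"))  # False
```

Note that, as in SLP itself, the check is purely local: anyone holding the transaction history can run it without trusting the CryptoPandas operators.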
submitted by eyeofpython to btc [link] [comments]

QUANTOCOIN: THE FIRST GLOBAL ALTERNATIVE ASSET WITH IMMEDIATE LIQUIDITY AND CURRENCY FUNCTIONALITY.

QUANTOCOIN: THE FIRST GLOBAL ALTERNATIVE ASSET WITH IMMEDIATE LIQUIDITY AND CURRENCY FUNCTIONALITY.
https://preview.redd.it/ua9ro1b3nna21.jpg?width=790&format=pjpg&auto=webp&s=5c98eff0eda981fce190eedea64b5df940d7c1fb

INTRODUCING QUANTOCOIN

Blockchain enables new ways of exchanging value securely while ensuring reliable transactions. It enables people to meter excess capacity and facilitates non-traditional ways of generating income. Despite all the benefits, there are some challenges that the technology needs to address; I will discuss a few of them:
Because of all the hype surrounding Bitcoin and other digital currencies, blockchain has started to look to some like a pyramid scheme.
The technology has yet to mature and is susceptible to capacity problems, system failures, unanticipated bugs and, perhaps most damaging, the huge disappointment of technically unsophisticated users.
The bitcoin blockchain lacks transactional capacity. Some suggest mitigating this with other consensus algorithms. Another way is to use a sidechain: a separate blockchain pegged to a larger one like bitcoin, which handles transactions while leveraging the parent blockchain's infrastructure.
Transactions on the blockchain are immutable, which creates a system that is a bit sociopathic. Immutability is a double-edged sword.
Much work needs to be done on basic interface and experience. A bitcoin address is an alphanumeric code. You don't type an IP address to access a website; why, then, would you type an alphanumeric code to access a bitcoin wallet?
For the reasons stated above, the QUANTOCOIN platform was created to reshape the blockchain ecosystem and the digital economy at large where financial banking activities are concerned.
https://preview.redd.it/1csvnpf9nna21.jpg?width=1170&format=pjpg&auto=webp&s=22e2bef5ac43a8d1ed5d290269930d0f9f08375b

Here is the vision statement of this project: a cryptocurrency must be accepted by the masses to develop its revolutionary potential. QUANTOCOIN aims to solve problems associated with the old-fashioned banking system, in which the present financial system is supported by large-scale financial institutions. QUANTOCOIN is a platform for the future of funding, built on top of the Waves blockchain. It accelerates financial inclusion for unbanked people by offering tools and services that save both time and resources.

HOW QUANTOCOIN WORKS

QUANTOCOIN has developed a next-generation model for the future of financial services and digital banking. This model uses blockchain technology along with smartphones, as well as a new kind of bio-identification system to ensure a safe and secure account for each user. This is a project to develop the future of the crypto-financial world. Its primary goal is to integrate and connect QTC into the traditional financial world and to create a single gateway through the QTC platform for users, traders, investors and financial institutions, with a whole range of add-on services.
The QTC bank, with the QTC banking platform as the final stage of the QTC project, will bring a solution by bringing cryptocurrencies to mainstream everyday users.
Watch the short explainer video below, which dives into the problem the team is solving with the QUANTOCOIN platform, focusing on how it is being done and why you should join what the team is building for the betterment of the ecosystem at large. Enjoy watching!
The QTC mobile app will be made available for download via the iOS and Android marketplaces, and customers will finally be able to use cryptocurrencies to pay for any fiat-based goods or services instantly and easily on any POS terminal with NFC technology, a QR code, or through our qtcBeacons.

THE UNIQUE FEATURES OF QUANTOCOIN

Below are some special features of QUANTOCOIN that distinguish this project from others:
  • qtcBEACON: A BLE beacon is a small device, usually powered by battery or USB, which emits a Bluetooth Low Energy signal. A modern smartphone in the vicinity can pick up the signal emitted by the beacon to deliver quick, safe and easy mobile-based proximity payment solutions. Beacon-based payment apps are also phone-agnostic, making them accessible to practically any smartphone user.
  • QR CODE: Quick-Response codes have the ubiquity that NFC lacks. They work a bit like old-fashioned barcodes, but being digital gives them the power to enable mobile payments.
  • NFC: Near Field Communication is, at a very basic level, simply contactless communication between two devices. In payment processing, it is the technology that enables your phone to communicate with the payment terminal (POS) to initiate a transaction.
  • Lending Protocol (first instant cash application): The QTC Lending Protocol is the next step in the development of the QTC platform, providing immediate cash backed by a cryptocurrency pool. Contributors can now use the QTC mobile app to hold their crypto assets without the need to sell them.
End-users can leverage all their crypto assets (QTC, BTC, ETH, etc.) to instantly obtain fiat cash, which can be spent through a payment card or the contactless QTC mobile app. During the lending period, the contributor's asset portfolio is used as collateral, dynamically and instantly managed, giving the opportunity for profit and long-term growth of the portfolio.


TOKEN INFORMATION

Token Name: Quantocoin
Token Symbol: QTCt
Blockchain Platform: Waves
Initial Price of 1 QTCt Token: from USD 1.2
Duration of DTO: 10 x 28 days periods (10 Months)
Amount token per one period: 7,000,000 QTCt
Available no. of tokens: 70,000,000 QTCt
Payment Gateway: Bitcoin (BTC), Ethereum (ETH), fiat payments, VISA/Master
Settlement Period: 2(3) days after DTO period ends.
DTO duration: 1st July 2018 (GMT 00:00) - 30th April 2019 (GMT 23:59)
Note that all tokens will be sold during the DTO.
https://preview.redd.it/2mguzdjdnna21.jpg?width=810&format=pjpg&auto=webp&s=43b578dd497094b95150a7e8b787e4c06cbc72ac

ROADMAP

By definition, a roadmap is a plan or strategy intended to achieve a particular goal. That is to say, the QUANTOCOIN roadmap is the step-by-step means by which the mission of the project is to be fully achieved.
Below is a pictorial representation of the QUANTOCOIN Roadmap:
https://preview.redd.it/ght7ah6gnna21.jpg?width=1059&format=pjpg&auto=webp&s=7ce9e2f25e8bd36677a34d6fb8ccb1e82a33bc6a
https://preview.redd.it/jw64ak6gnna21.jpg?width=1073&format=pjpg&auto=webp&s=de95ff5914b44a2c4b6ae45f17f921608f8c331f

CORE TEAM OF QUANTOCOIN

Behind every good project there must be a solid team, always brainstorming and working on how to achieve the project's aims. The core team of QUANTOCOIN consists of experts with extensive experience in IT, blockchain technology, marketing and business models. The team members are also veterans of the traditional finance industry, with more than 25 years of experience in online trading, FX and private banking. The idea of QTC originated from long-term planning and several years of practice.
Below are the brains that make up the QUANTOCOIN team:
https://preview.redd.it/ru70fjilnna21.jpg?width=1040&format=pjpg&auto=webp&s=282613b2edb085d7079e95178c093f1c6cbbfcc3

https://preview.redd.it/kg1zbximnna21.jpg?width=1024&format=pjpg&auto=webp&s=badb30a4a861f2a52a2ce94b92ff1b24d672c763
In conclusion, the idea behind the QUANTOCOIN project is a lucrative and sophisticated one. Rest assured, QUANTOCOIN aims to cater for the blockchain enthusiast who needs an easy-to-navigate platform that can deliver reasonable returns from banking activities. At the same time, QUANTOCOIN will provide powered-up, cutting-edge features for professional crypto users who require the full range of earning, decision-making and fundamentals information at their fingertips.
It would be a good financial decision to take a look at this promising project and get on board by registering an account on the official website and purchasing tokens while they are still cheap (the token is still at its initial offering).

Get connected anytime with the Project using the links below for more information, updates and participation:
WEBSITE: https://www.quantocoin.io/
WHITEPAPER: https://quantocoin.io/wp-content/uploads/2018/07/WP_QTC_2562018-TESTING2-1.pdf
ANN THREAD: https://bitcointalk.org/index.php?topic=2897398
TWITTER: https://twitter.com/Quantocoin
FACEBOOK: https://www.facebook.com/qtcdto/
INSTAGRAM: https://www.instagram.com/quantocoin/?hl=en
LINKEDIN: https://www.linkedin.com/company/quantocoin/
REDDIT: https://www.reddit.com/useQuantocoin/
GITHUB: https://github.com/Quantocoin
YOUTUBE: https://www.youtube.com/channel/UC9fCUh5XSXBsVTDJ9Bsdt1w

https://preview.redd.it/svhk39lunna21.jpg?width=869&format=pjpg&auto=webp&s=7bfeebb5928678f7397c3e346ffb4c326e2ff7a1

WRITER'S DETAILS
BitcoinTalk Username: cryptoblezin
BitcoinTalk Profile URL: https://bitcointalk.org/index.php?action=profile;u=2178561;sa=summary
ETH Address: 0xC89b8Dd7e3E137DB108575EeAe301E52b6C72d9F
submitted by blessingsdrop to ICOAnalysis [link] [comments]

A proposal for a decentralized social network layer capable of storing rich media

Hello folks!
I have been thinking about the idea of a decentralised social network for quite some time, and recently the ideas formed what I think is a rather complete picture. In light of the recent Yours announcements, I think it's the proper time to share these ideas with the community.
This turned into a long post, and there is no guarantee these ideas have any contact with reality, so forgive me if I steal a few minutes of your time.
Protocols and standards that will help in understanding the proposal (besides blockchain): Memo, a blockchain-based social network protocol; WebRTC, a transport protocol; WebTorrent, a JavaScript BitTorrent protocol implementation; BitDB, a blockchain crawler by u/unwriter; and Progressive Web Apps, cross-platform mobile and desktop apps installable without gatekeepers.
Overall, I prefer WebTorrent over IPFS in my proposal, as the BitTorrent protocol has proved robust for almost two decades, while IPFS is at this moment very young and overhyped.
What I propose is a layer that can exist on top of any Memo-like protocol, where Memo forms a base social network state, and the media layer extends its capabilities so that it's possible to store rich media files without any centralised hostings.
Here are the hypotheses/axioms I use as a basis for such a media layer.
The proposed idea is based on the interplay of three actors, a triangle of 'Original Posters', 'Moderators' and 'Viewers'. Below is a detailed explanation of each role; some sub-roles will be discussed alongside.
The Original Poster is anyone connected to the internet who is willing to share any kind of content with the world, with only a modern browser and the content itself in possession.
The Moderator is anyone in the connected world who is willing to take on a socially important role, with only a desktop computer with a decent amount of free disk space in possession. There is no need to ask permission to become a moderator.
The Viewer is anyone willing to enjoy the media without needing to engage with existing social media platforms.
Base technologies:
  1. A webtorrent enabled website with support for basic BCH wallet functionality.
  2. A webtorrent enabled website with a feed of OP_RETURN messages. Note: 1 and 2 can be implemented as a single platform (e.g. instant.io x datacash x chainfeed).
  3. A WebRTC enabled cross-platform desktop torrent client hybridised with a BitDB instance.
  4. One or more webtorrent enabled torrent trackers.
The flow:
The Original Poster uses a web browser to create a torrent of the attached media. The OP registers the torrent on a tracker, puts the infohash alongside a tracker URL and desired hashtags into OP_RETURN, and publishes the Memo-formatted transaction to the bitcoin network. A progress bar shows the status of the 'pseudo' upload in a way that's familiar to most non-tech-savvy people. During that phase the content is in the network's 'working memory'.
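As a rough illustration of this publishing step, here is how the infohash, tracker URL, and hashtags might be packed into an OP_RETURN-sized payload. The two-byte prefix and the field layout are invented for this sketch; a real deployment would follow the Memo protocol's actual action prefixes and size limits.

```python
# Illustrative sketch: pack a torrent infohash, tracker URL, and hashtags
# into a single OP_RETURN-style payload. Prefix and layout are assumptions.

def build_payload(infohash_hex, tracker, hashtags, prefix=b"\x6d\xff"):
    body = bytes.fromhex(infohash_hex) + b"|" + tracker.encode()
    for tag in hashtags:
        body += b"|#" + tag.encode()
    payload = prefix + body
    if len(payload) > 220:        # roughly the standard OP_RETURN relay limit
        raise ValueError("payload too large for a single OP_RETURN")
    return payload

p = build_payload(
    "aa" * 20,                    # a 20-byte BitTorrent infohash, hex-encoded
    "wss://tracker.example.org",  # hypothetical WebTorrent tracker URL
    ["pandas", "art"],
)
print(len(p), p[:2].hex())        # payload size and the assumed prefix
```

The payload would then be placed in an OP_RETURN output of an otherwise ordinary transaction; moderators and viewers recover the infohash by parsing the feed.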
The Moderator uses software to parse the OP_RETURN feed. The software continuously downloads all the media from initial seeders and presents it to the moderator one by one. It does not start seeding until the moderator decides whether the content is worth bothering with. It's a completely subjective decision, and every moderator can follow a personal strategy. It can be imagined as clicking green and red buttons, where the red one is clicked if the content is subjectively complete garbage. Once the green button is clicked, the moderator becomes a seeder of the content. The moderator can also 'reply' to the OP's message with hashtags: every hashtag that matches one of the initial hashtags gives it additional weight, and every omitted hashtag loses weight. New hashtags can also be introduced by a moderator. The deeper the history of a moderator's categorisation activity, the more weight a categorisation transaction gives to hashtags (but this is a higher-level concept and can vary from implementation to implementation). The moderator maintains an internal queue of stored media and deletes the oldest content as soon as the storage threshold is hit (though other policies can be implemented if the moderator so decides). Described above is a level-00 moderator, who judges completely unclassified content received directly from initial seeders.
If the collective speed of content approval is lower than the speed at which new content is introduced, the OP is notified that it may be necessary to wait a prolonged time for the content to be uploaded, or a fee can be included towards a 'super-moderator' address so that moderators operating under a single swarm will prioritise that content. That address can be a multisig where each moderator is party to a joint account. Once in a while they unlock the funds and distribute them according to each moderator's contribution, based on the number of 'categorisation transactions' (replies with hashtags). There can be additional rules that prevent cheating, such as counting only one categorisation transaction per OP post, or rules with some degree of centralisation that encourage seeding, such as: the more a moderator seeds, the more he earns from these fees if the swarm operates under a single tracker. Alternatively, payouts can be implemented as simply and centrally as in existing mining pools.
There are Moderator sub-roles: a moderator can choose to parse only content that has already been categorised to some degree (e.g. only NSFW content, or only non-NSFW content). The deeper the categorisation, the more precise the kind of content the moderator fetches, to the point where the moderator can actually enjoy the process, approving the kind of content he is most interested in, akin to browsing a chronologically filtered subreddit feed. A moderator can also choose to parse several 'categories' simply by 'subscribing' to several hashtags or hashtag tuples. The sub-roles can be named moderator level01, level10, level11, and so on. By replying to lower-level moderators' categorisation transactions, higher-level moderators give or remove hashtag weight.
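The hashtag-weighting idea above might be sketched as follows. The logarithmic weight curve and the confirm/omit scoring rules are assumptions of this sketch; the post deliberately leaves them implementation-defined.

```python
# Minimal sketch of hashtag weighting: each categorisation reply confirms
# or omits hashtags, and a moderator with a longer categorisation history
# contributes more weight per reply (assumed logarithmic curve).
import math

def apply_categorisation(weights, confirmed, moderator_history_len):
    """Update hashtag weights in place from one categorisation reply."""
    w = 1.0 + math.log1p(moderator_history_len)    # assumed: deeper history, more weight
    for tag in set(weights) | set(confirmed):
        if tag in confirmed:
            weights[tag] = weights.get(tag, 0.0) + w   # confirmed tag gains weight
        else:
            weights[tag] = weights.get(tag, 0.0) - w   # omitted tag loses weight
    return weights

weights = {"pandas": 1.0, "nsfw": 1.0}
apply_categorisation(weights, ["pandas", "art"], moderator_history_len=100)
print(sorted(weights.items()))
```

Under these assumed rules an omitted tag can go negative, which a feed provider might treat as a signal to hide the tag entirely.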
The Viewer is presented with a feed of OP_RETURN media posts (similar to chainfeed.org), and the content is fetched on the fly from the WebTorrent network. The moment the content is fetched, the viewer becomes a seeder and continues seeding for as long as the content is cached in the browser's storage. That way, the more moderators have approved the content and the more followers the OP has, the longer the content will persist in the network's 'short-term' memory.
The Viewer role has sub-roles as well. Once a user is engaged in this kind of social network, he can become a Loyal User by installing special software on a desktop computer. It is very similar to the Moderator's software but differs in the following way: the viewer inputs a Memo account identifier (which is a bitcoin address), and the software fetches and seeds, completely in the background, only the content that was liked by that user. As the whole network state is public information, each user can increase his level of loyalty by specifying the maximum 'dimension' of the content being fetched and seeded, where 1D is the content liked by the initial viewer, 2D is the content liked by the initial viewer and by accounts the initial viewer follows, and so on, up to around 6D, where mostly anything that was liked is stored within the individual's storage threshold. Loyal Viewers can adopt different policies to restrict the content being fetched and seeded: blacklisting or prioritising certain hashtags, adopting third-party priority/black lists, or specifying a storage threshold. Content stored by Loyal Users can be imagined as persisting in the network's 'long-term' memory. The more Loyal Users are engaged in the network, and the more likes certain content has, the longer it will be stored.
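The 'dimension' expansion can be sketched as a bounded breadth-first walk of the follow graph, collecting every liked post along the way. The graph shape and names below are illustrative, not real Memo data.

```python
# Sketch of the Loyal User's ND fetch: expand the follow graph
# breadth-first up to `max_dim` hops, collecting all liked posts.
from collections import deque

def content_to_seed(start, follows, likes, max_dim):
    """Return the set of liked posts within `max_dim` dimensions of `start`."""
    seen, frontier = {start}, deque([(start, 1)])
    posts = set(likes.get(start, []))          # 1D: the user's own likes
    while frontier:
        account, dim = frontier.popleft()
        if dim >= max_dim:
            continue                           # don't expand past the chosen dimension
        for followee in follows.get(account, []):
            if followee not in seen:
                seen.add(followee)
                posts |= set(likes.get(followee, []))
                frontier.append((followee, dim + 1))
    return posts

# Hypothetical graph: alice follows bob, bob follows carol.
follows = {"alice": ["bob"], "bob": ["carol"]}
likes = {"alice": ["p1"], "bob": ["p2"], "carol": ["p3"]}
print(sorted(content_to_seed("alice", follows, likes, max_dim=2)))  # ['p1', 'p2']
```

Raising `max_dim` to 3 would pull in carol's likes as well, which is exactly why the post caps the useful dimension at around 6: beyond that the walk covers most of the network.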
It's worth noting that centralised torrent trackers are not points of failure per se as they are mostly used to pass the content from initial [browser] seeders to moderators. As soon as the content is approved by at least one moderator it can be listed on different trackers operated by different entities, and there can be a rotation of trackers if necessary. That said, each moderator can always re-register all of the content in possession on a new tracker, and the tracker can be adopted by web op_return feed providers. Moreover, the ongoing evolution of browser standards related to web-workers will make in-browser dht lookup a reality in a 2-3 years, which is likely a reasonable window to bootstrap such a network. OP can use some trackers only known among neighbours in particular area.
The layer is vulnerable to a situation where trackers blacklist certain content; such content can still be accessed by using a different OP_RETURN feed provider with different trackers, or a native app that can fetch content seeders from the DHT. Networks such as I2P can be used to create deep media layers operated anonymously. Also, as Tor is adopted by mainstream browsers (e.g. Brave), Viewers can access trackers through Tor, and such trackers are more resilient. These viewers will be unable to seed, however.
The layer is capable of storing any kind of content, but during the bootstrap phase it will be most suitable for images, short video/audio messages, and markdown-formatted blog posts with embedded media. Each Moderator / Loyal Viewer can adopt different policies on the size of the content being fetched and stored, according to their investment in storage facilities. If the proposed idea works, there will be parties willing to store heavyweight content such as movies. If the layer is accessed from within a native app, it is even capable of livestreams, where the more users are watching a stream, the more bandwidth there is for others to join, completely without any centralised content distribution networks.
As outlined above, the layer consists of a short-term memory layer capable of storing content for minutes to days, and a long-term memory layer capable of storing content for months and probably years. I use biological metaphors here instead of computer science ones because, in my opinion, the behaviour of this media layer resembles human memory more than computer memory: ultimately it is a collective human brain that decides what to remember and for how long. There is no guarantee that something will be stored at all, while some kinds of content that are collectively perceived as valuable can be stored for a prolonged period of time.
A few words on monetisation. Some heavily engaged players can choose to archive old content and provide access in exchange for micropayments. I think the Joystream protocol could be used here with small changes, such as adopting the WebRTC transport protocol. Other monetisation strategies can be discussed later, as microtransaction technologies become more mature and better understood.
I am willing to form a workgroup of developers and creative enthusiasts who find the described idea interesting. I have been thinking about a possible starting point, so I have acquired the BlockPress source code with the intention of releasing it as open source. We postponed the announcement a bit, as the process of an open-source release always takes time. BlockPress is an alternative Memo protocol implementation with a rather slick UI that's familiar to non-tech-savvy users, a quality I find extremely important. I think this can be a good starting point. If you think so as well, feel free to drop me a Telegram message @taowanzou or a [proton mail](mailto:[email protected]). Follow me on Memo as well!
Sorry for any possible mistakes, as English is not my first language. And thanks for your time reading this!
submitted by taowanzou to btc [link] [comments]

Merkle Trees and Mountain Ranges - Making UTXO Set Growth Irrelevant With Low-Latency Delayed TXO Commitments

Original link: https://lists.linuxfoundation.org/pipermail/bitcoin-dev/2016-May/012715.html
Unedited text and originally written by:

Peter Todd pete at petertodd.org
Tue May 17 13:23:11 UTC 2016
Previous message: [bitcoin-dev] Bip44 extension for P2SH/P2WSH/...
Next message: [bitcoin-dev] Making UTXO Set Growth Irrelevant With Low-Latency Delayed TXO Commitments
Messages sorted by: [ date ] [ thread ] [ subject ] [ author ]
# Motivation

UTXO growth is a serious concern for Bitcoin's long-term decentralization. To
run a competitive mining operation potentially the entire UTXO set must be in
RAM to achieve competitive latency; your larger, more centralized, competitors
will have the UTXO set in RAM. Mining is a zero-sum game, so the extra latency
of not doing so if they do directly impacts your profit margin. Secondly,
having possession of the UTXO set is one of the minimum requirements to run a
full node; the larger the set the harder it is to run a full node.

Currently the maximum size of the UTXO set is unbounded as there is no
consensus rule that limits growth, other than the block-size limit itself; as
of writing the UTXO set is 1.3GB in the on-disk, compressed serialization,
which expands to significantly more in memory. UTXO growth is driven by a
number of factors, including the fact that there is little incentive to merge
inputs, lost coins, dust outputs that can't be economically spent, and
non-btc-value-transfer "blockchain" use-cases such as anti-replay oracles and
timestamping.

We don't have good tools to combat UTXO growth. Segregated Witness proposes to
give witness space a 75% discount, in part to make reducing the UTXO set size
by spending txouts cheaper. While this may change wallets to more often spend
dust, it's hard to imagine an incentive sufficiently strong to discourage most,
let alone all, UTXO growing behavior.

For example, timestamping applications often create unspendable outputs due to
ease of implementation, and because doing so is an easy way to make sure that
the data required to reconstruct the timestamp proof won't get lost - all
Bitcoin full nodes are forced to keep a copy of it. Similarly anti-replay
use-cases like using the UTXO set for key rotation piggyback on the uniquely
strong security and decentralization guarantee that Bitcoin provides; it's very
difficult - perhaps impossible - to provide these applications with
alternatives that are equally secure. These non-btc-value-transfer use-cases
can often afford to pay far higher fees per UTXO created than competing
btc-value-transfer use-cases; many users could afford to spend $50 to register
a new PGP key, yet would rather not spend $50 in fees to create a standard two
output transaction. Effective techniques to resist miner censorship exist, so,
without resorting to whitelists, blocking non-btc-value-transfer use-cases as
"spam" is not a long-term, incentive-compatible solution.

A hard upper limit on UTXO set size could create a more level playing field in
the form of fixed minimum requirements to run a performant Bitcoin node, and
make the issue of UTXO "spam" less important. However, making any coins
unspendable, regardless of age or value, is a politically untenable economic
change.


# TXO Commitments

With a merkle tree committing to the state of all transaction outputs, both
spent and unspent, we can provide a method of compactly proving the current
state of an output. This lets us "archive" less frequently accessed parts of
the UTXO set, allowing full nodes to discard the associated data while still
providing a mechanism to spend those archived outputs by proving to those nodes
that the outputs are in fact unspent.

Specifically, TXO commitments propose a Merkle Mountain Range¹ (MMR), a
type of deterministic, indexable, insertion-ordered merkle tree, which allows
new items to be cheaply appended to the tree with minimal storage requirements,
just log2(n) "mountain tips". Once an output is added to the TXO MMR it is
never removed; if an output is spent its status is updated in place. Both the
state of a specific item in the MMR, as well as the validity of changes to items
in the MMR, can be proven with log2(n)-sized proofs consisting of a merkle path
to the tip of the tree.
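The append behavior described above - merging equal-height "mountains" like
carries in binary addition, keeping only log2(n) tips - can be sketched in
Python. This is a toy illustration, not the proposal's actual data structure;
the class and hashing scheme are our own (a real implementation would tag leaf
and inner nodes to prevent ambiguity):

```python
import hashlib

def h(*parts: bytes) -> bytes:
    """Hash helper; a real implementation would domain-tag leaf/inner nodes."""
    return hashlib.sha256(b"".join(parts)).digest()

class MMR:
    """Toy Merkle Mountain Range: append-only, stores only the peaks."""
    def __init__(self):
        self.peaks = []   # list of (height, digest), strictly decreasing height
        self.size = 0

    def append(self, leaf: bytes):
        node = (0, h(leaf))
        # Merge equal-height peaks, like binary addition with carries.
        while self.peaks and self.peaks[-1][0] == node[0]:
            height, left = self.peaks.pop()
            node = (height + 1, h(left, node[1]))
        self.peaks.append(node)
        self.size += 1

    def root(self) -> bytes:
        # "Bag" the peaks right-to-left into a single commitment digest.
        acc = self.peaks[-1][1]
        for _, digest in reversed(self.peaks[:-1]):
            acc = h(digest, acc)
        return acc

mmr = MMR()
for txout in [b"a", b"b", b"c", b"d", b"e"]:
    mmr.append(txout)
# 5 leaves (binary 101) -> peaks of heights 2 and 0: log2(n) storage.
assert [hgt for hgt, _ in mmr.peaks] == [2, 0]
```

Note how the storage is just the peak digests; full proofs are merkle paths of
the same log2(n) size, as the text says.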

At an extreme, with TXO commitments we could even have no UTXO set at all,
entirely eliminating the UTXO growth problem. Transactions would simply be
accompanied by TXO commitment proofs showing that the outputs they wanted to
spend were still unspent; nodes could update the state of the TXO MMR purely
from TXO commitment proofs. However, the log2(n) bandwidth overhead per txin is
substantial, so a more realistic implementation is to have a UTXO cache for
recent transactions, with TXO commitments acting as an alternative for the (rare)
event that an old txout needs to be spent.

Proofs can be generated and added to transactions without the involvement of
the signers, even after the fact; there's no need for the proof itself to be
signed, and the proof is not part of the transaction hash. Anyone with access to
TXO MMR data can (re)generate missing proofs, so minimal, if any, changes are
required to wallet software to make use of TXO commitments.


## Delayed Commitments

TXO commitments aren't a new idea - the author proposed them years ago in
response to UTXO commitments. However it's critical for small miners' orphan
rates that block validation be fast, and so far it has proven difficult to
create (U)TXO implementations with acceptable performance; updating and
recalculating cryptographically hashed, merkleized datasets is inherently more
work than not doing so. Fortunately if we maintain a UTXO set for recent
outputs, TXO commitments are only needed when spending old, archived, outputs.
We can take advantage of this by delaying the commitment, allowing it to be
calculated well in advance of it actually being used, thus changing a
latency-critical task into a much easier average throughput problem.

Concretely, each block B_i commits to the TXO set state as of block B_{i-n}, in
other words what the TXO commitment would have been n blocks ago, if not for
the n block delay. Since that commitment only depends on the contents of the
blockchain up until block B_{i-n}, the contents of any block after are
irrelevant to the calculation.


## Implementation

Our proposed high-performance/low-latency delayed commitment full-node
implementation needs to store the following data:

1) UTXO set

Low-latency K:V map of txouts definitely known to be unspent. Similar to the
existing UTXO implementation, but with the key difference that old,
unspent outputs may be pruned from the UTXO set.


2) STXO set

Low-latency set of transaction outputs known to have been spent by
transactions after the most recent TXO commitment, but created prior to the
TXO commitment.


3) TXO journal

FIFO of outputs that need to be marked as spent in the TXO MMR. Appends
must be low-latency; removals can be high-latency.


4) TXO MMR list

Prunable, ordered list of TXO MMR's, mainly the highest pending commitment,
backed by a reference counted, cryptographically hashed object store
indexed by digest (similar to how git repos work). High-latency ok. We'll
cover this in more detail later.
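The four stores above can be summarized as a Python sketch; the field names
are ours, chosen only to make the later pseudocode concrete:

```python
from dataclasses import dataclass, field

@dataclass
class NodeState:
    """The four stores described above (names are illustrative, not the BIP's)."""
    utxo: dict = field(default_factory=dict)       # 1) outpoint -> txout; prunable
    stxo: set = field(default_factory=set)         # 2) archived outpoints spent since last commitment
    journal: list = field(default_factory=list)    # 3) FIFO of pending TXO MMR updates
    mmr_store: dict = field(default_factory=dict)  # 4) digest -> MMR node, refcounted

node = NodeState()
node.utxo["txid:0"] = b"value-and-scriptPubKey"
node.journal.append(("spend", "txid:0"))
assert len(node.journal) == 1
```

Only `utxo` and `stxo` need fast random-access storage; `journal` appends are
low-latency while `mmr_store` tolerates high latency, matching the text.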


### Fast-Path: Verifying a Txout Spend In a Block

When a transaction output is spent by a transaction in a block we have two
cases:

1) Recently created output

Output created after the most recent TXO commitment, so it should be in the
UTXO set; the transaction spending it does not need a TXO commitment proof.
Remove the output from the UTXO set and append it to the TXO journal.

2) Archived output

Output created prior to the most recent TXO commitment, so there's no
guarantee it's in the UTXO set; the transaction will have a TXO commitment
proof for the most recent TXO commitment showing that the output was unspent.
Check that the output isn't already in the STXO set (double-spent), and if
not, add it. Append the output and TXO commitment proof to the TXO journal.

In both cases recording an output as spent requires no more than two key:value
updates, and one journal append. The existing UTXO set requires one key:value
update per spend, so we can expect new block validation latency to be within 2x
of the status quo even in the worst case of 100% archived output spends.
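The two cases above can be sketched as a single spend-verification routine.
This is a hedged illustration: the function, the `state` fields, and the
`verify_txo_proof` stub are our own stand-ins, not code from any actual node
implementation:

```python
from types import SimpleNamespace

def verify_txo_proof(proof, outpoint) -> bool:
    """Stand-in: a real node would check a merkle path against the
    most recent TXO commitment digest."""
    return proof is not None

def spend_output(state, outpoint, txo_proof=None) -> bool:
    """Fast-path spend check, mirroring the two cases above."""
    if outpoint in state.utxo:
        # Case 1: recently created output; no proof needed.
        del state.utxo[outpoint]
        state.journal.append(("spend", outpoint))
        return True
    # Case 2: archived output; a TXO commitment proof is mandatory.
    if not verify_txo_proof(txo_proof, outpoint):
        return False
    if outpoint in state.stxo:
        return False              # archived double-spend attempt
    state.stxo.add(outpoint)      # at most two K:V updates + one append
    state.journal.append(("spend", outpoint, txo_proof))
    return True

state = SimpleNamespace(utxo={"tx1:0": "txout"}, stxo=set(), journal=[])
assert spend_output(state, "tx1:0")               # recent output, no proof
assert not spend_output(state, "tx9:0")           # archived, missing proof
assert spend_output(state, "tx9:0", "proof")      # archived, proof supplied
assert not spend_output(state, "tx9:0", "proof")  # double-spend rejected
```

Note that the worst case (an archived spend) touches the STXO set and the
journal only, which is where the "within 2x of the status quo" latency bound
comes from.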


### Slow-Path: Calculating Pending TXO Commitments

In a low-priority background task we flush the TXO journal, recording the
outputs spent by each block in the TXO MMR, and hashing MMR data to obtain the
TXO commitment digest. Additionally this background task removes STXO's that
have been recorded in TXO commitments, and prunes TXO commitment data no longer
needed.

Throughput for the TXO commitment calculation will be worse than the existing
UTXO only scheme. This impacts bulk verification, e.g. initial block download.
That said, TXO commitments provide other possible tradeoffs that can mitigate
the impact of slower validation throughput, such as skipping validation of old
history, as well as fraud-proof approaches.


### TXO MMR Implementation Details

Each TXO MMR state is a modification of the previous one with most information
shared, so we can space-efficiently store a large number of TXO commitment
states, where each state is a small delta of the previous state, by sharing
unchanged data between states; cycles are impossible in merkleized data
structures, so simple reference counting is sufficient for garbage collection.
Data no longer needed can be pruned by dropping it from the database, and
unpruned by adding it again. Since everything is committed to via cryptographic
hash, we're guaranteed that regardless of where we get the data, after
unpruning we'll have the right data.
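The refcounted, content-addressed object store compared to a git repo above
might look like the following sketch (class and method names are ours):

```python
import hashlib

class HashStore:
    """Refcounted, content-addressed node store (git-object-db style)."""
    def __init__(self):
        self.objects = {}   # digest -> (data, refcount)

    def put(self, data: bytes) -> bytes:
        d = hashlib.sha256(data).digest()
        obj = self.objects.get(d)
        self.objects[d] = (data, (obj[1] if obj else 0) + 1)
        return d

    def prune(self, digest: bytes):
        data, refs = self.objects[digest]
        if refs > 1:
            self.objects[digest] = (data, refs - 1)
        else:
            # No cycles in merkleized data, so refcounting alone is safe.
            del self.objects[digest]

    def unprune(self, digest: bytes, data: bytes):
        # Any source will do: the digest proves the data is right.
        assert hashlib.sha256(data).digest() == digest
        self.put(data)

store = HashStore()
d = store.put(b"txo-node")
store.prune(d)                   # dropped from the database
store.unprune(d, b"txo-node")    # re-added; verified against its digest
assert d in store.objects
```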

Let's look at how the TXO MMR works in detail. Consider the following TXO MMR
with two txouts, which we'll call state #0:

0
/ \
a b

If we add another entry we get state #1:

1
/ \
0 \
/ \ \
a b c

Note how 100% of the state #0 data was reused in commitment #1. Let's
add two more entries to get state #2:

2
/ \
2 \
/ \ \
/ \ \
/ \ \
0 2 \
/ \ / \ \
a b c d e

This time part of state #1 wasn't reused - it wasn't a perfect binary
tree - but we've still got a lot of re-use.

Now suppose state #2 is committed into the blockchain by the most recent block.
Future transactions attempting to spend outputs created as of state #2 are
obliged to prove that they are unspent; essentially they're forced to provide
part of the state #2 MMR data. This lets us prune that data, discarding it,
leaving us with only the bare minimum data we need to append new txouts to the
TXO MMR, the tips of the perfect binary trees ("mountains") within the MMR:

2
/ \
2 \
\
\
\
\
\
e

Note that we're glossing over some nuance here about exactly what data needs to
be kept; depending on the details of the implementation the only data we need
for nodes "2" and "e" may be their hash digest.

Adding three more txouts results in state #3:

3
/ \
/ \
/ \
/ \
/ \
/ \
/ \
2 3
/ \
/ \
/ \
3 3
/ \ / \
e f g h

Suppose recently created txout f is spent. We have all the data required to
update the MMR, giving us state #4. It modifies two inner nodes and one leaf
node:

4
/ \
/ \
/ \
/ \
/ \
/ \
/ \
2 4
/ \
/ \
/ \
4 3
/ \ / \
e (f) g h

Spending an archived txout requires the transaction to provide the merkle
path to the most recently committed TXO state, in our case state #2. If txout b
is spent, the transaction must provide the following data from state #2:

2
/
2
/
/
/
0
\
b

We can add that data to our local knowledge of the TXO MMR, unpruning part of
it:

4
/ \
/ \
/ \
/ \
/ \
/ \
/ \
2 4
/ / \
/ / \
/ / \
0 4 3
\ / \ / \
b e (f) g h

Remember, we haven't _modified_ state #4 yet; we just have more data about it.
When we mark txout b as spent we get state #5:

5
/ \
/ \
/ \
/ \
/ \
/ \
/ \
5 4
/ / \
/ / \
/ / \
5 4 3
\ / \ / \
(b) e (f) g h

Secondly, by now state #3 has been committed into the chain, and transactions
that want to spend txouts created as of state #3 must provide a TXO proof
consisting of state #3 data. The leaf nodes for outputs g and h, and the inner
node above them, are part of state #3, so we prune them:

5
/ \
/ \
/ \
/ \
/ \
/ \
/ \
5 4
/ /
/ /
/ /
5 4
\ / \
(b) e (f)

Finally, let's put this all together, by spending txouts a, c, and g, and
creating three new txouts i, j, and k. State #3 was the most recently committed
state, so the transactions spending a and g are providing merkle paths up to
it. This includes part of the state #2 data:

3
/ \
/ \
/ \
/ \
/ \
/ \
/ \
2 3
/ \ \
/ \ \
/ \ \
0 2 3
/ / /
a c g

After unpruning we have the following data for state #5:

5
/ \
/ \
/ \
/ \
/ \
/ \
/ \
5 4
/ \ / \
/ \ / \
/ \ / \
5 2 4 3
/ \ / / \ /
a (b) c e (f) g

That's sufficient to mark the three outputs as spent and add the three new
txouts, resulting in state #6:

6
/ \
/ \
/ \
/ \
/ \
6 \
/ \ \
/ \ \
/ \ \
/ \ \
/ \ \
/ \ \
/ \ \
6 6 \
/ \ / \ \
/ \ / \ 6
/ \ / \ / \
6 6 4 6 6 \
/ \ / / \ / / \ \
(a) (b) (c) e (f) (g) i j k

Again, state #4 related data can be pruned. In addition, depending on how the
STXO set is implemented, we may also be able to prune data related to spent
txouts after that state, including inner nodes where all txouts under them have
been spent (more on pruning spent inner nodes later).


### Consensus and Pruning

It's important to note that pruning behavior is consensus critical: a full node
that is missing data due to pruning it too soon will fall out of consensus, and
a miner that fails to include a merkle proof that is required by the consensus
is creating an invalid block. At the same time many full nodes will have
significantly more data on hand than the bare minimum so they can help wallets
make transactions spending old coins; implementations should strongly consider
separating the data that is, and isn't, strictly required for consensus.

A reasonable approach for the low-level cryptography may be to actually treat
the two cases differently, with the TXO commitments committing to what data
does and does not need to be kept on hand by the UTXO expiration rules. On the
other hand, leaving that uncommitted allows for certain types of soft-forks
where the protocol is changed to require more data than it previously did.


### Consensus Critical Storage Overheads

Only the UTXO and STXO sets need to be kept on fast random access storage.
Since STXO set entries can only be created by spending a UTXO - and are smaller
than a UTXO entry - we can guarantee that the peak size of the UTXO and STXO
sets combined will always be less than the peak size of the UTXO set alone in
the existing UTXO-only scheme (though the combined size can be temporarily
higher than what the UTXO set size alone would be when large numbers of
archived txouts are spent).

TXO journal entries and unpruned entries in the TXO MMR have log2(n) maximum
overhead per entry: a unique merkle path to a TXO commitment (by "unique" we
mean that no other entry shares data with it). On a reasonably fast system the
TXO journal will be flushed quickly, converting it into TXO MMR data; the TXO
journal will never be more than a few blocks in size.

Transactions spending non-archived txouts are not required to provide any TXO
commitment data; we must have that data on hand in the form of one TXO MMR
entry per UTXO. Once spent however the TXO MMR leaf node associated with that
non-archived txout can be immediately pruned - it's no longer in the UTXO set
so any attempt to spend it will fail; the data is now immutable and we'll never
need it again. Inner nodes in the TXO MMR can also be pruned if all leaves under
them are fully spent; detecting this is easy if the TXO MMR is a merkle-sum
tree, with each inner node committing to the sum of the unspent txouts under it.
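The merkle-sum detection trick can be shown in a few lines; this is our own
minimal sketch (untagged hashing, illustrative names), not the proposal's
concrete encoding:

```python
import hashlib

def leaf(txout: bytes, unspent_value: int):
    """Leaf: (digest, unspent value); a spent txout carries value 0."""
    return (hashlib.sha256(txout).digest(), unspent_value)

def sum_node(left, right):
    """Merkle-sum inner node: the digest commits to the unspent total below."""
    total = left[1] + right[1]
    digest = hashlib.sha256(
        left[0] + right[0] + total.to_bytes(8, "big")
    ).digest()
    return (digest, total)

# Two spent txouts (value 0) under one inner node:
node = sum_node(leaf(b"f", 0), leaf(b"g", 0))
# A zero sum means every leaf below is spent, so the subtree is prunable.
assert node[1] == 0
```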

When an archived txout is spent the transaction is required to provide a merkle
path to the most recent TXO commitment. As shown above that path is sufficient
information to unprune the necessary nodes in the TXO MMR and apply the spend
immediately, reducing this case to the TXO journal size question (non-consensus
critical overhead is a different question, which we'll address in the next
section).

Taking all this into account the only significant storage overhead of our TXO
commitments scheme when compared to the status quo is the log2(n) merkle path
overhead; as long as less than 1/log2(n) of the UTXO set consists of active,
non-archived UTXOs, we've come out ahead, even in the unrealistic case where
all storage available is equally fast. In the real world that isn't yet the
case - even SSDs are significantly slower than RAM.


### Non-Consensus Critical Storage Overheads

Transactions spending archived txouts pose two challenges:

1) Obtaining up-to-date TXO commitment proofs

2) Updating those proofs as blocks are mined

The first challenge can be handled by specialized archival nodes, not unlike
how some nodes make transaction data available to wallets via bloom filters or
the Electrum protocol. There's a whole variety of options available, and the
data can be easily sharded to scale horizontally; the data is
self-validating, allowing horizontal scaling without trust.

While miners and relay nodes don't need to be concerned about the initial
commitment proof, updating that proof is another matter. If a node aggressively
prunes old versions of the TXO MMR as it calculates pending TXO commitments, it
won't have the data available to update the TXO commitment proof to be against
the next block, when that block is found; the child nodes of the TXO MMR tip
are guaranteed to have changed, yet aggressive pruning would have discarded that
data.

Relay nodes could ignore this problem if they simply accept the fact that
they'll only be able to fully relay the transaction once, when it is initially
broadcast, and won't be able to provide mempool functionality after the initial
relay. Modulo high-latency mixnets, this is probably acceptable; the author has
previously argued that relay nodes don't need a mempool² at all.

For a miner though not having the data necessary to update the proofs as blocks
are found means potentially losing out on transactions fees. So how much extra
data is necessary to make this a non-issue?

Since the TXO MMR is insertion ordered, spending a non-archived txout can only
invalidate the upper nodes of the archived txout's TXO MMR proof (if this
isn't clear, imagine a two-level scheme, with a per-block TXO MMRs, committed
by a master MMR for all blocks). The maximum number of relevant inner nodes
changed is log2(n) per block, so if there are n non-archival blocks between the
most recent TXO commitment and the pending TXO MMR tip, we have to store
log2(n)*n inner nodes - on the order of a few dozen MB even when n is a
(seemingly ridiculously high) year's worth of blocks.
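The "few dozen MB" figure is easy to check. A quick sketch of the arithmetic,
assuming 10-minute blocks and 32-byte digests per stored inner node:

```python
import math

BLOCKS_PER_YEAR = 144 * 365   # ~10-minute blocks
DIGEST_BYTES = 32             # one hash digest per stored inner node

n = BLOCKS_PER_YEAR
inner_nodes = n * math.log2(n)   # log2(n) changed inner nodes per block
megabytes = inner_nodes * DIGEST_BYTES / 1e6
# Roughly 26 MB: "a few dozen MB", matching the estimate in the text.
assert 20 < megabytes < 40
```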

Archived txout spends on the other hand can invalidate TXO MMR proofs at any
level - consider the case of two adjacent txouts being spent. To guarantee
success requires storing full proofs. However, they're limited by the blocksize
limit, and additionally are expected to be relatively uncommon. For example, if
1% of 1MB blocks were archival spends, our hypothetical year-long TXO commitment
delay is only a few hundred MB of data with low IO-performance requirements.


## Security Model

Of course, a TXO commitment delay of a year sounds ridiculous. Even the slowest
imaginable computer isn't going to need more than a few blocks of TXO
commitment delay to keep up ~100% of the time, and there's no reason why we
can't have the UTXO archive delay be significantly longer than the TXO
commitment delay.

However, as with UTXO commitments, TXO commitments raise issues with Bitcoin's
security model by allowing miners to profitably mine transactions
without bothering to validate prior history. At the extreme, if there were no
commitment delay at all, then at the cost of some extra network bandwidth
"full" nodes could operate and even mine blocks completely statelessly by
expecting all transactions to include "proof" that their inputs are unspent; a
TXO commitment proof for a commitment you haven't verified isn't a proof that a
transaction output is unspent, it's a proof that some miners claimed the txout
was unspent.

At one extreme, we could simply implement TXO commitments in a "virtual"
fashion, without miners actually including the TXO commitment digest in their
blocks at all. Full nodes would be forced to compute the commitment from
scratch, in the same way they are forced to compute the UTXO state, or total
work. Of course a full node operator who doesn't want to verify old history can
get a copy of the TXO state from a trusted source - no different from how you
could get a copy of the UTXO set from a trusted source.

A more pragmatic approach is to accept that people will do that anyway, and
instead assume that sufficiently old blocks are valid. But how old is
"sufficiently old"? First of all, if your full node implementation comes "from
the factory" with a reasonably up-to-date minimum accepted total-work
thresholdⁱ - in other words it won't accept a chain with less than that amount
of total work - it may be reasonable to assume any Sybil attacker with
sufficient hashing power to make a forked chain meeting that threshold with,
say, six months' worth of blocks has enough hashing power to threaten the main
chain as well.

That leaves public attempts to falsify TXO commitments, done out in the open by
the majority of hashing power. In this circumstance the "assumed valid"
threshold determines how long the attack would have to go on before full nodes
start accepting the invalid chain, or at least, newly installed/recently reset
full nodes. The minimum age that we can "assume valid" is a tradeoff between
political/social/technical concerns; we probably want at least a few weeks to
guarantee the defenders a chance to organise themselves.

With this in mind, a longer-than-technically-necessary TXO commitment delayʲ
may help ensure that full node software actually validates some minimum number
of blocks out-of-the-box, without taking shortcuts. However this can be
achieved in a wide variety of ways, such as the author's prev-block-proof
proposal³, fraud proofs, or even a PoW with an inner loop dependent on
blockchain data. Like UTXO commitments, TXO commitments are also potentially
very useful in reducing the need for SPV wallet software to trust third parties
providing them with transaction data.

i) Checkpoints that reject any chain without a specific block are a more
common, if uglier, way of achieving this protection.

j) A good homework problem is to figure out how the TXO commitment could be
designed such that the delay could be reduced in a soft-fork.


## Further Work

While we've shown that TXO commitments certainly could be implemented without
increasing peak IO bandwidth/block validation latency significantly with the
delayed commitment approach, we're far from being certain that they should be
implemented this way (or at all).

1) Can a TXO commitment scheme be optimized sufficiently to be used directly
without a commitment delay? Obviously it'd be preferable to avoid all the above
complexity entirely.

2) Is it possible to use a metric other than age, e.g. priority? While this
complicates the pruning logic, it could use the UTXO set space more
efficiently, especially if your goal is to prioritise bitcoin value-transfer
over other uses (though if "normal" wallets nearly never need to use TXO
commitments proofs to spend outputs, the infrastructure to actually do this may
rot).

3) Should UTXO archiving be based on a fixed size UTXO set, rather than an
age/priority/etc. threshold?

4) By fixing the problem (or possibly just "fixing" the problem) are we
encouraging/legitimising blockchain use-cases other than BTC value transfer?
Should we?

5) Instead of TXO commitment proofs counting towards the blocksize limit, can
we use a different miner fairness/decentralization metric/incentive? For
instance it might be reasonable for the TXO commitment proof size to be
discounted, or ignored entirely, if a proof-of-propagation scheme (e.g.
thinblocks) is used to ensure all miners have received the proof in advance.

6) How does this interact with fraud proofs? Obviously furthering dependency on
non-cryptographically-committed STXO/UTXO databases is incompatible with the
modularized validation approach to implementing fraud proofs.


# References

1) "Merkle Mountain Ranges",
Peter Todd, OpenTimestamps, Mar 18 2013,
https://github.com/opentimestamps/opentimestamps-server/blob/master/doc/merkle-mountain-range.md

2) "Do we really need a mempool? (for relay nodes)",
Peter Todd, bitcoin-dev mailing list, Jul 18th 2015,
https://lists.linuxfoundation.org/pipermail/bitcoin-dev/2015-July/009479.html

3) "Segregated witnesses and validationless mining",
Peter Todd, bitcoin-dev mailing list, Dec 23rd 2015,
https://lists.linuxfoundation.org/pipermail/bitcoin-dev/2015-December/012103.html

--
https://petertodd.org 'peter'[:-1]@petertodd.org

Ravencoin Open Developer Meeting - 1/4/2019

[14:04] Hi everyone!
[14:04] :dabbitwave:
[14:04] Hey Everybody!
[14:04] Hello 😃
[14:04] Sorry we're getting started a bit late.
[14:04] Topics: SLC Meetup (March 15th)
[14:04] 👋
[14:04] Roadmap breakdown - posted to github
[14:05] IPFS (integration)
[14:05] greetings 👋
[14:05] So, SLC Meetup on the 15th!
[14:05] Great!
[14:05] Hi!
[14:06] Hi all — a special thanks to the developers and congratulations on an amazing first year!!!
[14:06] <[Dev] Blondfrogs> Hello Everyone!
[14:07] We have a tentative agenda with @Tron, @corby speaking.
[14:08] We would like to have a nice walkthrough of the Raven DevKit for the meetup.
[14:08] We are planning on hosting a meetup in SLC at the Overstock building on March 15th from 6:00pm-9:00pm. It is free admission, but there is a page on meetup.com where people can rsvp so that we have a somewhat accurate headcount for food.
[14:08] sup guys
[14:08] hey russ
[14:09] We are planning on having a few speakers and have allotted a bit of time at the end for people to meet and greet each other.
[14:09] can you guys link us to the page somewhere when thats available? 😄
[14:10] free food?!
[14:10] todays topic?
[14:10] yeah can we indicate pepperoni pizza
[14:10] Sounds good to me @Jeroz Nothing ordered yet though. 😃
[14:10] only pepperoni pizza is served at true blockchain meetings right
[14:10] :blobhide:
[14:10] Absolutely. The itinerary just needs to be finalized and then I'll make a broad post about the rest of the details.
[14:11] https://www.meetup.com/Salt-Lake-City-salt-lake-city-Meetup/
[14:11] 😭 so far away
[14:11] West Coast!
[14:11] @MTarget But there's pizza, so worth the travel time.
[14:11] lol
[14:12] I'll be watching the stream if its available since i'm from montreal/canada 😛
[14:12] Ah yes, I love $300 pizza 😉
[14:12] as long as I get to see your smiling faces @Tron @RavencoinDev then it's worth the time
[14:12] We'll be there.
[14:12] We'll be messaging additional details as they get finalized.
[14:12] Greeting and salutations!
[14:12] sup
[14:13] Hey, $300 is considerably cheaper than 2 $3,700,000 pizzas.
[14:14] Ok, switching topics...
[14:14] yeah its a way to fly,
[14:14] question is whether those piza's will be paid for in RVN coin or not :ThinkBlack:
[14:14] Roadmap
[14:14] It hasn't changed, just added some detail.
[14:14] https://github.com/RavenProject/Ravencoin/tree/master/roadmap
[14:15] nice
[14:15] This now links to a breakdown for messaging, voting, anti-spam, and rewards (dividends)
[14:15] will there be any additional RPC functionality coming in the future, thinking in terms of some functions that are only available in ravencore-lib
[14:15] apologies if now is not time to ask questions, i can wait for later
[14:15] "Phase 7 - Compatibility Mode" - that's new 😮
[14:15] The protocol for messaging is pretty well established, but the rest isn't in stone (code) yet.
[14:16] can you give us details on compatibility mode?
[14:16] In broad brush strokes.
[14:17] The idea is to allow ravend to act as a daemon that looks like a single coin.
[14:17] so ravend that only works with the bitcoin asset?
[14:18] interesting
[14:19] So you start it with an option to only work with a single asset/token account or something?
[14:19] hmm compelling what is the reason for this? some kind of scale or performance?
[14:19] ^
[14:19] Example: Configure ravend to listen for transfer RPC call for senttoaddress or sendfrom, but for a specific asset. This would allow easy integration into existing system for assets.
[14:20] Only the daemon or the whole wallet UI?
[14:20] yeah thats great, rpc functions dont allow us to do this yet, if i recall
[14:20] or at least we depend more on ravencore lib
[14:20] so like asset zmq
[14:20] that's smart
[14:20] @Tron it also sounds like it makes our life easier working with RPC, instead of core all the time for some functionality
[14:21] if i understand correctly anyways
[14:21] So you could run numerous instances of ravend each on their own network and RPC port, each configured for a different asset. You would need some balance of RVN in each one to cover transaction fees, then.
[14:21] id be curious to know what all the advantages are of this
[14:21] one more question, how would i decentralize the gateway between bitcoin mainnet/ravencoin mainnet? in the current RSK implementation they use a federated gateway, how would we avoid this?
[14:21] it sounds neato
[14:21] Just the daemon. The alternative is to get exchanges to adapt to our RPC calls for assets. It is easier if it just looks like Bitcoin, Litecoin or RVN to them, but it is really transferring FREE_HUGS
[14:22] That makes sense. Should further increase exchange adoption for each asset.
[14:22] hmm yeah its just easier for wallet integration because its basically the same as rvn and bitcoin but for a specific asset
[14:22] so this is in specific mind of exchange listings for assets i guess
[14:23] if i understand rightly
[14:23] @traysi Gut feel is to allow ravend to handle a few different assets on different threads.
[14:23] Are you going to call it kawmeleon mode?
[14:23] Lol
[14:23] I read that as kaw-melon mode.
[14:24] same lol
[14:24] so in one single swoop it possible to create a specific wallet and server daemon for specific assets. great. this makes it easier for exchanges, and has some added advantages with processing data too right?
[14:24] Still keeping a RVN balance in the wallet, as well, Tron. How will that work if sendtoaddress sends the token instead of the RVN? A receive-RVN/send tokens-only wallet?
[14:25] @traysi Yes
[14:25] sendtoaddress on the other port (non RVN port) would send the asset.
[14:25] This will be a hugely useful feature.
[14:25] ^
[14:26] @Tron currently rpc function not support getaddresses senttowallet and this has to be done in ravencore lib, will this change you propose improve this situation
[14:26] Config might look like {"port":2222, "asset":"FREE_HUGS", "rpcuser":"hugger", "rpcpass":"gi3afja33"}
[14:26] how will this work cross-chain?
[14:28] @push We'd have to go through the rpc calls and work out which ones are supported in compatibility mode. Obviously the mining ones don't apply. And some are generic like getinfo.
[14:28] ok cool 👍 cheers
[14:29] for now we continue using ravencore lib for our plans to keep track i just wondering if better way
[14:29] as we had some issue after realising no rpc function for getting addresses of people who had sent rvn
[14:29] @push | ravenland.org all of the node explorer and ravencore-lib functionality is based on RPC (including the addressindex-related calls). Nothing you can't do with RPC, although I'm not sure of the use cases you're referring to..
[14:29] interesting, so ravencore lib is using getrawtransaction somehow
[14:29] i thought this may be the case
[14:29] that is very useful thankyou for sharing this
[14:30] look into addressindex flag and related RPC calls for functions that operate on addresses outside your wallet
[14:30] thank you that is very useful, tbh i am not very skilled programmer so just decoding the hex at the raven-cli commandline was a challenge, i shall look more into this, valued information thanks as this was a big ? for us
[14:31] Ok, things have gone quiet. New topic.
[14:31] IPFS (integration)
[14:31] GO
[14:33] ...
[14:33] <[Dev] Blondfrogs> So, we have been adding ipfs integration into the wallet for messaging. This will allow the wallets to do some pretty sweet stuff. For instance, you will be able to create your ipfs data file for issuing an asset. Push it to ipfs from the wallet, and add the hash right into the issuance data. This is going to allow for a much more seamless flow into the app.
[14:34] <[Dev] Blondfrogs> This ofcourse, will also allow for users to create messages, and post them on ipfs and be able to easily and quickly format and send messages out on the network with ipfs data.
[14:34] It will also allow optional meta-data with each transaction that goes in IPFS.
[14:34] will i be able to view ipfs images natively in the wallet?
[14:34] <[Dev] Blondfrogs> Images no
[14:34] We discussed the option to disable all IPFS integration also.
[14:35] @russ (kb: russkidooski) Probably not. There's some risk to being an image viewer for ANY data.
[14:35] No option in wallet to opt into image viewing?
[14:35] cool so drag and drop ipfs, if someone wanted to attach an object like an image or a file they could drag drop into ui and it create hash and attach string to transaction command parameters automatically
[14:35] We could probably provide a link -- with a warning.
[14:35] nomore going to globalupload.io
[14:35] :ThinkBlack:
[14:35] I understand that the wallet will rely on globalupload.io (phase 1). Is it not dangerous to rely on an external network? Or am I missing something?
[14:36] hmm
[14:36] interesting, i suppose you could hash at two different endpoints and compare them
[14:36] if you were that worried
[14:36] and only submit one to the chain
[14:36] You will be able to configure a URL that will be used as an IPFS browser.
[14:36] Oh ic
[14:36] you wont flood ipfs because only one hash per unique file
[14:36] <[Dev] Blondfrogs> There are multiple options for ipfs integration. We are building it so you can run your own ipfs node locally.
[14:36] <[Dev] Blondfrogs> or, point it to whatever service you would like. e.g. cloudflare
[14:36] this is very cool developments, great to see this
[14:37] Just like the external block explorer link currently in preferences.
[14:37] @[Dev] Blondfrogs what about a native ipfs swarm for ravencoin only?
[14:37] We have discussed that as an option.
[14:37] @push | ravenland.org Considering having a fallback of upload through globalupload.io and download through cloudflare.
[14:37] <[Dev] Blondfrogs> @russ (kb: russkidooski) We talked about that, but no decisions have been made yet.
[14:37] yeah, i would just use two endpoints and strcompare the hash
[14:37] as long as they agree good
[14:37] submit tran
[14:38] else 'potentially mysterious activity'
[14:38] ?
[14:38] if you submitted the file to ipfs api endpoints
[14:38] Will the metadata just be a form with text only fields?
[14:39] and then you would get 2 hashes, from 2 independent services
[14:39] that way you would not be relying on a central hash service
[14:39] and have some means of checking if a returned hash value was intercepted or transformed
[14:39] i was answering jeroz' question
[14:40] about relying on a single api endpoint for upload ipfs object
[14:40] We have also kicked around the idea of hosting our own JSON only IPFS upload/browse service.
[14:41] I have a service like this that is simple using php
[14:41] we only use it for images right now
[14:41] but fairly easy to do
[14:41] Yup
[14:42] Further questions about IPFS?
[14:43] contract handling? file attach handling? or just text fields to generate json?
[14:44] trying to get an idea of what the wallet will offer for attaching data
[14:44] Probably just text fields that meet the meta-data spec.
[14:44] ok noted
[14:44] What do you mean by contract handling @sull
[14:45] We won't prevent other hashes from being added.
[14:45] asset contract (pdf etc) hash etc
[14:45] <[Dev] Blondfrogs> also, being able to load from a file
[14:45] got it, thanks
[14:47] Let's do some general Q&A
[14:48] Maybe just a heads up or something to look for in the future but as of right now, it takes roughly 12 hours to sync up the Qt-wallet from scratch.
Did a clean installation on my linux PC last night.
[14:48] Any plans or discussions related to lack of privacy of asset transfers and the ability to front run when sending to an exchange?
[14:48] ^
[14:48] Is there a way to apply to help moderate for example the Telegram / Discord, i spend alot of time on both places, sometimes i pm mods if needed.
[14:49] Any developed plans for Asset TX fee adjustment?
[14:49] also this^
[14:49] @mxL86 We just created a card on the public board to look into that.
[14:49] General remark: https://raven.wiki/wiki/Development#Phase_7_-_Compatible_Mode = updated reflecting Tron's explanation.
[14:49] @mxL86 That's a great question. We need to do some profiling and speed it up. I do know that the fix we added from Bitcoin (that saved our bacon) slowed things down.
[14:50] Adding to @mxL86 the sync times substantially increased coinciding with the asset layer activation. Please run some internal benchmarks and see where the daemon is wasting all its cycles on each block. We should be able to handle dozens per second but it takes a couple seconds per block.
[14:50] @BW__ no plans currently for zk proofs or anything if that's what you're asking
[14:50] You are doing a great job. Is there a plan that all this things (IPFS) could be some day implemented in mobile wallet? Or just in QT?
[14:50] i notice also that asset transactions had some effect on sync time as we were making a few. Some spikes i not analysed the io and cpu activity properly but will if there is interest
[14:51] we are testing some stuff so run into things i am happy to share
[14:51] @BW__ Might look at Grin and Beam to see if we can integrate Mimble Wimble -- down the road.
[14:51] yeees
[14:51] @J. | ravenland.org work with the telegram mods. Not something the developers handle.
[14:51] i love you
[14:51] @J. | ravenland.org That would be best brought up with the operators/mods of teh telegram channel.
[14:51] @corby @Tron thnx
[14:51] @S1LVA | GetRavencoin.org we're planning on bumping fees to... something higher!
[14:51] no catastrophic failures, just some transaction too smals, and mempool issues so far, still learning
[14:52] @corby i thought that this may happen :ThinkBlack:
[14:52] @corby x10? 100x? 1000x? Ballpark?
[14:52] Definitely ballpark.
[14:52] 😃
[14:52] 😂
[14:52] Is a ballpark like a googolplex?
[14:53] @push | ravenland.org asset transactions are definitely more expensive to sync
[14:53] yes yes they are
[14:53] they are also more expensive to make i believe
[14:53] 10,000x!
[14:53] as some sync process seems to occur before they are done
[14:53] @traysi ★★★★★ thanks for the suggestions we are going to be looking at optimizations
[14:53] But, it is way slower than we like. Going to look into it.
[14:53] i do not understand fully its operation
[14:53] 1000x at minimum in my opinion
[14:53] its too easy to spam the network
[14:54] yes there has been some reports of ahem spam lately
[14:54] :blobhide:
[14:54] 😉
[14:54] cough cough ravenland
[14:54] @russ (kb: russkidooski) we're in agreement -- it's too low
[14:54] default fee 0.001
[14:54] ^ something around here
[14:54] @corby yep we all are i think
[14:55] waaay too low
[14:55] meaningful transactions start with meaningful capital expense
[14:55] though there is another scenario , there are some larger volume, more objective rich use cases of the chain that would suffer considerably from that
[14:55] just worth mentioning, as i have beeen thinking about this a lot
[14:55] there are some way around, like i could add 1000 ipfs hashes to a single unique entity, i tested this and it does work
[14:56] @russ (kb: russkidooski) What would you suggest.
[14:57] I had a PR for fee increase and push back.
[14:57] Ignore the push back.
0.001 RVN is not even a micro-farthing in fiat terms
[14:57] definitely around 1000x
[14:57] Vocal minority for sure
[14:57] ^ yep
[14:57] @russ (kb: russkidooski) That sounds reasonable.
[14:57] Couple hundred Fentons
[14:58] right now an asset transaction is 0.01 of a penny essentially
[14:58] 1 RVN would work now, but not when RVN is over $1.
[14:58] yes exactly
[14:58] Hi. Late to the party.
[14:58] We are also talking about a min fee. The system will auto-adapt if blocks fill up.
[14:58] im thinking tron, some heavy transaction use cases would fall out of utility use if that happened
[14:58] so whats the thinking there
[14:59] is there a way around the problem, bulked ipfs hash transactions?
[14:59] 1000x would put us around btc levels
[14:59] maybe a minimum 500x?
[14:59] @russ (kb: russkidooski) Agreed.
[14:59] <[Dev] Blondfrogs> It is time to wrap it up here. Everyone. Thank you all for your questions and thoughts. We will be back in 2 weeks. 😃
[14:59] Small increase and review.
[14:59] Thanks all!
[14:59] Cheers.
[15:00] yeah sorry for 1 million questions guys hope i didnt take up too much time
[15:00] cheers all 👍
[15:00] Thanks everyone
[15:00] Thanks everyone for participating!!!
[15:00] That is what we are here for
[15:00] 100x-500x increase, 1000x maximum
[15:00] 🍺

submitted by Chatturga to Ravencoin [link] [comments]

Cryptoverse: A Distributed Space MMO

Many people get into gamedev with the desire to make an MMO, and then reality hits. They’re hard to make, expensive to host, and time consuming to moderate. Even after years of gamedev, I’m still interested in them, so I’ve set out to create a distributed MMO about exploring space, trading, and warfare. I’ve just finished my first alpha version, which is more a test of the protocol than a game itself, but it’s a start! If you want to jump to the general overview, you can find that here, and downloads for the client here. Since I don’t want to just blogspam you guys, I’ve saved the technical details for this post!
How does it work?
At its heart, Cryptoverse is a loosely turn-based space MMO, with updates to the game's state every two minutes or so [1]. My codebase, which can be found here under the MIT license, is heavily inspired by Bitcoin. There's no need for a centralized server; the rules are enforced by everyone on the network. A shared blockchain keeps track of events and the order in which they occur. In a similar fashion to Bitcoin, people who maintain the blockchain are rewarded. Instead of some kind of currency, though, they're rewarded with ships they can use in combat. The ownership of these ships is tracked by a user's public and private key-pair, so only the owner of the ships can order them to attack or jump to another system. Cheating is a non-issue: since it's a trustless system, every server and client validates the blockchain, and only updates that the majority of the network agrees upon will be built upon. To corrupt the blockchain, an attacker would need more than 50% of the network's computing power. To allow the universe to expand as the game progresses, a new star system is created every two minutes or so, when a block is added to the chain. Right now there's not much to do in these systems, but eventually you'll be able to harvest them for resources, build infrastructure, and deploy defenses.
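As a sketch of that flow (this is not Spacew00t's actual codebase; every name here is illustrative), a minimal replay loop might look like this in Python: each block spawns one star system and rewards its maintainer with a ship tied to their public key, and every peer rebuilds the state independently so forged blocks are simply rejected.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Deterministic hash of a block's JSON contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def make_block(prev_hash: str, events: list, miner_pubkey: str) -> dict:
    """One 'turn': a block orders that turn's events and names its maintainer."""
    return {"prev": prev_hash, "events": events, "miner": miner_pubkey}

def replay(chain: list) -> dict:
    """Rebuild game state by replaying the chain from genesis.
    Each block spawns one star system and awards the maintainer a ship
    keyed to their public key; a block with broken linkage invalidates
    the whole chain."""
    state = {"systems": [], "ships": {}}
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev:  # broken linkage -> chain rejected
            raise ValueError("invalid chain")
        h = block_hash(block)
        state["systems"].append("system-" + h[:8])
        state["ships"].setdefault(block["miner"], []).append("ship-" + h[:8])
        prev = h
    return state

# Three turns mined by the same keypair: three systems, three ships.
chain, prev = [], "0" * 64
for _ in range(3):
    block = make_block(prev, [], "alice-pubkey")
    chain.append(block)
    prev = block_hash(block)
state = replay(chain)
```

Peers would then keep the longest chain that replays without errors, mirroring Bitcoin's rule that the majority's chain wins.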
Future gameplay changes
There are a lot of features this first alpha is missing, so here are a few features I know I want to add.
As you can see, there are a lot of features I still need to add before this is fun™, but I'm a big believer in open testing and candid feedback, so please tell me what you think! If you want to vote on features you're interested in, you can do that on Cryptoverse's Trello board.
1 The time between updates depends on the number of people maintaining and the difficulty of creating new blocks, but it’s supposed to average out to two minutes
2 The resource reward for maintainers will always need to be better than what you could get in game, otherwise no one would want to maintain the blockchain
3 Current versions of the client include my test server’s url, but in the future you’ll be able to define your own and get new ones from other servers
4 Yeah yeah, I should have added this already, but at least the server has validation on it. This will only be necessary once peer to peer servers are enabled in the next update
Edit
There's definitely a bug I'll need to fix, but I'm AFK for another 24 hours. If you keep probing and get "Posted starlog with response 400", that's a symptom of it. Well, this is what alphas are for!
submitted by Spacew00t to gamedev [link] [comments]

Proof Of Transaction: Is Efficient Part 1

The New Blockchain consensus mechanism
👉Proof-Of-Transaction (POT) which solves the POW and POS problems.
At the heart of every blockchain system is a consensus model and a protocol that defines the way nodes agree on the order and validity of transactions. This is particularly important in public blockchains, which are open to many attack vectors and use economic incentives and fee structures to ensure misbehavior is not desirable for participants and full-scale attacks are not viable economically.
However, the search for the perfect consensus model is still ongoing. The two most common options are Proof of Work (POW), as used in Bitcoin, and Proof of Stake (POS), which was first used in Peercoin.
Both models have their flaws, particularly since they both favor the wealthy, either through the amount of hardware required (Proof of Work) or the funds accumulated (Proof of Stake). Proof of Work is also criticized for its energy consumption, and Proof of Stake directly encourages locking up vast amounts of cryptocurrency instead of using it. Unlike those, TAU attempts to change this with Proof of Transaction (POT) by incentivizing circulation.
To solve the cryptocurrency problems seen in POW and POS, TAU has come up with the following key features:
Taucoin Features:
Proof-Of-Transaction:- the more transactions, the more reward
Automatic Harvest Club Participation:- no work or knowledge required in joining a harvest club
Instant harvesting reward share through “club wiring”
Near Zero Cost Transaction Fees:- most transaction fees are returned throughout the year.
One year to accumulate weight:- to prevent spam and speculation
Environmentally friendly consensus mechanism that is secure
Built 100% from our team (not a fork of an existing coin)
Test driven development process focused purely on technology first
First cryptocurrency to focus on the velocity of coin circulation
First cryptocurrency to give no advantage to hoarding
First cryptocurrency to have near zero cost transactions in perpetuity.
Proof of Transactions
POT uses a sliding 365-day window and calculates the number of transactions executed in this window. This number is used as a weight for determining the likelihood of being chosen as a block producer. The actual process of determining the next block producer is a pseudo-random process, based on the algorithm used in NXT, one of the early proof-of-stake blockchains, weighted by the number of transactions.
The goal of this is to make consensus fairer by incentivizing keeping coins in circulation, instead of favoring wealthy participants. The protocol natively supports harvest clubs, the equivalent of mining and staking pools. This feature allows addresses to delegate their transaction history to each other, accumulating weight in the consensus model and sharing proceeds.
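A minimal sketch of that transaction-weighted selection, assuming a simple cumulative-weight draw seeded by the previous block hash (TAU's real chain follows NXT's generation-signature scheme, so the details differ; addresses and weights below are made up):

```python
import hashlib

def pick_producer(prev_block_hash: str, tx_weights: dict) -> str:
    """Choose the next block producer pseudo-randomly, weighted by each
    address's transaction count over the sliding 365-day window.
    Deterministic in the previous block hash, so every node agrees."""
    total = sum(tx_weights.values())
    # Derive a reproducible number in [0, total) from the chain itself.
    seed = int.from_bytes(hashlib.sha256(prev_block_hash.encode()).digest(), "big")
    target = seed % total
    cumulative = 0
    for address in sorted(tx_weights):  # fixed iteration order
        cumulative += tx_weights[address]
        if target < cumulative:
            return address

# An address with 300 transactions in the window is three times as likely
# to win as one with 100; idle coins accumulate no weight at all.
producer = pick_producer("prev-block-hash", {"tau1active": 300, "tau1idle": 100})
```

Because the draw is a pure function of the previous block hash and the public weight table, every node computes the same producer without any extra communication.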
Making transactions, meaning actually using cryptocurrencies, has long been identified as one of the keys to adoption. Holding cryptocurrencies for staking or as an investment might turn cryptocurrencies into interesting financial instruments, but does not fit the definition of digital currency. In this sense, TAU’s model of POT might actually be on to something and help adoption of cryptocurrencies as a method of payment.
To get Taucoins (1500), start here👇
https://www.taucoin.io/account/login?referralURL=04acb93ca8564c018fb6a154990341e1fde80c16e0b5b632702821ac57101d23
TAU coin will be traded on its own exchange, starting on the 18th of October this year.
💰Bitcointalk Bounty/Airdrop💰 https://bitcointalk.org/index.php?topic=5035589.0
NB: 🏆Currently, Taucoin can be mined using Ubuntu OS (16.04 & 14.04 tested)
🏆Engineering underway, to make mining via mobile phones possible
Whitepaper link: https://www.taucoin.io/whitePapeTAU%20White%20Paper%20v0.1.pdf
submitted by Nileke to airdrops [link] [comments]

Introducing a POW through a soft-fork | Devrandom | Nov 01 2017

Devrandom on Nov 01 2017:
Hi all,
Feedback is welcome on the draft below. In particular, I want to see if
there is interest in further development of the idea and also interested in
any attack vectors or undesirable dynamics.
(Formatted version available here:
https://github.com/devrandom/btc-papers/blob/master/aux-pow.md )

Soft-fork Introduction of a New POW

Motivation:

- … not have economies of scale
- … mining power fluctuations

Note however that choice of a suitable POW will require deep analysis. Some pitfalls include: botnet mining, POWs that seem ASIC-resistant but are not, and unexpected/covert optimization. In particular, unexpected/covert optimizations, such as ASICBOOST, present a potential centralizing and destabilizing force.

Design

Aux POW intermediate block

Auxiliary POW blocks are introduced between normal blocks - i.e. the chain
alternates between the two POWs.
Each aux-POW block points to the previous normal block and contains
transactions just like a normal block.
Each normal block points to the previous aux-POW block and must contain all
transactions from the aux-POW block.
Block space is not increased.
The new intermediate block and the pointers are introduced via a soft-fork
restriction.

Reward for aux POW miners

The reward for the aux POW smoothly increases from zero to a target value
(e.g. 1/2 of the total reward) over time.
The reward is transferred via a soft-fork restriction requiring a coinbase
output to an address published in the
aux-POW block.

Aux POW difficulty adjustment

Difficulty adjustments remain independent for the two POWs.
The difficulty of the aux POW is adjusted based on the average time between
normal block found
to aux block found.
Further details are dependent on the specific POW.

Heaviest chain rule change

This is a semi-hard change, because non-upgraded nodes can get on the wrong
chain in case of attack. However,
it might be possible to construct an alert system that notifies
non-upgraded nodes of an upcoming rule change.
All blocks are still valid, so this is not a hardforking change.
The heaviest chain definition changes from sum of difficulty to sum of:
mainDifficulty ^ x * auxDifficulty ^ y 
where we start at:
x = 1; y = 0 
and end at values of x and y that are related to the target relative rewards. For example, if the target rewards are equally distributed, we will want to end up at:
x = 1/2; y = 1/2 
so that both POWs have equal weight. If the aux POW is to become dominant,
x should end small relative to y.
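The weight rule above is easy to sketch. A hedged illustration (the `(mainDifficulty, auxDifficulty)` pairs and the numbers are invented for the example, not taken from any real chain):

```python
def chain_weight(blocks, x: float, y: float) -> float:
    """Total weight of a chain under the proposed rule: each pair of
    (normal, aux) blocks contributes mainDifficulty^x * auxDifficulty^y."""
    return sum((main ** x) * (aux ** y) for main, aux in blocks)

# Blocks as (mainDifficulty, auxDifficulty) pairs -- illustrative numbers.
chain_a = [(1000, 50), (1200, 60)]

# At the start of the transition (x=1, y=0) only the main POW counts,
# so the rule reduces to today's plain sum of difficulty.
legacy = chain_weight(chain_a, x=1, y=0)

# At the proposed end state (x=1/2, y=1/2) both POWs carry equal weight,
# so a chain strong in the aux POW can outweigh one strong only in the
# main POW.
blended = chain_weight(chain_a, x=0.5, y=0.5)
```

Sliding x down and y up over time is what gradually transfers security from the old POW to the new one without a hard cutover.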

Questions and Answers

- Q: … weight? A: 1/2 of the reward should be transferred to aux miners and x = 1/2, y = 1/2.
- Q: … ? A: most of the reward should be transferred to aux miners and x = 0, y = 1. The main difficulty will tend to zero, and aux miners will just trivially generate the main block immediately after finding an aux block, with identical content.
- … optimized by skipping transactions already transferred.
- … would agree if they believe that the coins will increase in value due to improved security properties.

Open Questions

- … become idle while a block of the other type is being mined. In practice, the spare capacity can be used to find alternative ("attacking") blocks or mine other coins. Is that a problem?
- … types of hardware?

POW candidates

- … confirmation)

Next Steps

- … detrimental behavior patterns (e.g. block withholding / selfish mining)

Credits

Bram Cohen came up with a similar idea back in March:
https://lists.linuxfoundation.org/pipermail/bitcoin-dev/2017-March/013744.html
original: https://lists.linuxfoundation.org/pipermail/bitcoin-dev/2017-November/015236.html
submitted by dev_list_bot to bitcoin_devlist [link] [comments]

[Pre-BIP] Community Consensus Voting System | t. khan | Feb 02 2017

t. khan on Feb 02 2017:
Please comment on this work-in-progress BIP.
Thanks,
BIP: ?
Layer: Process
Title: Community Consensus Voting System
Author: t.khan
Comments-Summary: No comments yet.
Comments-URI: TBD
Status: Draft
Type: Standards Track
Created: 2017-02-02
License: BSD-2
Voting Address: 3CoFA3JiK5wxe9ze2HoDGDTmZvkE5Uuwh8 (just an example, don’t
send to this!)
Abstract
Community Consensus Voting System (CCVS) will allow developers to measure
support for BIPs prior to implementation.
Motivation
We currently have no way of measuring consensus for potential changes to
the Bitcoin protocol. This is especially problematic for controversial
changes such as the max block size limit. As a result, we have many
proposed solutions but no clear direction.
Also, due to our lack of ability to measure consensus, there is a general
feeling among many in the community that developers aren’t listening to
their concerns. This is a valid complaint, as it’s not possible to listen
to thousands of voices all shouting different things in a crowded
room—basically the situation in the Bitcoin community today.
The CCVS will allow the general public, miners, companies using Bitcoin,
and developers to vote for their preferred BIP in a way that’s public and
relatively difficult (expensive) to manipulate.
Specification
Each competing BIP will be assigned a unique bitcoin address, which is added to that BIP's header. Anyone who wants to vote casts their ballot by sending a small amount (0.0001 BTC) to their preferred BIP's address. Each transaction counts as 1 vote.
Confirmed Vote Multiplier:
Mining Pools, companies using Bitcoin, and Core maintainers/contributors
are allowed one confirmed vote each. A confirmed vote is worth 10,000x a
regular vote.
For example:
Slush Pool casts a vote for their preferred BIP and then states publicly
(on their blog) their vote and the transaction ID and emails the URL to the
admin of this system. In the final tally, this vote will count as 10,000
votes.
Coinbase, Antpool, BitPay, BitFury, etc., all do the same.
Confirmed votes would be added to a new section in each respective BIP as a
public record.
Voting would run for a pre-defined period, ending when a particular block
number is mined.
Rationale
Confirmed Vote Multiplier - The purpose of this is twofold; it gives a
larger voice to organizations and the people who will have to do the work
to implement whatever BIP the community prefers, and it will negate the
effect of anyone trying to skew the results by voting repeatedly.
Definitions
Miner: any individual or organization that has mined at least one valid
block in the last 2016 blocks.
Company using Bitcoin: any organization using Bitcoin for financial, asset or other purposes, with either under-development or released solutions.
Developer: any individual who has or had commit access, and any individual
who has authored a BIP
Unresolved Issues
Node voting: It would be desirable for any full node running an up-to-date
blockchain to also be able to vote with a multiplier (e.g. 100x). But as
this would require code changes, it is outside the scope of this BIP.
Copyright
This BIP is licensed under the BSD 2-clause license.
original: https://lists.linuxfoundation.org/pipermail/bitcoin-dev/2017-February/013525.html
submitted by dev_list_bot to bitcoin_devlist [link] [comments]

Flag day activation of segwit | shaolinfry | Mar 12 2017

shaolinfry on Mar 12 2017:
I recently posted about so called "user activated soft forks" and received a lot of feedback. Much of this was how such methodologies could be applied to segwit which appears to have fallen under the miner veto category I explained in my original proposal, where there is apparently a lot of support for the proposal from the economy, but a few mining pools are vetoing the activation.
It turns out Bitcoin already used flag day activation for P2SH[1], a soft fork which is remarkably similar to segwit. The disadvantage of a UASF for segwit is that there is an existing deployment: a UASF would require another wide upgrade cycle. (From what I can see, around 80% of existing nodes have upgraded from pre-witness to NODE_WITNESS capability[2][3].) While absolute node count is meaningless, the upgrade trend from version to version seems significant.
Also it is quite clear a substantial portion of the ecosystem industry has put in time and resources into segwit adoption, in the form of upgrading wallet code, updating libraries and various other integration work that requires significant time and money. Further more, others have built systems that rely on segwit, having put significant engineering resources into developing systems that require segwit - such as several lightning network system. This is much more significant social proof than running a node.
The delayed activation of segwit is also holding back a raft of protocol innovations such as MAST, covenants, Schnorr signature schemes, signature aggregation and other script innovations, for which much of the development work is already done.
A better option would be to release code that causes the existing segwit deployment to activate without requiring a completely new deployment nor subject to hash power veto. This could be achieved if the economic majority agree to run code that rejects non-signalling segwit blocks. Then from the perspective of all existing witness nodes, miners trigger the BIP9 activation. Such a rule could come into effect 4-6 weeks before the BIP9 timeout. If a large part of the economic majority publicly say that they will adopt this new client, miners will have to signal bip9 segwit activation in order for their blocks to be valid.
I have drafted a BIP proposal so the community may discuss https://gist.github.com/shaolinfry/743157b0b1ee14e1ddc95031f1057e4c (full text below).
References:
Proposal text:
BIP: bip-segwit-flagday
Title: Flag day activation for segwit deployment
Author: Shaolin Fry
Comments-Summary: No comments yet.
Comments-URI: https://github.com/bitcoin/bips/wiki/Comments:BIP-????
Status: Draft
Type: Informational
Created: 2017-03-12
License: BSD-3-Clause CC0-1.0

==Abstract==

This document specifies a BIP16-like soft fork flag day activation of the segregated witness BIP9 deployment known as "segwit".

==Definitions==

"existing segwit deployment" refers to the BIP9 "segwit" deployment using bit 1, between November 15th 2016 and November 15th 2017, to activate BIP141, BIP143 and BIP147.

==Motivation==

Cause the mandatory activation of the existing segwit deployment before the end of midnight November 15th 2017.

==Specification==

All times are specified according to median past time. This BIP will be active between midnight October 1st 2017 (epoch time 1506816000) and midnight November 15th 2017 (epoch time 1510704000) if the existing segwit deployment is not activated before epoch time 1506816000. This BIP will cease to be active when the existing segwit deployment activates. While this BIP is active, all blocks must set the nVersion header top 3 bits to 001 together with bit field (1<<1) (according to the existing segwit deployment). Blocks that do not signal as required will be rejected.

=== Reference implementation ===

    // mandatory segwit activation between Oct 1st 2017 and Nov 15th 2017 inclusive
    if (pindex->GetMedianTimePast() >= 1506816000 && pindex->GetMedianTimePast() <= 1510704000 &&
        !IsWitnessEnabled(pindex->pprev, chainparams.GetConsensus()))
    {
        bool fVersionBits = (pindex->nVersion & VERSIONBITS_TOP_MASK) == VERSIONBITS_TOP_BITS;
        bool fSegbit = (pindex->nVersion & VersionBitsMask(params, Consensus::DEPLOYMENT_SEGWIT)) != 0;
        if (!(fVersionBits && fSegbit)) {
            return state.DoS(2, error("ConnectBlock(): relayed block must signal for segwit, please upgrade"),
                             REJECT_INVALID, "bad-no-segwit");
        }
    }

==Backwards Compatibility==

This deployment is compatible with the existing "segwit" bit 1 deployment scheduled between midnight November 15th, 2016 and midnight November 15th, 2017.

==Rationale==

Historically, the P2SH soft fork (BIP16) was activated using a predetermined flag day where nodes began enforcing the new rules. P2SH was successfully activated with relatively few issues. By orphaning non-signalling blocks during the last month of the BIP9 bit 1 "segwit" deployment, this BIP can cause the existing "segwit" deployment to activate without needing to release a new deployment.

==References==

[https://github.com/bitcoin/bitcoin/blob/v0.6.0/src/main.cpp#L1281-L1283 P2SH flag day activation].

==Copyright==

This document is placed in the public domain.
original: https://lists.linuxfoundation.org/pipermail/bitcoin-dev/2017-March/013714.html
submitted by dev_list_bot to bitcoin_devlist [link] [comments]

Proposal for P2P Wireless (Bluetooth LE) transfer of Payment URI | Paul Puey | Feb 05 2015

Paul Puey on Feb 05 2015:
Airbitz has developed and implemented a method for communicating a bitcoin
URI across Bluetooth (BLE) or any other P2P, mid range, wireless, broadcast
medium. The currently documented implementation is available in our iOS and
Android mobile wallet (updated Android version with BLE coming in about 1
week). We would like to have the BIP pulled into Github for review and
discussion. Here is the current BIP:
BIP: TBD
Title: P2P Wireless URI transfer
Authors: Thomas Baker , Paul Puey
Contributors: Joey Krug
Status: proposal
Type: Standards Track
Created: 2015-01-12
Table of Contents
- Abstract
- Motivation
- Specification
- Compatibility
- Examples
- References
Abstract
This is a protocol for peer-to-peer wireless transfer of a URI request
using an open broadcast or advertisement channel such as Bluetooth,
Bluetooth Low Energy, or WiFi Direct.
Motivation
There are disadvantages for a merchant (requester) and customer (sender) to
exchange a URI request using QR codes that can be eliminated by using
wireless broadcast or advertisements.
Current QR code scan method to transfer a request URI from merchant
(Requester) to customer (Sender) is cumbersome. A usual scenario is a
merchant with a POS terminal for order entry and a separate tablet for
transacting payments with bitcoin, and a customer with a smartphone. After
the order is entered, the merchant enters payment request information into
the tablet, generates the QR code representing the URI, and presents this
to the customer. The customer prepares to scan the QR code with their
smartphone by maneuvering the camera to the tablet. The tablet screen must
be relatively clean, point at the customer, and held steady. The smartphone
camera lens must be clean, point at the tablet screen, come into range, and
held steady to focus and wait for a QR scan. Environmental conditions such
as bright outdoor sunlight, indoor spot lights, or significant distance
between QR code and camera can create difficult and cumbersome experiences
for users.
Using a wireless local broadcast allows the merchant to just enter the
payment and wait. The tablet and smartphone are not maneuvered to align in
any way. The customer observes broadcast listings, selects the appropriate
one from possible simultaneous broadcasts from other POS stations nearby,
examines the URI request details such as amount, and decides whether to
send funds, initiating a bitcoin network transfer. The merchant and
customer then receive the transaction confirmations and are done with the
sale. Merchant and customer devices are kept private and secured in their
own possession.
The URI and other broadcast identification (Joe’s Grill #1) only contain
public information. However, a copycat broadcaster acting as MITM might
duplicate the broadcast simultaneously as the merchant, attempting to lure
the customer to send funds to the copycat. That attack is mitigated with
this broadcast method because of the partial address in the broadcast.
Specification
Requester generates a bitcoin URI request of variable length, and a limited descriptive identifier string. Requester then broadcasts the URI's partial public address plus the identifier over a publicly visible wireless channel.
Sender scans for broadcasts on their device, examines and selects the
desired request by the identifier and partial address. This connects a data
channel to Requester.
Requester sends full URI back over the data channel.
Sender device ensures the partial address is part of the full URI public address and checks the full address integrity. Checking the broadcast and full URI integrity prevents a copycat device within range from copying the partial address and fooling the customer into sending funds to the copycat instead.
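That sender-side check is simple to sketch. A hedged illustration (the helper name and the example address/URI are hypothetical, not from the Airbitz implementation):

```python
def matches_broadcast(advertised_partial: str, full_uri: str) -> bool:
    """Sender-side integrity check: the 10 address characters seen in the
    BLE Scan Response must prefix the address inside the full URI received
    over the data channel; otherwise a copycat answered the connection."""
    if not full_uri.startswith("bitcoin:"):
        return False
    # Strip the scheme and any query parameters to isolate the address.
    address = full_uri[len("bitcoin:"):].split("?", 1)[0]
    return address.startswith(advertised_partial)

# Hypothetical address and URI, for illustration only.
uri = "bitcoin:1JoesGrill7xQp2mF9?amount=0.005"
ok = matches_broadcast("1JoesGrill", uri)       # genuine broadcast
spoofed = matches_broadcast("1CopycatXY", uri)  # MITM advertisement fails
```

The attacker cannot pass this check without grinding a vanity address sharing the same 10 leading characters, which is computationally expensive, so the cheap prefix comparison buys meaningful protection.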
Below is a description of the protocol through Bluetooth Smart (Low Energy).
Requestor Sender - Bitcoin transaction roles
Peripheral Central - Bluetooth GAP definitions
Mode Mode
1 |-----------| - Requestor Advertises partial bitcoin: URI +
Name
| ... |
2 |<-------------| - Subscribe then send sender's Name, requesting
a response
3 |-----------| - ACK
4 |<-------------| - request Read Characteristic from peripheral
5 |-----------| - Sender receives full bitcoin: URI
1.
Peripheral advertises, over a service UUID, a BLE extended advertisement
with a Scan Response containing the partial address of a bitcoin URI and a
Name in plain text. The entire response is limited to 26 characters. The
first 10 must be present and make up the first 10 characters of the bitcoin
URI public address where the bitcoin is to be sent. The remaining characters
are free-form plain text such as “The Habit 1” or “Starbucks-Reg 1”, giving
more human-readable information. The partial address serves as a check
against a nearby attacker who may try to lure a Sender into sending payment
to a separate wallet by advertising a similar Scan Response: the attacker
cannot feasibly generate a public address with the same leading 10
characters and different trailing characters.
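The 26-character layout above can be made concrete with a small sketch (hypothetical helper names; only the 10-character prefix and the 26-character total come from the text):

```python
# Sketch of the Scan Response layout described above:
# 26 characters max = 10-character address prefix + free-form name.

SCAN_RESPONSE_MAX = 26
PREFIX_LEN = 10

def build_scan_response(address: str, name: str) -> str:
    """Pack the first 10 address characters plus a short display name."""
    if len(address) < PREFIX_LEN:
        raise ValueError("address shorter than the required prefix")
    payload = address[:PREFIX_LEN] + name
    if len(payload) > SCAN_RESPONSE_MAX:
        raise ValueError("name would exceed the 26-character limit")
    return payload

def parse_scan_response(payload: str) -> tuple[str, str]:
    """Split a received Scan Response into (address prefix, name)."""
    return payload[:PREFIX_LEN], payload[PREFIX_LEN:]

resp = build_scan_response("1BoatSLRHtKNngkdXEeobR76b53LETtpyT", "Joe's Grill #1")
prefix, name = parse_scan_response(resp)
print(prefix, "|", name)  # 1BoatSLRHt | Joe's Grill #1
```

The fixed 10/16 split means no delimiter byte is needed: the receiver always knows where the address prefix ends and the display name begins.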
2.
When the Central scans the advertisement, it may display the Scan
Response in a human readable listing using the two pieces of information.
If Central chooses this advertisement to receive the full request, it then
subscribes to the service and writes the characteristic (a second UUID)
with its own name, or a blank if not sending a name, to the Peripheral.
3.
Peripheral gets a characteristic write request of the Central’s name,
and acknowledges the receipt by sending a server response.
4.
Central receives the write acknowledgement (the server response) and
immediately requests the entire bitcoin URI by issuing a read request on
that characteristic.
5.
Peripheral receives the read request and sends the entire bitcoin URI
over that characteristic up to 512 bytes.
This ends the proposed specification as the bitcoin URI transfer is
complete. The Sender would then normally confirm the request and decide
whether to initiate the fund transfer.
Compatibility
There are no prior BIPs covering this.
Examples
Airbitz iOS Bluetooth Low Energy to Bluetooth Low Energy request transfer.
References
Paul Puey CEO / Co-Founder, Airbitz Inc
+1-619-850-8624 | http://airbitz.co | San Diego
<http://facebook.com/airbitz> <http://twitter.com/airbitz>
<https://plus.google.com/118173667510609425617>
<https://go.airbitz.co/comments/feed/> <http://linkedin.com/in/paulpuey>
<https://angel.co/paul-puey>
DOWNLOAD THE AIRBITZ WALLET:
<https://play.google.com/store/apps/details?id=com.airbitz>
<https://itunes.apple.com/us/app/airbitz/id843536046>
original: http://lists.linuxfoundation.org/pipermail/bitcoin-dev/2015-February/007320.html
submitted by bitcoin-devlist-bot to bitcoin_devlist [link] [comments]

From an experiment to a worldwide standard

I've recently found out about ZeroNet, and the IPFS protocol in the process. The technology behind it is something I've envisioned for a long time now, ever since the SOPA / PIPA / ACTA scandals years ago, which were the first warning signs of an approaching censorship era. Although reliance on ISPs for internet access can't be removed without new hardware, IPFS still found a way to make websites impossible to censor or take down. For this reason, I'm among those who believe the standard is extremely important and needs to be adopted as soon as possible!
Unfortunately, I'm saddened to say there's a major obstacle standing in the way of achieving that. As it's one that might be easily overlooked at this stage, I decided to create a thread about it, in hopes that the developers of Zeronet and IPFS as well as anyone with the resources to make a difference might be inspired. The obstacle is getting the standard to become a significant new norm, and convincing important players to adopt it.
At this stage, the idea behind IPFS and torrent-based websites is just a concept being developed by a group. To put it bluntly, it will not achieve anything as long as it remains the pet project of a few thousand nerds... as much as I'd be proud of being one myself, if only I had the programming knowledge and the learning abilities needed to understand the code that powers it. I believe this is the reason why Tor is something the average user only knows of abstractly and few people actually use: it requires a special web browser (Tor Browser) and in some circumstances extra configuration steps to set it up... creating the impression that Tor is only a place for hackers, not to mention criminals and the dark side of the deep web. Conversely, if anyone could simply type an onion URL into Firefox or Chrome or Internet Explorer, any major website that might want to adopt it could easily do so without inconveniencing its users.
In my opinion, IPFS technology and the ZeroNet platform will have to strive for gradual introduction into the lives of the average internet user. As governments and corporations grow increasingly desperate to control the internet through spying and censorship, and bandwidth becomes an increasingly expensive resource worldwide (hosting costs are going up while site loading speeds are going down), I believe this push must begin as early as possible. The technology will do us little good if we only see it used 10 years from now... at a time when angrier governments might have banned encryption outright in rogue regimes, or parts of the internet have fallen to censorship or been torn apart by struggles over costs and bandwidth.
I compiled a list of the four important concerns that come to mind, which I believe the project might want to consider pushing for or should be on the lookout for. Feel free to post your opinion on this, as well as information on how much progress has been made in each department.
For clarity, I am in no way part of the team developing either ZeroNet or IPFS, and in fact just found out about them yesterday. I have however read up on the important aspects, and understand just how important and useful this can be... as such my wish is to see it adopted as a new standard. Seeing the trajectory of similar projects in the past, I conclude this will be a difficult process into which effort will need to be poured separately: not everyone will hear about it, and many of the people who do won't care unless they have an example to follow. Attempts like this have failed to reach their true potential in the past (e.g. Diaspora) solely because they couldn't persuade enough people to use them and abandon the current paradigm. Please don't let a free and open internet wait several more years, as it's already long overdue! Thank you.
submitted by MirceaKitsune to ipfs [link] [comments]

The Strange Birth & History of Monero, Part III: Decentralized team

You can read part I here (by americanpegaus). This is the post that motivated me to write part II. Now I'm doing a third part, and there'll be a final fourth part. This is probably too much, but I wasn't able to make it shorter. Some will be interested in going through all of them, and maybe someone is even willing to make a summary of the whole series :D.
Monero - an anonymous coin based on CryptoNote technology
https://bitcointalk.org/index.php?topic=582080.0
Comments of interest:
-4: "No change, this is just a renaming. In the future, the binaries will have to be changed, as well as some URLs, but that's all. By the way, this very account (monero) is shared by several users and is meant to make it easier to change the OP in case of vacancy of the OP. This idea of a shared OP comes from Karmacoin.
Some more things to come:
"
(https://bitcointalk.org/index.php?topic=582080.msg6362672#msg6362672)
-5: “Before this thread is too big, I would like to state that a bug has been identified in the emission curve and we are currently in the process of fixing it (me, TFT, and smooth). Currently coins are emitted at double the rate that was intended. We will correct this in the future, likely by bitshifting values of outputs before a certain height, and then correcting 1 min blocks to 2 min blocks. The changes proposed will be published to a Monero Improvement Protocol on github.”
(https://bitcointalk.org/index.php?topic=582080.msg6363016#msg6363016)
[tacotime makes the bug in the emission curve public: coin creation is currently 2 times what was intended. See the chart of BTC vs the actual XMR curve, as it was and is now, vs the curve that was initially planned (in yellow)]
-14: “Moving discussion to more relevant thread, previous found here:
https://bitcointalk.org/index.php?topic=578192.msg6364026#msg6364026
I have to say that I am surprised that such an idea [halving current balances and then changing block target to 2 min with same block reward to solve the emission curve issue] is even being countenanced - there are several obvious arguments against it.
Perception - what kind of uproar would happen if this was tried on a more established coin? How can users be expected to trust a coin where it is perceived that the devs are able and willing to "dip" into people's wallets to solve problems?
Technically - people are trying to suggest that this will make no difference since it applies to reward and supply, which might be fair enough if the cap was halved also, but it isn't. People's holdings in the coin are being halved, however it is dressed up.
Market price - How can introducing uncertainty in the contents of people's wallets possibly help market price? I may well be making a fool of myself here, but I have never heard of such a fix before, unless you had savings in a Cypriot bank - has this ever been done for another coin?”
(https://bitcointalk.org/index.php?topic=582080.msg6364174#msg6364174)
-15: “You make good points but unfortunately conflicting statements were made and it isn't possible to stick to them all. It was said that this coin had a mining reward schedule similar to bitcoin. In fact it is twice as fast as intended, even a bit more than twice as fast as bitcoin.
If you acquired your coins on the basis of the advertised reward schedule, you would be disappointed, and rightfully so, as more coins come into existence more quickly than you were led to believe.
To simply ignore that aspect of the bug is highly problematic. Every solution may be highly problematic, but the one being proposed was agreed as being the least bad by most of the major stakeholders. Maybe it will still not work, this coin will collapse, and there will need to be a relaunch, in which case all your coins will likely be worthless. I hope that doesn't happen.”
(https://bitcointalk.org/index.php?topic=582080.msg6364242#msg6364242)
[smooth tries to justify his proposal to solve the emission curve issue: halve every current balance and change block target to 2 min with same block reward]
-16: “This coin wasn't working as advertised. It was supposed to be mined slowly like BTC but under the current emission schedule, 39% would be mined by the first year and 86% by the fourth year. Those targets have been moved out by a factor of 2, i.e. 86% mined by year 8, which is more like BTC's 75% by year 8. So the cap has been moved out much further into the future, constraining present and near-term supply, which is what determines the price.”
(https://bitcointalk.org/index.php?topic=582080.msg6364257#msg6364257)
[eizh supports smooth’s plan]
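The percentages cited in the posts above (roughly 39% of the supply mined in the first year and 86% after four years under the buggy schedule) can be reproduced with a quick back-of-the-envelope model. This sketch assumes a CryptoNote-style smooth emission where every 1-minute block pays out a fixed fraction of the still-unmined supply; the 2^-20 per-block constant is an assumption chosen to match the numbers in the thread, not a value read out of the actual code:

```python
import math

# Assumed model: each 1-minute block emits (remaining supply) * 2**-20,
# so the mined fraction after n blocks is 1 - (1 - 2**-20)**n.
DECAY = 2 ** -20                  # assumed per-block emission fraction
BLOCKS_PER_YEAR = 365 * 24 * 60   # one block per minute

def mined_fraction(years: float) -> float:
    n = years * BLOCKS_PER_YEAR
    # log1p keeps the tiny per-block factor numerically accurate
    return 1.0 - math.exp(n * math.log1p(-DECAY))

print(f"{mined_fraction(1):.1%}")  # ~39% in the first year
print(f"{mined_fraction(4):.1%}")  # ~86.5%, matching the "86%" in the thread
```

Halving the per-block fraction (or doubling the block time, which is the fix debated in the thread) halves the exponent and stretches the same curve out to twice the timespan.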
-20: “So long as the process is fair and transparent it makes no difference what the number is... n or n/2 is the same relative value so long as the /2 is applied to everyone. Correcting this now will avoid people accusing the coin of a favourable premine for people who mined in the first week.”
(https://bitcointalk.org/index.php?topic=582080.msg6364338#msg6364338)
[random user supporting smooth’s idea]
-21: “Why not a reduction in block reward of slightly more than half to bring it into line with the proposed graph? That would avoid all sorts of perceptual problems, would not upset present coin holders and be barely noticeable to future miners since less than one percent of coins have been mined so far, the alteration would be very small?”
(https://bitcointalk.org/index.php?topic=582080.msg6364348#msg6364348)
-22: “Because that still turns into a pre-mine or instamine where a few people got twice as many coins as everyone else in the first week.
This was always a bug, and should be treated as such.”
(https://bitcointalk.org/index.php?topic=582080.msg6364370#msg6364370)
[smooth wants to be sure they can’t be stigmatized as “premine”]
-23: “No, not true [answering to "it makes no difference what the number is... n or n/2 is the same relative value so long as the /2 is applied to everyone"]. Your share of the 18,000,000 coins is being halved - rightly or wrongly.”
(https://bitcointalk.org/index.php?topic=582080.msg6364382#msg6364382)
[good point made by a user that is battling “hard” with smooth and his proposal]
-28: “+1 for halving all coins in circulation. Would they completely disappear? What would the process be?”
-31: “I will wait for the next coin based on CryptoNote. Many people, including myself, avoided BMR because TFT released without accepting input from anyone (afaik). I pm'ed TFT 8 days before launch to help and didn't get response until after launch. Based on posting within the thread, I bet there were other people. Now the broken code gets "fixed" by taking away coins.”
(https://bitcointalk.org/index.php?topic=582080.msg6364531#msg6364531)
-32: “What you say is true, and I can't blame anyone from simply dropping this coin and wanting a complete fresh start instead. On the other hand, this coin is still gaining in popularity and is already getting close to bytecoin in hash rate, while avoiding its ninja premine. There is a lot done right here, and definitely a few mistakes.”
(https://bitcointalk.org/index.php?topic=582080.msg6364574#msg6364574)
[smooth stands for the project legitimacy despite the bugs]
-37: “Since everything is scaled and retroactive, the only person to be affected is... me. Tongue Because I bought BMR with BTC, priced it with incorrect information, and my share relative to the eventual maximum has been halved. Oh well. The rest merely mined coins that never should have been mined. The "taking away coins" isn't a symptom of the fix: it's the fundamental thing that needed fixing. The result is more egalitarian and follows the original intention. Software is always a work-in-progress. Waiting for something ideal at launch is pretty hopeless. edit: Let me point out that most top cryptocurrencies today were released before KGW and other new difficulty retargeting algorithms became widespread. Consequently they had massive instamines on the first day, even favorites in good standing like LTC. Here the early miners are voluntarily reducing their eventual stake for the sake of fairness. How cool is that?”
(https://bitcointalk.org/index.php?topic=582080.msg6364886#msg6364886)
[this is eizh supporting the project too]
-43: “I'm baffled that people are arguing about us making the emission schedule more fair. I'm an early adopter. This halves my money, and it's what I want to do. There's another change that needs to be talked about too: we don't believe that microscopic levels of inflation achieved at 9 or 10 years will secure a proof-of-work network. In fact, there's a vast amount of evidence from DogeCoin and InfiniteCoin that it will not. So, we'd like to fix reward when it goes between 0.25 - 1.00 coins. To do so, we need to further bitshift values to decrease the supply under 2^64 - 1 atomic units to accommodate this. Again, this hurts early adopters (like me), but is designed to ensure the correct operation of the chain in the long run. It's less than a week old, and if we're going to hardfork in economic changes that make sense we should do it now. We're real devs turning monero into the coin it should have been, and our active commitment should be nothing but good news. Fuck the pump and dumps, we're here to create something with value that people can use.”
(https://bitcointalk.org/index.php?topic=582080.msg6366134#msg6366134)
[tacotime brings to the public for first time the tail emission proposal and writes what is my favourite sentence of the whole monero history: “Fuck the pump and dumps, we're here to create something with value that people can use”]
-51: “I think this is the right attitude. Like you I stand to "lose" from this decision in having my early mining halved, but I welcome it. Given how scammy the average coin launch is, I think maximizing fairness for everyone is the right move. Combining a fair distribution with the innovation of Cryptonote tech could be what differentiates Monero from other coins.”
(https://bitcointalk.org/index.php?topic=582080.msg6366346#msg6366346)
-59: “Hello! It is very good that you've created this thread. I'm ok about renaming. But I can't agree with any protocol changes based only on decisions made by bitcointalk.org people. This is because not all miners are continuously reading the forum. Any decision about protocol changes is to be made by hashpower-based voting. From my side I will agree on such a decision only if more than 50% of miners agree. Without even such a simple majority from miners such changes are meaningless. In case of a hardfork that isn't supported by the majority of miners the network will split into two nets with a low-power forking branch and a high-power not-forking branch. I don't think that this will be good for anybody. Such a voting is easy to implement by setting the minor_version of blocks to a specific value and counting decisions made after 1000 blocks. Do you agree with such a procedure?”
(https://bitcointalk.org/index.php?topic=582080.msg6368478#msg6368478)
[TFT appears after a couple days of inactivity]
-63: “In few days I will publish a code with merged mining support. This code will be turned ON only by voting process from miners. What does it mean:
The same procedure is suitable for all other protocol changes.”
(https://bitcointalk.org/index.php?topic=582080.msg6368720#msg6368720)
[And now he is back, TFT is all about merged mining]
-67: “We don't agree that a reverse split amounts to "taking" coins. I also wouldn't agree that a regular forward split would be "giving" coins. It's an exchange of old coins with new coins, with very nearly the exact same value. There is a very slight difference in value due to the way the reward schedule is capped, but that won't be relevant for years or decades. Such a change is entirely reasonable to fix an error in a coin that has only existed for a week.”
(https://bitcointalk.org/index.php?topic=582080.msg6368861#msg6368861)
-68: “There were no error made in this coin but now there is an initiative to make some changes. Changes are always bad and changes destroy participant confidence even in case these changes are looking as useful. We have to be very careful before making any changes in coins”
(https://bitcointalk.org/index.php?topic=582080.msg6368939#msg6368939)
[TFT does not accept the unexpected emission curve as a bug]
-72: “You are wrong TFT. The original announcement described the coin as having a reward curve "close to Bitcoin's original curve" (those are your exact words). The code as implemented has a reward curve that is nothing like bitcoin. It will be 86% mined in 4 years. It will be 98% mined in 8 years. Bitcoin is 50% mined in 4 years, and 75% in 8 years.
With respect TFT, you did the original fork, and you deserve credit for that. But this coin has now gone beyond your initial vision. It isn't just a question of whether miners are on bitcointalk or not.
There is a great team of people who are working hard to make this coin a success, and this team is collaborating regularly through forum posts, IRC, PM and email. And beyond that a community of users who by and large have been very supportive of the efforts we've taken to move this forward.
Also, miners aren't the only stakeholders, and while a miner voting process is great, it isn't the answer to every question. Though I do agree that miners need to be on board with any hard fork to avoid a harmful split.”
(https://bitcointalk.org/index.php?topic=582080.msg6369137#msg6369137)
[smooth breaks out publicly against TFT for the first time]
-75: “I suppose that merged mining as a possible option is a good idea as soon as nobody is forced to use it. MM is a possibility to accept PoW calculated for some other network. It helps to increase a security of both networks and makes it possible for miners not to choose between two networks if they want both:
Important things to know about MM:
Actually the only change that goes with MM is that we are able to accept PoW from some other net with same hash-function. Each miner can decide his own other net he will merge mine BMR with.
And this is still very secure.
This way I don't see any disadvantage in merged mining. What disadvantages do you see in MM?”
(https://bitcointalk.org/index.php?topic=582080.msg6369255#msg6369255)
[TFT stands for merged mining]
-77: “Merged mining essentially forces people to merge both coins because that is the only economically rational decision. I do not want to support the ninja-premined coin with our hash rate.
Merged mining makes perfect sense for a coin with a very low hash rate, otherwise unable to secure itself effectively. That is the case with coins that merge mine with bitcoin. This coin already has 60% of the hash rate of bytecoin, and has no need to attach itself to another coin and encourage sharing of hash rate between the two. It stands well on its own and will likely eclipse bytecoin very soon.
I want people to make a clear choice between the fair launched coin and the ninja-premine that was already 80% mined before it was made public. Given such a choice I believe most will just choose this coin. Letting them choose both allows bytecoin to free ride on what we are doing here. Let the ninja-preminers go their own way.”
(https://bitcointalk.org/index.php?topic=582080.msg6369386#msg6369386)
[smooth again]
-85: “One of you is saying that there was no mistake in the emission formula, while the other is. I'm not asking which I should believe . . I'm asking for a way to verify this”
(https://bitcointalk.org/index.php?topic=582080.msg6369874#msg6369874)
[those that have not been paying attention to the soap opera since the beginning do not understand anything at all]
-86: “The quote I posted "close to Bitcoin's original curve" is from the original announcement here: https://bitcointalk.org/index.php?topic=563821.0
I think there was also some discussion on the thread about it being desirable to do that.
At one point in that discussion, I suggested increasing the denominator by a factor of 4, which is what ended up being done, but I also suggested retaining the block target at 2 minutes, which was not done. The effect of making one change without the other is to double the emission rate from something close to bitcoin to something much faster (see chart a few pages back on this thread).”
(https://bitcointalk.org/index.php?topic=582080.msg6369935#msg6369935)
[smooth answers just a few minutes later]
-92: “I'm happy the Bitmonero attracts so much interest.
I'm not happy that some people want to destroy it.
Here is a simple and clear statement about plans: https://bitcointalk.org/index.php?topic=582670
We have two kinds of stakeholders we respect: miners and coin owners.
Before any protocol changes we will ask miners for agreement. No changes are possible without the explicit agreement of miners.
We will never take away or discount any coins that are already emitted. This is the way we respect coin owners.
All other issues can be discussed, proposed and voted for. I understand that there are other opinions. All decisions that aren't supported in this coin can be introduced in any new coin. It's ok to start a new fork. It's not ok to try to destroy an existing network.”
(https://bitcointalk.org/index.php?topic=582080.msg6370324#msg6370324)
[TFT is kinda upset – he can see how the community is “somehow” taking over]
-94: “Sounds like there's probably going to be another fork then. Sigh.
I guess it will take a few tries to get this coin right.
The problem with not adjusting existing coins is that it makes this a premine/instamine. If the emission schedule is changed but not as a bug fix, then earlier miners got an unfair advantage over everyone else. Certainly there are coins with premines and instamines, but there's a huge stigma and such a coin will never achieve the level of success we see for this coin. This was carefully discussed during the team meeting, which was announced a day ahead of time, and everyone with any visible involvement with the coin, you included, was invited. It is unfortunate you couldn't make it to that meeting TFT.”
(https://bitcointalk.org/index.php?topic=582080.msg6370411#msg6370411)
[smooth is desperate at TFT's lack of interest in collaboration, and he publicly speaks about a schism for the first time]
-115: “Very rough website online, monero.cc (in case you asked, the domain name was voted on IRC, like the crypto name and its code). Webdesigner, webmaster, writers... wanted.”
(https://bitcointalk.org/index.php?topic=582080.msg6374702#msg6374702)
[Despite the lack of consensus and the obvious chaos, the community keeps going: Monero already has its own site]
-152: “Here's one idea on fixing the emissions without adjusting coin balances.
We temporarily reduce the emission rate to half of the new target for as long as it takes for the total emission from 0 to match the new curve. Thus there will be a temporary period when mining is very slow, and during that period there was a premine.
But once that period is complete, from the perspective of new adopters, there was no premine -- the total amount of coins emitted is exactly what the slow curve says it should be (and the average rate since genesis is almost the same as the rate at which they are mining, for the first year or so at least).
This means the mining rewards will be very low for a while (if done now then roughly two weeks), and may not attract many new miners. However, I think there are enough of us early adopters (and even some new adopters who are willing to make a temporary sacrifice) who want to see this coin succeed to carry it through this period.
The sooner this is done the shorter the catch up period needs to be.”
(https://bitcointalk.org/index.php?topic=582080.msg6378032#msg6378032)
[smooth makes a proposal to solve the “emission curve bug” without changing users balances and without favoring the early miners]
-182: “We have added a poll in the freenode IRC room "Poll #2: "Emission future of Monero, please vote!!" started by stickh3ad. Options: #1: "Keep emission like now"; #2: "Keep emission but change blocktime and final reward"; #3: "Keep emission but change blocktime"; #4: "Keep emission but change final reward"; #5: "Change emission"; #6: "Change emission and block time"; #7: "Change emission and block time and final reward"
Right now everyone is voting for #4, including me.”
(https://bitcointalk.org/index.php?topic=582080.msg6379518#msg6379518)
[tacotime announces an ongoing vote on IRC]
-184: “change emission: need to bitshift old values on the network or double values after a certain block. controversial. not sure if necessary. can be difficult to implement. keep emission: straightforward, we don't change emission or block time. change final reward is simple. if (blockSubsidy < finalSubsidy) return finalSubsidy; else return blockSubsidy;”
(https://bitcointalk.org/index.php?topic=582080.msg6379562#msg6379562)
-188: “Yeah, well. We need to change the front page to reflect this if we can all agree on it.
We should post the emissions curve and the height and value that subsidy will be locked in to.
In my opinion this is the least disruptive thing we can do at the moment, and should ensure that the fork continues to be mineable and secure in about 8 years time without relying on fees to secure it (which I think you agree is a bad idea).”
(https://bitcointalk.org/index.php?topic=582080.msg6379871#msg6379871)
[tacotime]
-190: “I don't think the proposed reward curve is bad by any means. I do think it is bad to change the overall intent of a coin's structure, and being close to bitcoin's reward curve was a big part of the intent of this coin. It was launched in response to the observation that bytecoin was 80% mined in less than two years (too fast) and also that it was ninja premined, with a stated goal that the new coin have a reward curve close to bitcoin.
At this point I'm pretty much willing to throw in the towel on this launch:
  1. No GUI
  2. No web site
  3. Botched reward curve (at least botched relative to stated intent)
  4. No pool (and people who are enthusiastically trying to mine having trouble getting any blocks; some of them have probably given up and moved on).
  5. No effective team behind it at launch
  6. No Mac binaries (I don't think this is all that big a deal, but its another nail)
I thought this could be fixed but with all the confusion and lack of clear direction or any consistent vision, now I'm not so sure.
I also believe that merged mining is basically a disaster for this coin, and is probably being quietly promoted by the ninjas holding 80% of bytecoin, because they know it keeps their coin from being left behind, and by virtue of first mover advantage, probably relegates any successors to effective irrelevance (like namecoin, etc.).
We can do better. It's probably time to just do better.”
(https://bitcointalk.org/index.php?topic=582080.msg6380065#msg6380065)
[smooth is disappointed]
-191: “The website does exist now, it's just not particularly informative yet. :) But, I agree that thankful_for_today has severely mislead everyone by stating the emission was "close to Bitcoin's" (if he's denying that /2 rather than /4 emission schedule was unintentional, as he seems to be). I'm also against BCN merge mining. It works against the goal of overtaking BCN and if that's not a goal, I don't know what we're even doing here. I'll dedicate my meagre mining to voting against that.
That said, you yourself have previously outlined why relaunches and further clones fail. I'd rather stick with this one and fix it.”
(https://bitcointalk.org/index.php?topic=582080.msg6380235#msg6380235)
[eizh tries to keep smooth on board]
-196: “BCN is still growing as well. It is up to 1.2 million now. If merged mining happens, (almost) everyone will just mine both. The difficulty on this coin will jump up to match BCN (in fact both will likely go higher since the hash rate will be combined) and again it is an instamine situation. (Those here the first week get the benefit of easy non-merged mining, everyone else does not.) Comments were made on this thread about this not being yet another pump-and-dump alt. I think that could have been the case, but sadly, I don't really believe that it is.”
(https://bitcointalk.org/index.php?topic=582080.msg6380778#msg6380778)
-198: “There's no point in fragmenting talent. If you don't think merge mining is a good idea, I'd prefer we just not add it to the code.
Bitcoin had no web site or GUI either initially. Bitcoin-QT was the third Bitcoin client.
If people want a pool, they can make one. There's no point in centralizing the network when it's just begun, though. Surely you must feel this way.”
(https://bitcointalk.org/index.php?topic=582080.msg6381866#msg6381866)
[tacotime also wants smooth on board]
-201: “My personal opinion is that I will abandon the fork if merge mining is added. And then we can discuss a new fork. Until then I don't think Monero will be taken over by another fork.”
(https://bitcointalk.org/index.php?topic=582080.msg6381970#msg6381970)
[tacotime fires the first shot: if merged mining is implemented, he will leave the ship]
-203: “Ditto on this. If the intention wasn't to provide a clearweb launched alternative to BCN, then I don't see a reason for this fork to exist. BCN is competition and miners should make a choice.”
(https://bitcointalk.org/index.php?topic=582080.msg6382097#msg6382097)
[eizh supports tacotime]
-204: “+1 Even at the expense of how much I already "invested" in this coin.”
(https://bitcointalk.org/index.php?topic=582080.msg6382177#msg6382177)
[NoodleDoodle is also against merged mining]
This is basically everything worth reading in this thread. The thread was created in the wrong category, and its short life of about 2 days was pretty interesting. Merged mining was rejected, and it all ended with TFT's inactivity for 7+ days and the creation of a new github repo on the 30th of April. It is only 12 days since launch and a decentralized team is being built.
Basically the community had forked (but not the chain) and it was evolving and moving forward to its still unclear future.
These are the main takeaways of this thread:
  • The "leaders" of the community proved their legitimacy by proposing and supporting the idea of halving all balances for the greater good, solving the emission curve issue without leaving room for any instamine accusation. Their long-term goals and values also showed in their refusal to merge-mine with a "premined scam".
  • It is decided that, for now, it is "too late" to change the emission curve, so Monero will ultimately mint 50% of its coins in ~1.3 years (Bitcoin did it after 3.66 years) and 86% of its coins in 4 years (Bitcoin does it in ~11 years) (this was also voted on here) (see also this chart)
  • It is decided that a "minimum subsidy" or "tail emission" will be added to incentivize miners "forever" and avoid escalating fees (it was finally added to the code in March 2015)
  • Merged mining is flatly rejected by the future "core team" and soon rejected by "everyone". This triggers TFT's inactivity.
  • The future "core team" is taking shape in a decentralized way: tacotime, eizh, NoodleDoodle, smooth, and many others
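The emission figures in the list above can be sanity-checked with a short sketch. It assumes Monero's launch parameters (a 60-second block target and a block reward equal to 2^-20 of the still-unmined supply); these parameters are my assumption for illustration, not something stated in the thread, but they reproduce the "50% in ~1.3 years, 86% in 4 years" numbers:

```python
import math

# Assumed launch-era schedule: one block per minute, each block minting
# 2^-20 of whatever supply remains below the cap.
BLOCKS_PER_YEAR = 365 * 24 * 60   # 60-second block target
R = 2 ** -20                      # fraction of remaining supply minted per block

def emitted_fraction(years: float) -> float:
    """Fraction of the supply cap minted after `years` of mining."""
    return 1 - (1 - R) ** (years * BLOCKS_PER_YEAR)

def years_to_fraction(f: float) -> float:
    """Years needed to mint a given fraction of the supply cap."""
    return math.log(1 - f) / math.log(1 - R) / BLOCKS_PER_YEAR

print(f"50% minted after {years_to_fraction(0.5):.2f} years")  # ~1.38 years
print(f"{emitted_fraction(4):.1%} minted after 4 years")       # ~86.5%
```

Under these assumptions the half-life of the unmined supply comes out at roughly 1.4 years and the 4-year emission at roughly 86%, matching the figures the community was debating.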
And most importantly: all this (and what is coming soon) is proof of the decentralization of Monero, probably comparable to Bitcoin's first days. This is not a company building a for-profit project (even one that is not-for-profit on paper); this is a group of unconnected individuals sharing a goal and working together to reach it.
A final part will follow soon, where I'll collect the bitcointalk logs from the current official announcement threads. There you'll be able to follow the decentralized first steps of development (an open-source pool, miner optimizations, and exchanges, all surrounded by FUD trolls, lots of excitement, and a rapidly growing collaborative community).
submitted by el_hispano to Monero
