Speed dating blogs

United States - About Blog: As a dating coach for women over 50, author Lisa Copeland guides, advises, educates and nurtures single women to successfully find and happily date a Quality Man. She offers the keys to successful dating through education programs, coaching and a supportive social media community geared specifically to single women over 50.

Using dating sites and speed dating services can be extremely nerve-wracking, which is why we try to make it easy. Check out our blog page for advice.

The Speed Dating Service - Since 2002: If you're gearing up for our World's Largest Virtual Speed Dating event on Sunday, September 20th, then this post is going to come in handy. Having hosted live speed dating events in the past, I know a lot about the topic. When I spoke with singles before any event, they were most worried about having a good conversation and keeping it flowing with their date. Keep in mind, the speed dating scene tends to attract singles with a more casual dating mindset. So if you're looking for a serious relationship, speed dating may not be the best approach, but it's still a fun way to spend a Friday night, and you may find one or two singles who turn into a real date.

There is some new research on speed dating from researchers in Germany (Asendorpf, Penke, and Back, in press). They set up a speed-dating event and invited a total of 382 people (190 men ...

Good Speed Dating Questions: Dating Dani, our resident dating expert, has put together a list of useful speed dating questions for you to ask to keep the conversation flowing. "We can all get a little stuck for words and tongue-tied from time to time and not know what to say at a speed dating event."

United Kingdom - About Blog: Avenues Over 50's Dating specialise in mature dating UK and senior matchmaking services. We also provide personal introductions for the over-50s. Follow the Avenues dating blog; we are an exclusive dating agency for those over 50 looking for high-quality matchmaking services.

FRL03WT - Winter Testing Preview & Submission Guide

2019.08.04 01:26 ParisHL FRL03WT - Winter Testing Preview & Submission Guide

Track Information

Track Layout
Information
Track Full Name: Dubai Autodrome GP
Location: Dubai
Country: United Arab Emirates
Length: 5.390 km
GPS Co-Ordinates: 25.050633 55.239129

Car Parts Relied On

Floor: 24% | Rear Wing: 26% | Suspension: 23% | Brakes: 13% | Front Wing: 14%

Sim Date

Sessions
Day 1: 9th August 2019
Day 2: 10th August 2019
Day 3: 11th August 2019

Race Weekend Strategy Submissions

User Activity
All team owners are checked for activity after each race weekend. From the commencement of the season, team owners who are inactive will receive penalties.
A team owner can give prior notice of inactivity, for example a holiday or a known loss of internet connection. In such cases the team owner will not receive warnings. Notice must be given by the team owner themselves to the FRL Presidents; others cannot do so on their behalf. Absences must also be reasonable in duration and reason.
All warnings and strikes will be reset to zero at the start of each season.
Winter Testing Strategy
Prior to the Winter Testing weekend, you can submit a strategy, which allows you to replace your main drivers with any of your test drivers and/or youth drivers for one or more of the three testing days, as well as test your team's car setup.
Set-up hours and engine mode do not carry over to the regular season.
Setup Points
There are two areas of the car for each user to set up: engine mode and chassis work. Higher BHP in the engine mode increases a car's top speed, whereas more setup time invested in chassis work increases overall speed.
Users can spend between -40 and +40 BHP in both qualifying and the race, for each car, each weekend. At the end of the season, teams will be charged 0.01m per BHP over zero when totalled. Teams will be advised how much they have spent in total after each race.
1 BHP setup point is roughly 0.020s per lap quicker. For example, +40 BHP in Qualifying will make the car concerned 0.800 seconds a lap quicker. Spending more BHP will also make your car more unreliable, while spending less will make your car more reliable but slower.
Users will also have a total of 128 setup hours to spend on each car, usable in either Qualifying or the Race each weekend, with a maximum of 8 hours per session. Teams can also upgrade their Mechanics Training Facility under factories to increase the effectiveness of chassis setup.
1 setup hour spent at a base-level Mechanics Factory is roughly a 0.035s per lap improvement. For example, 8 hours spent in Qualifying at a Level 1 Factory will make the car concerned 0.320s a lap quicker.
Once a team has exhausted the hours on a particular car, they cannot submit any more setup hours, but teams are under no obligation to use their full allocation.
If you sub out one of your main drivers for a Test or Youth Driver, the effective setup hours will be halved, but it will still cost you the same amount. For example, if you allocate 8 hours each for Qualifying and the Race but sub out the Main Driver for a Youth Driver, your effective setup will only be 4 and 4.
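The arithmetic above can be sketched as a quick calculator. The 0.020s-per-BHP and 0.035s-per-hour rates come from this guide; the function and variable names are my own sketch, not official FRL tooling. (Note the guide's Level 1 Factory example, 8 hours -> 0.320s, implies a slightly higher per-hour rate than the 0.035s base figure; the sketch uses the base figure.)

```python
BHP_GAIN_PER_POINT = 0.020   # seconds per lap, per engine-mode BHP point
HOUR_GAIN_BASE = 0.035       # seconds per lap, per chassis setup hour (base factory)

def session_gain(engine_mode, setup_hours, main_driver=True):
    """Approximate lap-time improvement (seconds) for one session.

    engine_mode: -40..+40 BHP points (negative = slower but more reliable)
    setup_hours: 0..8 hours allocated to this session
    main_driver: subbing in a test/youth driver halves the effective hours
    """
    if not -40 <= engine_mode <= 40:
        raise ValueError("engine mode must be between -40 and +40")
    if not 0 <= setup_hours <= 8:
        raise ValueError("at most 8 setup hours per session")
    effective_hours = setup_hours if main_driver else setup_hours / 2
    return engine_mode * BHP_GAIN_PER_POINT + effective_hours * HOUR_GAIN_BASE

# +40 BHP and 8 hours with a main driver: 0.800 + 0.280 = ~1.08s per lap.
print(round(session_gain(40, 8), 3))
# Same allocation with a youth driver subbed in (effective hours halved to 4):
print(round(session_gain(40, 8, main_driver=False), 3))
```

The same rates make the season cost easy to estimate: at 0.01m per BHP over zero, a team whose engine-mode spend totals +400 BHP across the season would be charged 4m.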
A driver must drive the car they have driven the most in that season.
How to Submit
Private Message the FRL Presidents in Discord via your private team channel with the following:
- Car A
Day 1 Driver:
Engine Mode:
Chassis Setup Hours:
Day 2 Driver:
Engine Mode:
Chassis Setup Hours:
Day 3 Driver:
Engine Mode:
Chassis Setup Hours:
- Car B
Day 1 Driver:
Engine Mode:
Chassis Setup Hours:
Day 2 Driver:
Engine Mode:
Chassis Setup Hours:
Day 3 Driver:
Engine Mode:
Chassis Setup Hours:
As an example, your submission could look something like this:
- Car A
Day 1 Driver: John Smith
Engine Mode: +20
Chassis Setup Hours: 8
Day 2 Driver: John Smith
Engine Mode: 0
Chassis Setup Hours: 0
Day 3 Driver: John Smith
Engine Mode: -40
Chassis Setup Hours: 7
- Car B
Day 1 Driver: Frank Bloggs
Engine Mode: 0
Chassis Setup Hours: 0
Day 2 Driver: Johnny Commons
Engine Mode: +14
Chassis Setup Hours: 1
Day 3 Driver: Johnny Commons
Engine Mode: -24
Chassis Setup Hours: 5
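A submission like the one above can be sanity-checked before sending. This is a hypothetical helper, not an official FRL tool; it only encodes the limits stated in this guide (engine mode within -40..+40, at most 8 chassis hours per day, a driver named for every day).

```python
def validate_day(driver, engine_mode, hours):
    """Return a list of rule violations for one testing day (empty = valid)."""
    errors = []
    if not driver:
        errors.append("missing driver name")
    if not -40 <= engine_mode <= 40:
        errors.append(f"engine mode {engine_mode} outside -40..+40")
    if not 0 <= hours <= 8:
        errors.append(f"{hours} chassis hours exceeds the 8-hour session cap")
    return errors

# Car A from the example submission above, as (driver, engine mode, hours):
car_a = [("John Smith", 20, 8), ("John Smith", 0, 0), ("John Smith", -40, 7)]
for day, entry in enumerate(car_a, start=1):
    problems = validate_day(*entry)
    print(f"Day {day}: {'OK' if not problems else '; '.join(problems)}")
```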
Submission Deadline
YOU HAVE UNTIL 1PM GMT ON THURSDAY TO SUBMIT YOUR STRATEGIES. SUBMITTING AFTER THE DEADLINE WILL BE AT THE DISCRETION OF THE PRESIDENTS.
submitted by ParisHL to frl [link] [comments]


2018.10.16 22:11 fantastic_comment The journey of leaving Facebookistan

This is a manual on how to leave Facebookistan (Facebook, Instagram, WhatsApp). It is fundamental that you first read the step by step guide to leave Facebook. This is an extended version that explains Stage 2 - "Bridge" Mode in more detail.
The following paragraphs assume that you already understand the problems with Facebook Inc and why people are still there. If you don't understand concepts like network effects, or ad tech concepts such as pixel tracking, cookie syncing, device/browser fingerprinting, real time bidding, hashing and data brokers, and how they relate to Facebook, please read some of these papers/reports. It is also recommended to watch some of these documentaries and talks.
You have to consider that the large majority of your audience has insufficient computer skills (digital illiteracy) - more than 95% of the world population, as mentioned in this study. Most CS grads also don't understand the problem; you can see that just by looking at how many of them are on Facebookistan. To really understand all the problems of Facebook, you need to be above that level: not only is knowledge of computer science required, but also of economics, psychology and ethics. This is why it is very important to educate yourself before following this manual.
This process takes 30 days, not counting the period of self-learning. During those 30 days, you will use Facebookistan against itself with a single purpose - to decrease the information asymmetry between you and your social graph about the problems of Facebookistan. You will transfer the information and knowledge you acquired during the period of self-learning. Note that the large majority of people, around 70%, rely only on Facebook for entertainment mixed with a low percentage of news articles - the filter bubble. In countries with Free Basics/internet.org, Facebookistan is the internet, so it is no surprise they have never heard of the scandals surrounding Facebook. Some of them only read the news content presented inside Facebookistan and don't get the full picture of the problem. Dopamine tells them to scroll down to check the next article, waiting for an update that rewards them - acting like a "slot machine for the brain".
Some of those who leave Facebookistan return because they did not follow this important step. That is why Facebook increased the deletion grace period from 14 days to 30 days. Remember that the main power of Facebookistan relies on network effects - Why Facebook Keeps Beating Every Rival: It’s the Network, of Course. The main idea is to convey the following message: you are leaving Facebookistan not just because Facebook Inc hurts you, but because it also hurts them; you are doing it to protect them, like a true friend (must-read webpage about friendship in The Nicomachean Ethics). You will create multiple empathic interactions - pathos - with your social graph. In the end, when the information asymmetry is close to zero, if someone doesn't recognize the issue, it means they are not your friend and don't respect you. Friendship requires mutual recognition and moral values. If this happens you should think and act like a stoic: you have no power to change their attitudes, so it is better to leave them. Facebookistan manipulates society into thinking that staying on Facebookistan is better and necessary to connect with friends - the fear of missing out (FoMO). Studies show exactly the opposite: being off Facebookistan makes you happier, less stressed, less lonely and more sociable; you start to build strong relationships and become more focused and productive.
You will use Facebookistan (for the first time) to send your messages; if you are still on Instagram or WhatsApp, post the same message on those services via stories. Facebookistan is a huge marketplace for selling user attention. Your posts will compete against thousands of others from people and companies that want to reach your social graph. Facebook prioritizes the posts that pay, and then the ones the algorithm thinks will convert into engagement inside the platform via shares, likes and reactions for each individual user. The algorithm prioritizes content uploaded directly to Facebookistan: pictures, short videos and short messages. Remember that the majority of Facebookistan users check the feed on mobile phones in short intervals. This is why each message should be easy to digest - less than 3 seconds to read/watch. Because of these aspects, your posts will probably get almost no user engagement (comments, likes, ...), and that is perfectly fine. Each of your posts will be seen by at most 5 members of your social graph, and if you are lucky, one will read the entire post. Some of them will be censored, like this one. If this happens, use it to your advantage: take a screenshot and upload it to Facebookistan.
If you post daily, you will reach almost your entire social graph during this period. Don't forget to choose your audience: every post should be seen only by your social graph, so select the option to share only with friends. Note that the majority of humans are capable of maintaining 150 stable social relationships - Dunbar's number - and some can go up to 250-300. You can post twice a day to be sure you reach your entire social graph.
Each day should be spent on a specific problem of Facebookistan. For example, on the first day you can discuss how Facebook is stealing money from people who create good educational videos. Share the video from Kurzgesagt (In a Nutshell) - How Facebook is Stealing Billions of Views - and, hours later, one of the articles mentioned in its description. Write a small text explaining the problem, or if you are lazy, copy/paste the video description.
On the second day, you can write about censorship: post the video produced by the Belgium Museum - Social media doesn't want you to see Rubens' paintings - and hours later write a small post about Facebook censoring Norway's Prime Minister Erna Solberg and Norway's largest newspaper over the "Napalm Girl" photo; read the following article.
On the next day, write a post about Facebook running large-scale psychological experiments, like the news feed experiment to control emotions involving 689,000 users, in which friends' postings were moved to influence moods. A more recent one reveals that Facebook shares psychological insights on young people with advertisers.
Year over year, the number of problems with Facebookistan increases. Why can't Mark Zuckerberg fix Facebookistan? The answer is simple: the problem of Facebookistan is Facebookistan itself - the business model. As a corporation, its shareholders demand more profits from its product - selling user attention. This requires more personal data and better targeting technologies. Facebookistan builds more features on top of existing services to scale quickly, like Facebook Dating and Portal video chat, and it buys more services such as Instagram, WhatsApp and Onavo to gather more personal data from each user (or non-user). Free Basics/internet.org is a clear example of Facebookistan reaching the maximum number of people in developing countries - digital colonialism. Facebookistan uses the same playbook as the tobacco companies - influence young people as soon as possible; Facebook Messenger Kids is a product to addict kids for future profits. Secondly, Facebookistan invests in machine learning technologies and labs to understand and extract more behavioral data, and in data centers, submarine cables around the world and network hardware to control and speed up the data harvesting. The video Monologue of the Algorithm: how Facebook turns users data into its profit summarizes this entire paragraph perfectly. The video description has more excellent resources.
Selling user attention has massive negative consequences for everyone: privacy violations and "data leaks" like Cambridge Analytica, political lobbying against privacy regulations, avoidance of corporate taxes, helping dictatorial regimes with propaganda, the Free Basics/internet.org anti-net-neutrality program, Instant Articles that hurt journalism, the spreading of conspiracy theories and fringe discourse, ... You can find more in the Wikipedia article Criticism of Facebook. You have to enumerate and explain this important problem to your social graph - Facebookistan can't be fixed. Facebookistan is a structural threat to free society and democracy. John Lanchester wrote probably the best critique of Facebookistan, a review of 3 excellent books that cites a number of studies to support his point; I recommend you read that article. A recent book, Anti-Social Media: How Facebook Disconnects Us and Undermines Democracy, discusses the problem in more detail.
Another aspect that you should explain multiple times is that Instagram and WhatsApp are Facebook Inc services that share personal data between them. Staying on WhatsApp or Instagram is the same as being on Facebook. The majority of people in the US, around 50-60%, don't know that Facebook owns Instagram and WhatsApp. It is fundamental to explain the problems of Instagram and WhatsApp and why you are leaving both.
It can happen that someone replies to your posts with partial or wrong information - "bullsh*t!" - a symptom of information asymmetry. The more knowledge you have, the easier it is to rebut such arguments, per the bullsh*t asymmetry principle. Be kind and point out which assumptions are wrong. This webpage shows counterarguments from those who are (still) on Facebook/Instagram/WhatsApp.
If you want some fun, post some comics that criticize Facebook, Mark Zuckerberg and Sheryl Sandberg; The Joy of Tech has good ones. Last Week Tonight produced in July a video ad in response to the one produced by Facebook about the Cambridge Analytica scandal.
During this period, Facebookistan will be in the news because of yet another scandal. Take those opportunities to post about and explain the problem in further detail. Check the Facebook section of the major newspapers/magazines - for English speakers, BBC, The Guardian, TechCrunch, Wired; more sources here.
After 30 days your timeline will be similar to Facebook: a timeline of abuse by Privacy International. It is time for the grand finale. Post the last message/letter saying that you will leave Facebookistan because it is the natural action following from the arguments you wrote over the last month - phronesis. Facebook Inc doesn't respect us; the only solution is to leave those services. Your letter should be similar to Get your loved ones off Facebook (a must-read) and should include some of the articles you posted during the last 30 days. One suggestion to increase views and user engagement is to change your profile and cover photo "to a toilet" hours before posting the final letter; the feed algorithm prioritizes those kinds of changes and future posts from you. If someone asks why you changed the pictures, just link to the Last Week Tonight episode about Facebook.
It is important to suggest alternatives during this period - "the bridge". With the knowledge you have now, you know that the only solution relies on federated services - Think Like the Internet, or How to Fight Facebook, and Win. The key idea here is consensus: adopt just one solution that doesn't compromise the social graph of your close ones. People's attention is finite; no human can split their attention between 5 or more different chat services - the power of network effects.
Now that you have informed your social graph and reached a consensus, you should remove your presence from Facebookistan. Again, follow the step by step guide to leave Facebook to avoid Facebookistan tracking you and grabbing personal data from the people you love.
Feel free to blog, copy/adapt this text and share it on the open web. The markdown of this manual is available for free here.
submitted by fantastic_comment to nosurf [link] [comments]




2015.12.26 20:16 ampromoco An attempt at a fully comprehensive look at how to scale bitcoin. Lets bring Bitcoin out of Beta!

 
WARNING THIS IS GOING TO BE A REALLY REALLY LONG POST BUT PLEASE READ IT ALL. SCALING BITCOIN IS A COMPLEX ISSUE! HOPEFULLY HAVING ALL THE INFO IN ONE PLACE SHOULD BE USEFUL
 
Like many people in the community I've spent the past month or so looking deeply into the bitcoin scaling debate. I feel there has never been a fully comprehensive thread on how bitcoin could scale. The closest I have seen is gavinandresen's medium posts back in the summer describing the problem and a solution, and pre-emptively answering supposed problems with the solution. While these posts got to the core of the issue and spawned the debate we have been having, they were quite general and could have used more data in support. This is my research and proposal to scale bitcoin and bring the community back together.
 
 
The Problem
 
There seem to me to be five main fundamental forces at play in finding a balanced solution:
  • 'node distribution',
  • 'mining decentralisation',
  • 'network utility',
  • 'time',
  • 'adoption'.
 
 
Node Distribution
Bandwidth has a relationship to node count and therefore 'node distribution'. This is because if bandwidth requirements become too high then fewer people will be able to run a node. To a lesser extent bandwidth also affects 'mining decentralisation', as miners/pool owners also need to be able to run a node. I would argue that the centralisation pressures in relation to bandwidth are negligible in comparison to the centralisation pressure caused by the usefulness of larger pools in reducing variance. The cost of a faster internet connection is negligible in comparison to the turnover of the pools. It is important to note the distinction between the bandwidth required to propagate blocks quickly and the bandwidth required to propagate transactions. The bandwidth required to simply propagate transactions is still low today.
New node time (i.e. the time it takes to start up a new node) also has a relationship with node distribution. i.e. If it takes too long to start a new node then fewer people will be willing to take the time and resources to start a new node.
Storage space also has a relationship with node distribution. If the blockchain takes up too much space on a computer then fewer people will be willing to store the whole blockchain.
Any suitable solution should look to not decrease node distribution significantly.
 
Mining Decentralisation
Broadcast time (the time it takes to upload a block to a peer) has a relationship with mining centralisation pressures. This is because increasing broadcast time increases the propagation time, which increases the orphan rate. If the orphan rate is too high then individual miners will tend towards larger pools.
Validation time (the time it takes to validate a block) has a relationship with mining centralisation pressures. This is because increasing validation time increases the propagation time, which increases the orphan rate. If the orphan rate is too high then individual miners will tend towards larger pools.
Any suitable solution should look to not increase mining centralisation significantly.
 
Network Utility
Network utility is a force that is often overlooked and not well understood, but it is equally important. It acts as a kind of counterweight to the other two forces. Increasing the network utility will likely increase user adoption (the more useful something is, the more people will want to use it), and therefore decreasing network utility will likely decrease user adoption. User adoption has a relationship with node count: the more people, companies and organisations know about and use bitcoin, the more of them will run nodes. For example, we could reduce the block size down to 10KB, which would reduce broadcast time and validation time significantly, and therefore also reduce mining centralisation pressures significantly. What is very important to realise, though, is that network utility would also be significantly reduced (fewer people able to use bitcoin) and therefore so would node distribution. Conversely, if we increased the block size (not the limit) right now to 10GB, the network utility would be very high as bitcoin would be able to process a large number of transactions, but node distribution would be low and mining centralisation pressures would be high due to the larger resource requirements.
Any suitable solution should look to increase network utility as time increases.
 
Time
Time is an important force because of how technology improves over time. Technology improves over time in a semi-predictable fashion (often exponential). As we move through time, the cost of the resources required to run the bitcoin network (if the resource requirements remained static) will decrease. This means that we are able to increase resource requirements proportional to technological improvements/cost reductions without any increase in costs to the network. Technological improvements are not perfectly predictable though, so it could be advantageous to allow some buffer room for when technological improvements do not keep up with predictions. This buffer should not be applied at the expense of the balance between the other forces though (i.e. if we make the buffer too big, network utility will be significantly decreased).
 
 
Adoption
Increasing adoption means more people using the bitcoin/blockchain network. The more people use bitcoin the more utility it has, and the more utility bitcoin has the more people will want to use it (the network effect). The more people use bitcoin, the more people there are with an incentive to protect it.
Any suitable solution should look to increase adoption as time increases.
 
 
The Solution Proposed by some of the bitcoin developers - The Lightning Network
 
The Lightning Network (LN) is an attempt at scaling the number of transactions that can happen between parties by not publishing any transaction onto the blockchain unless it is absolutely necessary. This is achieved by having people pool bitcoin together in a "channel"; these people can then transact instantly within that channel. If any shenanigans happen between any of the parties, the channel can be closed and the transactions will be settled on the blockchain. The second part of their plan is to limit the block size to turn bitcoin into a settlement network. The original block size limit of 1MB was put in place by Satoshi as an anti-DOS measure. It was to make sure a bad actor could not propagate a very large block that would crash nodes and increase the size of the blockchain unnecessarily. Certain developers now want to use this 1MB limit in a different way: to make sure that resource requirements stay low, block space always remains full, fees increase significantly, and people use the lightning network as their main way of transacting rather than the blockchain. They also say that keeping the resource requirements very low will make sure that bitcoin remains decentralised.
 
Problems with The Lightning Network
The LN works relatively well (in theory) when the cost and time to publish a set of transactions to the network are kept low. Unfortunately, when the cost and time to publish a set of transactions on the blockchain become high, the LN's utility is diminished. The trust you get from a transaction on the LN comes only from the trustless nature of having transactions published to the bitcoin network. What this means is that if a transaction cannot be published on the bitcoin network then the LN transaction is not secured at all. As transaction fees rise on the bitcoin blockchain, the LN's utility is diminished. Let's take an example:
  • Cost of publishing a transaction to the bitcoin network = $20
  • LN transaction between Bob and Alice = $20.
  • Transaction between Bob and Alice has problem therefore we want to publish it to the blockchain.
  • Amount of funds left after transaction is published to the blockchain = $20 - $20 = $0.
This is also not a binary situation. If, for example, the cost to publish the transaction to the blockchain was $10, then still only 50% of the transaction would be secure. It is unlikely anyone would really call this a secure transaction.
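The arithmetic above can be wrapped into a tiny helper. This is an illustrative sketch only - the function name and dollar inputs are mine, not from any LN implementation:

```python
def secured_fraction(ln_amount_usd, onchain_fee_usd):
    """Fraction of a Lightning payment still recoverable on-chain
    after paying the fee needed to settle the channel."""
    if ln_amount_usd <= 0:
        raise ValueError("amount must be positive")
    remaining = max(ln_amount_usd - onchain_fee_usd, 0.0)
    return remaining / ln_amount_usd

# The two cases from the text:
print(secured_fraction(20, 20))  # 0.0 -> nothing left to recover
print(secured_fraction(20, 10))  # 0.5 -> only half the payment is secure
```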
Will a user make a non-secured/poorly secured transaction on the LN when they could make the same transaction via an altcoin or a non-cryptocurrency system and have it well secured? It's unlikely. What is much more likely is that transactions that are not secured by bitcoin, because of the cost to publish to the blockchain, will simply overflow into altcoins or will simply not happen on any cryptocurrency network. The reality is, though, that we don't know exactly what will happen because there is no precedent for it.
Another problem outside of security is convenience. With a highly oversaturated block space (a very large backlog of transactions) it could take months to have a transaction published to the blockchain. During this time your funds will simply be stuck. If you want to buy a coffee from a shop you don't have a channel open with, instead of simply paying with bitcoin directly, you would have to wait months to open a channel by publishing a transaction to the bitcoin blockchain. I think your coffee might be a little cold by then (and mouldy).
I suggest reading this excellent post HERE for other rather significant problems with the LN when people are forced to use it.
The LN is currently not complete and due to its high complexity it will take some time to have industry wide implementation. If it is implemented on top of a bitcoin-as-a-settlement-network economy it will likely have very little utility.
 
Uses of The LN
The LN is actually an extremely useful layer-2 technology when it is used to its strengths. When the bitcoin blockchain is fast and cheap to transact on, the LN is also extremely useful. One of the major uses for the LN is trust-based transactions. If you are transacting often between a set of parties you can truly trust then using the LN makes absolute sense, since the trustless model of bitcoin is not necessary. Then once you require your funds to be unlocked again it will only take a short time and a small cost to open them up to the full bitcoin network again. Another excellent use of the LN would be for layer-3 apps. For example a casino app: anyone can buy into the casino channel and play using real bitcoins instantly, in the knowledge that if anything nefarious happens you can instantly settle and unlock your funds. Another example would be a computer game where you can use real bitcoin in game; the only difference is that you connect to the game's LN channel and can transact instantly and cheaply. Then whenever you want to unlock your funds you can settle on the blockchain and use your bitcoins normally again.
LN is hugely more powerful, the more powerful bitcoin is. The people making the LN need to stick with its strengths rather than sell it as an all-in-one solution to bitcoin's scaling problem. It is just one piece of the puzzle.
 
 
Improving Network Efficiency
 
The more efficient the network, the more we can do with what we already have. There are a number of possible efficiency improvements to the network and each of them has a slightly different effect.
 
Pruning
Pruning allows the stored blockchain size to be reduced significantly by not storing old data. This has the effect of lowering the resource requirements of running a node. A 40GB unpruned blockchain would be reduced in size to 550MB. (It is important to note that a pruned node has lower utility to the network.)
 
Thin Blocks
Thin blocks use the fact that most of the nodes in the network already have a list of almost all the same transactions ready to be put into the blockchain before a block is found. If all nodes use the same/similar policy for which transactions to include in a block, then you only need to broadcast a small amount of information across the network for all nodes to know which transactions have been included (as opposed to broadcasting a list of all transactions included in the block). Thin blocks have the advantage of reducing propagation time, which lowers the mining centralisation pressure due to orphaned blocks.
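As a rough illustration of why thin blocks save so much bandwidth, here is a toy calculation. The 500-byte average transaction size and 6-byte short ID are assumptions for the sake of the example, not protocol constants:

```python
# Toy illustration of the thin-block idea: if peers already hold the
# transactions in their mempools, a block can be relayed as a list of
# short transaction IDs instead of the full transaction data.
AVG_TX_SIZE = 500      # bytes, assumed average full transaction size
SHORT_ID_SIZE = 6      # bytes, assumed compact identifier per transaction

def full_block_bytes(n_txs):
    return n_txs * AVG_TX_SIZE

def thin_block_bytes(n_txs, missing=0):
    # Peers request full data only for the `missing` txs they lack.
    return n_txs * SHORT_ID_SIZE + missing * AVG_TX_SIZE

n = 2000  # transactions in a roughly 1MB block
print(full_block_bytes(n))      # 1000000 bytes
print(thin_block_bytes(n))      # 12000 bytes -> ~83x smaller
print(thin_block_bytes(n, 20))  # 22000 bytes if 20 txs are missing
```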
 
libsecp256k1
libsecp256k1 allows a more efficient way of validating transactions. This means that propagation time is reduced, which lowers the mining centralisation pressure due to orphaned blocks. It also reduces the time to bootstrap the blockchain for a new node.
 
Serialised Broadcast
Currently block transmission to peers happens in parallel to all connected peers. For block propagation this is a poor choice in comparison to serial transmission to each peer one by one: with parallel transmission, the more peers you have, the slower the propagation, whereas serial transmission does not suffer this problem. What serial transmission does suffer from, though, is variance. If the order in which you send blocks to peers is random, then sometimes you will send first to a peer with a slow/fast connection and/or one that validates slowly/quickly. This means the propagation time with serialised transmission would vary: depending on your luck you would sometimes have faster propagation and sometimes slower. As this will lower average propagation time it will also lower the mining centralisation pressure due to orphaned blocks. (This is just a concept at the moment but I don't see why it couldn't be implemented.)
 
Serialised Broadcast Sorting
This is a fix for the variance that would occur due to serialised broadcast. It sorts the order in which you broadcast a block to peers: fastest upload + validation speed first, slowest last. This not only reduces the variance but also allows block propagation to happen much faster, which again lowers the mining centralisation pressure due to orphaned blocks. (This is just a concept at the moment but I don't see why it couldn't be implemented.)
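A sketch of what the sorting step might look like; the peer names and timing figures are hypothetical values a node might estimate for its peers:

```python
# Send the block to one peer at a time, ordered by estimated
# upload + validation time (fastest first).
def broadcast_order(peers):
    """peers: list of (name, est_upload_s, est_validation_s) tuples."""
    return sorted(peers, key=lambda p: p[1] + p[2])

peers = [("slow-dsl", 8.0, 2.0), ("datacentre", 0.5, 0.2), ("laptop", 3.0, 1.0)]
print([name for name, _, _ in broadcast_order(peers)])
# ['datacentre', 'laptop', 'slow-dsl']
```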
 
Here is a table below that shows roughly what the effects these solutions should have.
Name                  Bandwidth   Broadcast Time   Validation Time   New Node Time   Storage Space
Pruning               1           1                1                 1               0.014
Thin Blocks           0.42        0.1              0.1               1               1
libsecp256k1          1           1                0.2               0.6             1
Serialised Broadcast  1           0.5              1                 1               1
KYN                   1           0.75             1                 1               1
Segregated Witness    1           1                1                 0.4             1
TOTAL                 0.42        0.0375           0.02              0.24            0.014
Multiplier            2.38        26.7             50                -               70
(The "multiplier" shows how many times higher the block size could be relative to the specific function.)
 
 
The Factors in Finding a Balanced Solution
 
At the beginning of this post I detailed a relatively simple framework for finding a solution by describing what the problem is. There seems to me to be five main fundamental forces at play in finding a balanced solution; 'node distribution', 'mining decentralisation', 'network utility', 'time' and 'adoption'. The optimal solution needs to find a balance between all of these forces taking into account a buffer to offset our inability to predict the future with absolute accuracy.
To find a suitable buffer we need to assign a set of red-line values that certain metrics should not pass if we want to make sure bitcoin continues to function as well as it does today (at a minimum). For example, the percentage of orphans should stay below a certain value. These values can only be a best estimate due to the complexity of bitcoin economics, although I have tried to provide reasoning as sound as possible.
 
Propagation time
It seems a fair limit for this would be roughly what we have now; bitcoin is still functioning. Could mining be more decentralised? Yes, of course, but bitcoin is working fine right now and therefore our current propagation time for blocks is a fairly conservative limit to set. Currently 1MB blocks take around 15 seconds to propagate to more than 50% of the network. A 15 second propagation time is what I will use as a limit in the solution to create a buffer.
 
Orphan Rate
This is obviously a value that is a function of propagation time so the same reasoning should be used. I will use a 3% limit on orphan rate in the solution to create a buffer.
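For intuition, a common back-of-envelope model (my addition, not from the post) treats block discovery as a Poisson process with a 600-second mean interval, giving an orphan rate of roughly 1 - e^(-t/600) for a propagation delay of t seconds. The 15-second propagation limit above then sits just inside the 3% red line:

```python
import math

def orphan_rate(propagation_s, block_interval_s=600):
    """Approximate probability that a competing block is found while
    ours propagates, assuming Poisson block arrivals."""
    return 1 - math.exp(-propagation_s / block_interval_s)

print(round(orphan_rate(15) * 100, 2))  # ~2.47% -> just inside the 3% limit
```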
 
Non-Pruned Node Storage Cost
For this I am choosing a limit of $200 in the near-term and $600 in the long-term. I have chosen these values based on what I think is a reasonable maximum for a business or enthusiast to pay to run a full node. As more people use bitcoin and the number of transactions increases, the number of people willing to pay a higher price to run a node will also increase, although the percentage of users willing to do so will decrease. These are of course best-guess values, as there is no way of knowing exactly what percentage of users are willing to pay what.
 
Pruned Node Storage Cost
For this I am choosing a limit of $3 in the near-term (next 5 years) and $9 in the long-term (next 25 years). I have chosen these values based on what I think is a reasonable maximum for a normal bitcoin user to pay. In fact this cost will more likely be zero, as almost all users have some amount of storage free on their computers.
 
Percentage of Downstream Bandwidth Used
This is a best guess at what I think people who run nodes would be willing to use to be connected to the bitcoin network directly. I believe using 10% (maximum) of a user's downstream bandwidth is the limit of what is reasonable for a full node (pruned and non-pruned). Most users would continue to access the blockchain via SPV wallets. Downstream is generally a much more valuable resource to a user than upstream due to the nature of internet usage.
 
Percentage of Upstream Bandwidth Used
This is a best guess at what I think people who run nodes would be willing to use to be connected to the bitcoin network directly. I believe using 25% (maximum) of a user's upstream bandwidth is the limit of what is reasonable for a full node (pruned and non-pruned). Most users would continue to access the blockchain via SPV wallets. Upstream is generally a much less valuable resource to a user than downstream due to the nature of internet usage.
 
Time to Bootstrap a New Node
My limit for this value is 5 days using 50% of downstream bandwidth in the near-term and 30 days in the long-term. This seems like a reasonable number to me for someone who wants to start running a full node. Currently opening a new bank account takes at least a week until everything is set up and you have received your cards, so it seems to me people would be willing to wait this long to become connected. Again, this is a best guess at what people would be willing to do to access the blockchain in the future. Most users, requiring less security, will be able to use an SPV wallet.
It is important to note that we only need enough nodes to make sure the blockchain is distributed across many places with many backups of the full blockchain. It is likely that a few thousand is a minimum for this. Increasing this amount to hundreds of thousands or millions of full nodes is not necessarily that much of an advantage to node distribution but could be a significant disadvantage to mining centralisation. This is because the more nodes you have in the network, the longer it takes to propagate >50% of it.
 
Storage Cost Price Reduction Over Time
Storage cost follows a log-linear trend, with HDD costs reducing by 10 times every 5 years, although this has slowed over the past few years. This can be attributed to the flooding in South East Asia and the transition to SSD technology. SSD technology follows the same log-linear trend of costs reducing 10 times every 5 years, or roughly decreasing 37% per year.
 
Average Upload and Download Bandwidth Increases Over Time
Average upload and download bandwidth increase in a log-linear trend. Both follow the same trend of doubling roughly every two years, or increasing roughly 40% per year.
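The two trend assumptions convert to yearly factors as follows. Note that a 10x fall over 5 years is about 37% per year, and a doubling every 2 years is about 41% per year (which the post rounds to 40%):

```python
# Converting the two trend assumptions into yearly factors:
#   storage cost: /10 every 5 years
#   bandwidth:    x2 every 2 years
storage_factor = (1 / 10) ** (1 / 5)   # ~0.631 -> about a 37% fall per year
bandwidth_factor = 2 ** (1 / 2)        # ~1.414 -> about a 41% rise per year

def project(value, yearly_factor, years):
    """Project a value forward under a constant yearly factor."""
    return value * yearly_factor ** years

print(round((1 - storage_factor) * 100))    # 37 (% fall per year)
print(round((bandwidth_factor - 1) * 100))  # 41 (% rise per year)
print(round(project(100, storage_factor, 5), 6))  # 10.0: $100 of storage costs $10 five years on
```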
 
Price
I was hesitant to include this one here but I feel it is unavoidable. Contrary to what people say (often when the price is trending downwards), bitcoin's price is an extremely important metric in the long-term. Depending on its price, bitcoin is useful to: enthusiasts -> some users -> small companies -> large companies -> nations -> the world, in roughly that order. The higher bitcoin's price, the more liquid the market will be and the more difficult it will be to move the price, therefore increasing bitcoin's utility. Bitcoin's price in the long-term is linked to adoption, which seems to happen in waves, as can be seen in the price bubbles over the years. If we are planning/aiming for bitcoin to at least become a currency with value equal to one of the world's major currencies then we need to plan for a market cap and price that reflect that. I personally think there are two useful targets we should use to reflect our aims. The first, lower target is for bitcoin to have a market cap the size of a major national currency. This would put the market cap at around 2.1 trillion dollars, or $100,000 per bitcoin. The second, higher target is for bitcoin to become the world's major reserve currency. This would give bitcoin a market cap of around 21 trillion dollars and a value of $1,000,000 per bitcoin. A final, and much more difficult, target would be bitcoin as the only currency across the world, but I am not sure exactly how this could work so for now I don't think it is worth considering.
 
As price increases, so does the subsidy reward given out to miners who find blocks. This reward is semi-dynamic in that it remains static (in btc terms) until 210,000 blocks are found, at which point it is cut in half. This continues until all 21,000,000 bitcoins have been mined. If the value of each bitcoin increases faster than the btc-denominated subsidy decreases, then the USD-denominated reward will be increasing on average. Historically the bitcoin price has increased significantly faster than the subsidy decreases: the btc-denominated subsidy halves roughly every 4 years but the price of bitcoin has historically increased roughly 50-fold in the same time.
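The subsidy schedule itself is easy to verify: 50 BTC per block, halving every 210,000 blocks, sums to the 21,000,000 BTC cap. A sketch (ignoring the satoshi-level rounding the real network applies):

```python
# Sum the block subsidy across every halving era: 50 BTC per block,
# halved every 210,000 blocks, converges to the 21,000,000 BTC cap.
def total_supply():
    subsidy, total = 50.0, 0.0
    for _ in range(64):  # subsidy is effectively zero well before 64 halvings
        total += 210_000 * subsidy
        subsidy /= 2
    return total

print(total_supply())  # ~21000000.0
```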
 
Bitcoin adoption should happen in a roughly s-curve dynamic like every other technology adoption: exponential adoption until market saturation starts and adoption slows, and finally the market becomes fully saturated and adoption slowly stops (i.e. bitcoin is fully adopted). If we assume the top of this adoption s-curve reaches one of the market caps above (i.e. bitcoin is successful) then we can use this assumption to see how we can transition from a subsidy-paid network to a transaction-fee-paid network.
 
Adoption
Adoption is the most difficult metric to determine. In fact it is impossible to determine accurately now, let alone in the future. It is also one of the most important factors. There is no point in building software that no one is going to use, after all. Equally, there is no point in achieving a large amount of adoption if bitcoin offers none of its original value propositions. Clearly there is a balance to be had: some amount of bitcoin's original value proposition is worth losing in favour of adoption, and some amount of adoption is worth losing to keep bitcoin's original value proposition. A suitable solution should find a good balance between the two. It is clear, though, that any solution must have increased adoption as a basic requirement, otherwise it is not a solution at all.
 
One major factor related to adoption that I rarely see mentioned is stability and predictability. This is relevant to both end users and businesses. End users rely on stability and predictability so that they do not have to constantly check whether something has changed. When a person gets money from a cash machine or spends money in a shop, their experience is almost identical every single time. It is highly dependable. They don't need to keep up-to-date on how cash machines or shops work to make sure they are not defrauded. They know exactly what is going to happen without having to expend any effort. The more a user's experience deviates from the standard, and the more often it does so, the less likely the user is to continue using that service. Users require predictability extending into the past. Businesses, whose bottom line is often dependent on reliable services, also require stability and predictability. Businesses require predictability that extends into the future so that they can plan. A business is less likely to use a service it does not know it can depend on in the future (or knows it cannot depend on).
For bitcoin to achieve mass adoption it needs a long-term predictable and stable plan for people to rely on.
 
 
The Proposal
 
This proposal is based on determining a best-fit balance of every factor, with a large enough buffer to allow for our inability to perfectly predict the future. No one can predict the future with absolute certainty, but that does not mean we cannot make educated guesses and plan for it.
 
The first part of the proposal is to spend 2016 implementing all available efficiency improvements (i.e. the ones detailed above) and making sure the move to a scaled bitcoin happens as smoothly as possible. It seems we should set a target of implementing all of the above improvements within the first 6 months of 2016. These improvements should be implemented in the first hardfork of its kind, with full community-wide consensus. A hardfork with this much consensus is the perfect time to test and learn from the hardforking mechanism. Thanks to Seg Wit, this would give us an effective 2-fold capacity increase and set us on our path to scalability.
 
The second part of the proposal is to target the release of a second hardfork to happen at the end of 2016. Inline with all the above factors this would start with a real block size limit increase to 2MB (effectively increasing the throughput to 4x compared to today thanks to Seg Wit) and a doubling of the block size limit every two years thereafter (with linear scaling in between). The scaling would end with an 8GB block size limit in the year 2039.
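Taking the schedule literally (2MB at the end of 2016, doubling every two years, ignoring the linear interpolation in between) gives the limits below. Note that a literal step schedule reaches 8GB around 2040; the post's 2039 end date depends on exactly when the first doubling is counted:

```python
# A literal reading of the proposed schedule: 2MB at the end of 2016,
# doubling every two years (linear interpolation between doublings is
# left out for simplicity).
def block_limit_mb(year):
    return 2 * 2 ** ((year - 2016) // 2)

for y in (2016, 2018, 2020, 2030, 2040):
    print(y, block_limit_mb(y), "MB")
# 2040 -> 8192 MB (8GB)
```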
 
 
How does the Proposal fit inside the Limits
 
 
Propagation time
If trends for average upload and download bandwidth continue, then the propagation time for a block to reach >50% of the nodes in the network should never go above 1s. This is significantly quicker than the propagation times we currently see.
In a worst case scenario we can be wrong in the negative direction (i.e. bandwidth does not increase as quickly as predicted) by 15% absolute and 37.5% relative (i.e. bandwidth improves at a rate of 25% per year rather than the predicted 40%) and we would still only ever see propagation times similar to today, and it would take 20 years before this would happen.
 
Orphan Rate
Using our best guess predictions the orphan rate would never go over 0.2%.
In a worst case scenario where we are wrong in our bandwidth prediction in the negative direction by 37.5% relative, orphan rate would never go above 2.3% and it would take over 20 years to happen.
 
Non-Pruned Node Storage Cost
Using our best guess predictions the cost of storage for a non-pruned full node would never exceed $40 with blocks consistently 50% full and would in fact decrease significantly after reaching the peak cost. If blocks were consistently 100% full (which is highly unlikely) then the maximum cost of an un-pruned full node would never exceed $90.
In a worst case scenario where we are wrong in our bandwidth prediction in the negative direction by 37.5% relative and we are wrong in our storage cost prediction by 20% relative (storage cost decreases in cost by 25% per year instead of the predicted 37% per year), we would see a max cost to run a node with 50% full blocks of $100 by 2022 and $300 by 2039. If blocks are always 100% full then this max cost rises to $230 by 2022 and $650 in 2039. It is important to note that for storage costs to be as high as this, bitcoin will have to be enormously successful, meaning many many more people will be incentivised to run a full node (businesses etc.)
 
Pruned Node Storage Cost
Using our best guess predictions the cost of storage for a pruned full node would never exceed $0.60 with blocks consistently 50% full. If blocks were consistently 100% full (which is highly unlikely) then the max cost of a pruned full node would never exceed $1.30.
In a worst case scenario where we are wrong in our bandwidth prediction in the negative direction by 37.5% relative and we are wrong in our storage cost prediction by 20% relative (storage cost decreases in cost by 25% per year instead of the predicted 37% per year), we would see a max cost to run a node with 50% full blocks of $1.40 by 2022 and $5 by 2039. If blocks are always 100% full then this max cost rises to $3.20 by 2022 and $10 in 2039. It is important to note that at this amount of storage the cost would be effectively zero since users almost always have a large amount of free storage space on computers they already own.
 
Percentage of Downstream Bandwidth Used
Using our best guess predictions running a full node will never use more than 0.3% of a user's download bandwidth (on average).
In a worst case scenario we can be wrong in the negative direction by 37.5% relative in our bandwidth predictions and we would still only ever see a max download bandwidth use of 4% (average).
 
Percentage of Upstream Bandwidth Used
Using our best guess predictions running a full node will never use more than 1.6% of a user's upload bandwidth (on average).
In a worst case scenario we can be wrong in the negative direction by 37.5% relative in our bandwidth predictions and we would only ever see a max upload bandwidth use of 24% (average), and this would take over 20 years to occur.
 
Time to Bootstrap a New Node
Using our best guess predictions bootstrapping a new node onto the network should never take more than just over a day using 50% of downstream bandwidth.
In a worst case scenario we can be wrong in the negative direction by 37.5% relative in our bandwidth predictions and it would take one and a quarter days to bootstrap the blockchain using 50% of the download bandwidth. By 2039 it would take 16 days to bootstrap the entire blockchain when using 50% bandwidth. I think it is important to note that by this point it is very possible that bootstrapping the blockchain could be done by simply buying an SSD with the blockchain already loaded. 16 days would be a lot of time to download software, but it does not necessarily mean a decrease in centralisation. As you will see in the next section, if bitcoin has reached this level of adoption, there may well be many parties willing to spend 16 days downloading the blockchain.
 
What if Things Turn Out Worse than the Worst Case?
While it is likely that future trends in the technology required to scale bitcoin will continue much as they have in the past, it is possible that the predictions are completely and utterly wrong. This plan takes that into account by making sure the buffer is large enough to give us time to adjust our course. Even if no technological/cost improvements are made to bandwidth and storage in the future (near zero likelihood), this proposal still gives us years to adjust course.
 
 
What Does This Mean for Bitcoin?
 
Significantly Increased Adoption
For comparison, Paypal handles around 285 transactions per second (tps), VISA handles around 2000tps and the total global non-cash transactions are around 12,400tps.
Currently bitcoin is capable of handling a maximum of around 3.5 transactions per second, which are published to the blockchain roughly every 10 minutes. With Seg Wit implemented via a hardfork, bitcoin will be capable of around 7tps. With this proposal bitcoin will be capable of handling more transactions than Paypal (assuming Paypal experiences growth of around 7% per year) in the year 2027. Bitcoin will overtake VISA's transaction capability by the year 2035 and at the end of the growth cycle in 2039 it will be able to handle close to 50% of the total global non-cash transactions.
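These crossover dates can be sanity-checked with a rough model. The baselines here (14tps in 2017 from 2MB blocks with Seg Wit, capacity doubling every two years, Paypal at 285tps in 2015 growing 7% per year) are my reading of the text; under them the Paypal crossover lands around 2029 rather than 2027, showing how sensitive the estimate is to the assumed starting capacity:

```python
# Rough crossover model; all baselines are assumptions read from the text.
def btc_tps(year):
    return 14 * 2 ** ((year - 2017) / 2)   # capacity doubles every 2 years

def paypal_tps(year):
    return 285 * 1.07 ** (year - 2015)     # 7% growth per year

crossover = next(y for y in range(2017, 2050) if btc_tps(y) >= paypal_tps(y))
print(crossover)  # 2029 under these assumptions
```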
When you add on top second-layer protocols (like the LN), sidechains, altcoins and off-chain transactions, there should be more than enough capacity for the whole world and every conceivable use of digital value transfer.
 
Transitioning from a Subsidy to a Transaction Fee Model
Currently mining is mostly incentivised by the subsidy that is given by the network (currently 25btc per block). If bitcoin is to be widely successful it is likely that price increases will continue to outweigh btc-denominated subsidy decreases for some time. This means that it is currently likely to be impossible to force the network into matching a significant portion of the subsidy with fees. The amount of fees being paid to miners has increased on average over time and looks like it will continue to do so. It is likely that the optimal time for fees to start seriously replacing the subsidy is when bitcoin adoption starts to slow. Unless you take a pessimistic view of bitcoin (thinking bitcoin is as big as it ever will be), it is reasonable to assume this will not happen for some time.
With this proposal, using an average fee of just $0.05, total transaction fees per day would be:
  • Year 2020 = $90,720
  • Year 2025 = $483,840.00
  • Year 2030 = $2,903,040.00
  • Year 2035 = $15,482,880.00
  • Year 2041 = $123,863,040.00 (full 8GB Blocks)
Miners currently earn a total of around $2 million per day in revenue, significantly less than the $124 million per day in transaction fee revenue possible under this proposal. That also doesn't include the subsidy, which would still play some role until the year 2140. At an average fee of just $0.05, this works out to around $45 billion in yearly fee revenue for miners.
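These figures follow from the same ~3.5tps-per-MB capacity assumption used above; a quick sketch to reproduce them:

```python
TPS_PER_MB = 3.5  # assumed on-chain capacity per MB of block space
AVG_FEE = 0.05    # dollars per transaction, the figure used above

def daily_fee_revenue(block_size_mb: float, avg_fee: float = AVG_FEE) -> float:
    # transactions per day at full blocks, times the average fee
    tx_per_day = block_size_mb * TPS_PER_MB * 86_400
    return tx_per_day * avg_fee

print(f"${daily_fee_revenue(8 * 1024):,.0f} per day with full 8GB blocks")
print(f"${daily_fee_revenue(8 * 1024) * 365 / 1e9:,.1f}B per year")
```

With full 8GB blocks this reproduces the $123,863,040-per-day figure from the list above, and roughly $45 billion per year.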
 
 
Proposal Data
You can use these two spreadsheets (1 - 2 ) to see the various metrics at play over time. The first spreadsheet shows the data using the predicted trends and the second spreadsheet shows the data with the worst case trends.
 
 
Summary
 
It's very clear we are on the edge of, or already in the midst of, a community (and possibly a network) split. This is a very dangerous situation for bitcoin. A huge divide has appeared in the community and opinions are becoming more and more entrenched on both sides. If we cannot come together and find a way forward it will be bad for everyone except bitcoin's competition and enemies. While this proposal is born from an attempt at finding a balance based on as many relevant factors as possible, it also happens to fall between the two sides of the debate. Hopefully the community can see this proposal as a way of making a compromise, releasing the entrenchment and finding a way forward to scale bitcoin. I have no doubt that if we can do this, bitcoin will have enormous success in the years to come.
 
Let's bring bitcoin out of beta together!!
submitted by ampromoco to btc [link] [comments]


2015.12.26 20:14 ampromoco An attempt at a fully comprehensive look at how to scale bitcoin. Lets bring Bitcoin out of Beta!

 
WARNING THIS IS GOING TO BE A REALLY REALLY LONG POST BUT PLEASE READ IT ALL. SCALING BITCOIN IS A COMPLEX ISSUE! HOPEFULLY HAVING ALL THE INFO IN ONE PLACE SHOULD BE USEFUL
 
Like many people in the community I've spent the past month or so looking deeply into the bitcoin scaling debate. I feel there has never been a fully comprehensive thread on how bitcoin could scale. The closest I have seen is gavinandresen's Medium posts from back in the summer describing the problem and a solution, and pre-emptively answering supposed problems with that solution. While those posts got to the core of the issue and spawned the debate we have been having, they were quite general and could have used more supporting data. This is my research and proposal to scale bitcoin and bring the community back together.
 
 
The Problem
 
There seem to me to be five main fundamental forces at play in finding a balanced solution:
  • 'node distribution',
  • 'mining decentralisation',
  • 'network utility',
  • 'time',
  • 'adoption'.
 
 
Node Distribution
Bandwidth has a relationship with node count and therefore 'node distribution': if bandwidth requirements become too high, fewer people will be able to run a node. To a lesser extent bandwidth also affects 'mining decentralisation', as miners/pool owners also need to be able to run a node. I would argue, though, that the centralisation pressure from bandwidth is negligible compared to the centralisation pressure caused by the usefulness of larger pools in reducing variance; the cost of a faster internet connection is negligible compared to the turnover of the pools. It is important to distinguish between the bandwidth required to propagate blocks quickly and the bandwidth required to propagate transactions. The bandwidth required simply to propagate transactions is still low today.
New node time (i.e. the time it takes to start up a new node) also has a relationship with node distribution: if it takes too long to start a new node, fewer people will be willing to spend the time and resources to do so.
Storage space also has a relationship with node distribution. If the blockchain takes up too much space on a computer then fewer people will be willing to store the whole blockchain.
Any suitable solution should look to not decrease node distribution significantly.
 
Mining Decentralisation
Broadcast time (the time it takes to upload a block to a peer) has a relationship with mining centralisation pressures. This is because increasing broadcast time increases propagation time, which increases the orphan rate. If the orphan rate is too high then individual miners will tend towards larger pools.
Validation time (the time it takes to validate a block) has the same relationship with mining centralisation pressures. Increasing validation time increases propagation time, which increases the orphan rate, and if the orphan rate is too high then individual miners will tend towards larger pools.
Any suitable solution should look to not increase mining centralisation significantly.
 
Network Utility
Network utility is often overlooked and not well understood, but it is equally important. The network utility force acts as a kind of counterweight to the other two forces. Increasing network utility will likely increase user adoption (the more useful something is, the more people will want to use it), and decreasing network utility will likely decrease user adoption. User adoption in turn has a relationship with node count: the more people, companies and organisations know about and use bitcoin, the more of them will run nodes. For example, we could reduce the block size down to 10KB, which would reduce broadcast time and validation time significantly, and would therefore also reduce mining centralisation pressures significantly. What is very important to realise, though, is that network utility would also be significantly reduced (fewer people able to use bitcoin) and therefore so would node distribution. Conversely, if we increased the block size (not the limit) right now to 10GB, network utility would be very high, as bitcoin would be able to process a huge number of transactions, but node distribution would be low and mining centralisation pressures would be high due to the larger resource requirements.
Any suitable solution should look to increase network utility as time increases.
 
Time
Time is an important force because of how technology improves over time. Technology improves in a semi-predictable fashion (often exponential). As we move through time, the cost of the resources required to run the bitcoin network (if resource requirements remained static) will decrease. This means we can increase resource requirements in proportion to technological improvements/cost reductions without any increase in cost to the network. Technological improvements are not perfectly predictable, though, so it could be advantageous to allow some buffer room for when they do not keep up with predictions. This buffer should not come at the expense of the balance between the other forces (i.e. make the buffer too big and network utility will be significantly decreased).
 
 
Adoption
Increasing adoption means more people using the bitcoin/blockchain network. The more people use bitcoin, the more utility it has, and the more utility it has, the more people will want to use it (network effect). The more people use bitcoin, the more people there are with an incentive to protect it.
Any suitable solution should look to increase adoption as time increases.
 
 
The Solution Proposed by some of the bitcoin developers - The Lightning Network
 
The Lightning Network (LN) is an attempt to scale the number of transactions that can happen between parties by not publishing any transaction onto the blockchain unless absolutely necessary. This is achieved by having people pool bitcoin together in a "channel"; those people can then transact instantly within that channel. If any shenanigans happen between any of the parties, the channel can be closed and the transactions settled on the blockchain. The second part of the plan is to limit the block size so as to turn bitcoin into a settlement network. The original 1MB block size limit was put in place by Satoshi as an anti-DoS measure, to make sure a bad actor could not propagate a very large block that would crash nodes and bloat the blockchain unnecessarily. Certain developers now want to use this 1MB limit differently: to make sure resource requirements stay low, block space always remains full, fees increase significantly, and people use the lightning network rather than the blockchain as their main way of transacting. They also say that keeping resource requirements very low will make sure bitcoin remains decentralised.
 
Problems with The Lightning Network
The LN works relatively well (in theory) when the cost and time to publish a set of transactions to the network are kept low. Unfortunately, when the cost and time to publish a set of transactions on the blockchain become high, the LN's utility is diminished. The trust you get from an LN transaction comes only from the trustless nature of having transactions published to the bitcoin network: if a transaction cannot be published on the bitcoin network then the LN transaction is not secured at all. As transaction fees rise on the bitcoin blockchain, the LN's utility is diminished. Let's take an example:
  • Cost of publishing a transaction to the bitcoin network = $20
  • LN transaction between Bob and Alice = $20.
  • Transaction between Bob and Alice has problem therefore we want to publish it to the blockchain.
  • Amount of funds left after transaction is published to the blockchain = $20 - $20 = $0.
This is also not a binary situation. If, for example, the cost to publish the transaction to the blockchain was $10, then still only 50% of the transaction would be secure. It is unlikely anyone would really call this a secure transaction.
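The security argument can be stated as a one-liner (a sketch of the reasoning above, not part of any LN specification):

```python
def secured_fraction(amount: float, publish_cost: float) -> float:
    """Fraction of an LN payment still protected by on-chain settlement,
    once the cost of publishing the settlement transaction is subtracted."""
    if amount <= 0:
        return 0.0
    return max(0.0, amount - publish_cost) / amount

print(secured_fraction(20, 20))  # 0.0 -- the example above: nothing left
print(secured_fraction(20, 10))  # 0.5 -- only half the payment is secured
print(secured_fraction(20, 0.1))  # cheap settlement: almost fully secured
```

The smaller the on-chain fee relative to the payment, the closer the secured fraction gets to 1, which is the core of the argument that the LN needs a cheap base layer.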
Will a user make a non-secured or poorly secured transaction on the LN when they could make the same transaction via an altcoin or non-cryptocurrency channel and have it well secured? It's unlikely. What is much more likely is that transactions not secured by bitcoin, because of the cost of publishing to the blockchain, will simply overflow into altcoins or will not happen on any cryptocurrency network at all. The reality, though, is that we don't know exactly what will happen because there is no precedent for it.
Another problem, outside of security, is convenience. With a highly oversaturated block space (a very large backlog of transactions) it could take months to have a transaction published to the blockchain, during which time your funds are simply stuck. If you want to buy a coffee from a shop you don't have a channel open with, then instead of simply paying with bitcoin directly you would have to wait months to open a channel by publishing a transaction to the bitcoin blockchain. I think your coffee might be a little cold by then (and mouldy).
I suggest reading this excellent post HERE for other rather significant problems with the LN when people are forced to use it.
The LN is not yet complete and, due to its high complexity, it will take some time to achieve industry-wide implementation. If it is implemented on top of a bitcoin-as-a-settlement-network economy it will likely have very little utility.
 
Uses of The LN
The LN is actually an extremely useful layer-2 technology when it is used to its strengths. When the bitcoin blockchain is fast and cheap to transact on, the LN is also extremely useful. One major use for the LN is trust-based transactions: if you transact often between a set of parties you can truly trust, then using the LN makes absolute sense, since the trustless model of bitcoin is not necessary, and once you require your funds to be unlocked again it only takes a short time and a small cost to open them up to the full bitcoin network. Another excellent use of the LN would be for layer-3 apps. For example, a casino app: anyone can buy into the casino channel and play using real bitcoins instantly, in the knowledge that if anything nefarious happens you can instantly settle and unlock your funds. Another example would be a computer game where you can use real bitcoin in-game; the only difference is that you connect to the game's LN channel and can transact instantly and cheaply, then whenever you want to unlock your funds you settle on the blockchain and use your bitcoins normally again.
The LN becomes hugely more powerful as bitcoin itself becomes more powerful. The people building the LN need to stick with its strengths rather than sell it as an all-in-one solution to bitcoin's scaling problem. It is just one piece of the puzzle.
 
 
Improving Network Efficiency
 
The more efficient the network, the more we can do with what we already have. There are a number of possible efficiency improvements to the network and each of them has a slightly different effect.
 
Pruning
Pruning allows the stored blockchain size to be reduced significantly by not storing old data. This lowers the resource requirements of running a node: a 40GB unpruned blockchain would be reduced to about 550MB. (It is important to note that a pruned node has lower utility to the network.)
 
Thin Blocks
Thin blocks exploit the fact that most nodes in the network already hold almost all of the same transactions, ready to be put into the blockchain, before a block is found. If all nodes use the same or similar policy for which transactions to include in a block, then only a small amount of information needs to be broadcast for all nodes to know which transactions a block includes (as opposed to broadcasting the full list of transactions). Thin blocks reduce propagation time, which lowers the mining centralisation pressure due to orphaned blocks.
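To see the scale of the saving, here is a toy model. The 500-byte average transaction and 6-byte short identifier are illustrative assumptions, not protocol values:

```python
AVG_TX_BYTES = 500   # assumed average transaction size
SHORT_ID_BYTES = 6   # assumed per-transaction short identifier
HEADER_BYTES = 80    # block header

def full_block_bytes(n_tx: int) -> int:
    # broadcasting every transaction in full
    return HEADER_BYTES + n_tx * AVG_TX_BYTES

def thin_block_bytes(n_tx: int) -> int:
    # peers already hold the transactions; only short IDs are relayed
    return HEADER_BYTES + n_tx * SHORT_ID_BYTES

n = 4_000  # roughly a full 2MB block under these assumptions
ratio = thin_block_bytes(n) / full_block_bytes(n)
print(f"thin block is {ratio:.1%} the size of a full broadcast")
```

Under these assumptions a thin block is around 1% of the full broadcast size, so the 0.1 broadcast-time factor in the table below is, if anything, conservative.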
 
libsecp256k1
libsecp256k1 allows transactions to be validated more efficiently. This reduces propagation time, which lowers the mining centralisation pressure due to orphaned blocks. It also reduces the time needed to bootstrap the blockchain for a new node.
 
Serialised Broadcast
Currently blocks are transmitted to all connected peers in parallel. For block propagation this is a poor choice compared to transmitting to each peer one by one: with parallel transmission, the more peers you have, the slower the propagation, whereas serial transmission does not suffer this problem. What serial transmission does suffer from is variance. If the order in which you send blocks to peers is random, you will sometimes hit first a peer with a slow/fast connection and/or slow/fast validation, so depending on your luck propagation would sometimes be faster and sometimes slower. As this lowers propagation time overall it also lowers the mining centralisation pressure due to orphaned blocks. (This is just a concept at the moment, but I don't see why it couldn't be implemented.)
 
Serialised Broadcast Sorting
This is a fix for the variance introduced by serialised broadcast: sort the order in which you broadcast a block to peers, fastest upload-plus-validation speed first and slowest last. This not only removes the variance but also allows block propagation to happen much faster, which again lowers the mining centralisation pressure due to orphaned blocks. (This is just a concept at the moment, but I don't see why it couldn't be implemented.)
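The sorting idea is simple enough to sketch. The `Peer` fields here are hypothetical; a real implementation would have to estimate these times from each peer's past behaviour:

```python
from dataclasses import dataclass

@dataclass
class Peer:
    name: str
    upload_secs: float      # estimated time to send this peer the block
    validation_secs: float  # estimated time for this peer to validate it

def broadcast_order(peers: list[Peer]) -> list[Peer]:
    # fastest (upload + validation) first, so the block starts
    # re-propagating from well-connected peers as early as possible
    return sorted(peers, key=lambda p: p.upload_secs + p.validation_secs)

peers = [Peer("slow", 4.0, 1.0), Peer("fast", 0.5, 0.2), Peer("mid", 1.0, 0.5)]
print([p.name for p in broadcast_order(peers)])  # ['fast', 'mid', 'slow']
```

Serving the fastest peers first means the block fans out from them while the slower transfers are still in progress, which is where the propagation speed-up comes from.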
 
Here is a table showing roughly the effect each of these solutions should have (1 = no change; lower is better):

| Name | Bandwidth | Broadcast Time | Validation Time | New Node Time | Storage Space |
|---|---|---|---|---|---|
| Pruning | 1 | 1 | 1 | 1 | 0.014 |
| Thin Blocks | 0.42 | 0.1 | 0.1 | 1 | 1 |
| libsecp256k1 | 1 | 1 | 0.2 | 0.6 | 1 |
| Serialised Broadcast | 1 | 0.5 | 1 | 1 | 1 |
| KYN | 1 | 0.75 | 1 | 1 | 1 |
| Segregated Witness | 1 | 1 | 1 | 0.4 | 1 |
| TOTAL | 0.42 | 0.0375 | 0.02 | 0.24 | 0.014 |
| Multiplier | 2.38 | 26.7 | 50 | - | 70 |

(The "Multiplier" row shows how many times larger the block size could be with respect to that specific metric.)
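As a check on the table's arithmetic, the TOTAL row is just the column-wise product of the individual factors, and each multiplier is its reciprocal (the new-node multiplier, left blank in the table, works out to about 4.2):

```python
# Per-improvement factors from the table above (1.0 = no change).
# Columns: bandwidth, broadcast, validation, new-node, storage.
factors = {
    "Pruning":              [1, 1, 1, 1, 0.014],
    "Thin Blocks":          [0.42, 0.1, 0.1, 1, 1],
    "libsecp256k1":         [1, 1, 0.2, 0.6, 1],
    "Serialised Broadcast": [1, 0.5, 1, 1, 1],
    "KYN":                  [1, 0.75, 1, 1, 1],
    "Segregated Witness":   [1, 1, 1, 0.4, 1],
}

totals = [1.0] * 5
for row in factors.values():
    totals = [t * f for t, f in zip(totals, row)]

multipliers = [1 / t for t in totals]
print([round(t, 4) for t in totals])       # [0.42, 0.0375, 0.02, 0.24, 0.014]
print([round(m, 1) for m in multipliers])  # [2.4, 26.7, 50.0, 4.2, 71.4]
```

Multiplying the factors like this assumes the improvements are independent of one another, which is a simplification; in practice some of them overlap.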
 
 
The Factors in Finding a Balanced Solution
 
At the beginning of this post I detailed a relatively simple framework for finding a solution by describing what the problem is. There seem to me to be five main fundamental forces at play in finding a balanced solution: 'node distribution', 'mining decentralisation', 'network utility', 'time' and 'adoption'. The optimal solution needs to find a balance between all of these forces, taking into account a buffer to offset our inability to predict the future with absolute accuracy.
To find a suitable buffer we need to assign a set of red-line values that certain metrics should not pass if we want to make sure bitcoin continues to function at least as well as it does today. For example, the percentage of orphans should stay below a certain value. These values can only be best estimates, due to the complexity of bitcoin economics, although I have tried to provide reasoning as sound as possible.
 
Propagation time
It seems a fair limit for this is roughly what we have now. Bitcoin is still functioning. Could mining be more decentralised? Yes, of course, but bitcoin is working fine right now, so our current propagation time for blocks is a fairly conservative limit to set. Currently 1MB blocks take around 15 seconds to propagate to more than 50% of the network. A 15 second propagation time is the limit I will use in the solution to create a buffer.
 
Orphan Rate
This is obviously a function of propagation time, so the same reasoning applies. I will use a 3% limit on orphan rate in the solution to create a buffer.
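As a sanity check on that red line: under the common simplifying assumption that blocks arrive as a Poisson process with a 600-second mean interval, the chance that a competing block is found while a block is still propagating is roughly 1 - e^(-t/600):

```python
import math

BLOCK_INTERVAL = 600  # seconds, the 10-minute target

def approx_orphan_rate(propagation_secs: float) -> float:
    # probability another block is found while this one is still
    # propagating, assuming Poisson block arrivals (a simplification:
    # it ignores partial propagation and miner hashrate distribution)
    return 1 - math.exp(-propagation_secs / BLOCK_INTERVAL)

print(f"{approx_orphan_rate(15):.2%}")  # ~2.47%, under the 3% red line
```

The 15-second propagation limit chosen above lands just under the 3% orphan-rate limit under this approximation, which is consistent with the two red lines being paired.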
 
Non-Pruned Node Storage Cost
For this I am choosing a limit of $200 in the near-term and $600 in the long-term, based on what I think is a reasonable (maximum) amount for a business or enthusiast to pay to run a full node. As more people use bitcoin and the number of transactions increases, the number of people willing to pay a higher price to run a node will also increase, although the percentage of users willing to do so will decrease. These are of course best-guess values, as there is no way of knowing exactly what percentage of users are willing to pay what.
 
Pruned Node Storage Cost
For this I am choosing a limit of $3 in the near-term (next 5 years) and $9 in the long-term (next 25 years), based on what I think is a reasonable (maximum) amount for a normal bitcoin user to pay. In fact this cost will more likely be zero, as almost all users have some amount of storage free on their computers.
 
Percentage of Downstream Bandwidth Used
This is a best guess at what people who run nodes would be willing to use to stay connected to the bitcoin network directly. I believe using 10% (maximum) of a user's downstream bandwidth is the limit of what is reasonable for a full node (pruned or non-pruned). Most users would continue to access the blockchain via SPV wallets. Downstream is generally a much more valuable resource to a user than upstream, due to the nature of internet usage.
 
Percentage of Upstream Bandwidth Used
This is a best guess at what people who run nodes would be willing to use to stay connected to the bitcoin network directly. I believe using 25% (maximum) of a user's upstream bandwidth is the limit of what is reasonable for a full node (pruned or non-pruned). Most users would continue to access the blockchain via SPV wallets. Upstream is generally a much less valuable resource to a user than downstream, due to the nature of internet usage.
 
Time to Bootstrap a New Node
My limit for this value is 5 days using 50% of downstream bandwidth in the near-term, and 30 days in the long-term. This seems a reasonable number for someone who wants to start running a full node. Currently, opening a new bank account takes at least a week until everything is set up and you have received your cards, so it seems people would be willing to wait this long to become connected. Again, this is a best guess at what people would be willing to do to access the blockchain in the future. Most users, requiring less security, will be able to use an SPV wallet.
It is important to note that we only need enough nodes to make sure the blockchain is distributed across many places, with many backups of the full blockchain. A few thousand nodes is likely the minimum for this. Increasing this to hundreds of thousands or millions of full nodes is not necessarily much of an advantage to node distribution, but could be a significant disadvantage to mining centralisation, because the more nodes there are in the network, the longer it takes to propagate a block to >50% of them.
 
Storage Cost Price Reduction Over Time
Storage cost follows a log-linear trend: HDD costs have fallen 10-fold every 5 years, although this has slowed over the past few years, which can be attributed to the flooding in South East Asia and the transition to SSD technology. SSD costs follow the same log-linear trend of a 10-fold reduction every 5 years, i.e. roughly a 37% decrease per year.
 
Average Upload and Download Bandwidth Increases Over Time
Average upload and download bandwidth increases along a log-linear trend. Both follow the same trend of doubling roughly every two years, i.e. increasing around 40% per year.
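The two trends above convert to annual rates as follows (a small sketch; the 10x-per-5-years and doubling-every-2-years figures are the post's own, and the post rounds the bandwidth rate down to 40%):

```python
def annual_change(factor: float, years: float) -> float:
    """Per-year fractional change for a trend of `factor` over `years`."""
    return factor ** (1 / years) - 1

storage = annual_change(1 / 10, 5)  # storage: 10x cheaper every 5 years
bandwidth = annual_change(2, 2)     # bandwidth: doubling every 2 years

print(f"storage cost {storage:.0%}/year, bandwidth {bandwidth:+.0%}/year")
```

This gives roughly a 37% yearly fall in storage cost and a 41% yearly rise in bandwidth, matching the figures quoted in the two sections above.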
 
Price
I was hesitant to include this one, but I feel it is unavoidable. Contrary to what people say (often when the price is trending downwards), bitcoin's price is an extremely important metric in the long-term. Depending on its price, bitcoin is useful to: enthusiasts -> some users -> small companies -> large companies -> nations -> the world, in roughly that order. The higher bitcoin's price, the more liquid the market will be and the more difficult it will be to move the price, and therefore the greater bitcoin's utility. Bitcoin's price in the long-term is linked to adoption, which seems to happen in waves, as can be seen in the price bubbles over the years. If we are planning for bitcoin to at least become a currency with value equal to one of the world's major currencies, then we need to plan for a market cap and price that reflect that. I personally think there are two useful targets that reflect our aims. The first, lower target is for bitcoin to have the market cap of a major national currency; this would put the market cap at around 2.1 trillion dollars, or $100,000 per bitcoin. The second, higher target is for bitcoin to become the world's major reserve currency; this would give bitcoin a market cap of around 21 trillion dollars and a value of $1,000,000 per bitcoin. A final, much more ambitious target would be bitcoin as the only currency across the world, but I am not sure exactly how this could work, so for now I don't think it is worth considering.
 
As price increases, so does the subsidy reward given to miners who find blocks. This reward is semi-dynamic: it remains static (in btc terms) until 210,000 blocks are found, and the subsidy is then cut in half. This continues until all 21,000,000 bitcoins have been mined. If the value of each bitcoin increases faster than the btc-denominated subsidy decreases, then the USD-denominated reward will, on average, be increasing. Historically the bitcoin price has increased significantly faster than the subsidy has decreased: the btc-denominated subsidy halves roughly every 4 years, but the price of bitcoin has historically increased roughly 50-fold over the same period.
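The arithmetic can be made concrete. This is purely illustrative: the $400 starting price is an assumption, and the 50x-per-halving-period price growth is the rough historical rate cited above, not a prediction:

```python
# If price multiplies 50x per four-year halving period while the BTC
# subsidy halves, the USD-denominated reward grows 25x per period.
subsidy_btc = 25.0  # per block at the time of writing
price_usd = 400.0   # assumed price, for illustration only

for period in range(3):
    print(f"period {period}: {subsidy_btc} BTC * ${price_usd:,.0f} "
          f"= ${subsidy_btc * price_usd:,.0f} per block")
    subsidy_btc /= 2  # next halving
    price_usd *= 50   # assumed historical-rate price growth
```

Each period the USD reward is 50/2 = 25 times the previous one, which is why the halving alone cannot be relied on to force a transition to fees while adoption is still growing.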
 
Bitcoin adoption should follow a rough s-curve, like every other technology adoption: exponential adoption until market saturation starts and adoption slows, then finally the market becomes fully saturated and adoption slowly stops (i.e. bitcoin is fully adopted). If we assume the top of this adoption s-curve corresponds to one of the market caps above (i.e. bitcoin is successful), then we can use this assumption to see how to transition from a subsidy-paid network to a transaction-fee-paid network.
 
Adoption
Adoption is the most difficult metric to determine. In fact it is impossible to determine accurately now, let alone in the future. It is also one of the most important factors: there is no point in building software that no one is going to use, after all. Equally, there is no point in achieving a large amount of adoption if bitcoin offers none of its original value propositions. Clearly there is a balance to be had. Some amount of bitcoin's original value proposition is worth losing in favour of adoption, and some amount of adoption is worth losing to keep bitcoin's original value proposition. A suitable solution should find a good balance between the two. It is clear, though, that any solution must have increased adoption as a basic requirement, otherwise it is not a solution at all.
 
One major factor related to adoption that I rarely see mentioned is stability and predictability. This matters to both end users and businesses. End users rely on stability and predictability so that they do not have to constantly check whether something has changed. When a person withdraws money from a cash machine or spends money in a shop, their experience is almost identical every single time: it is highly dependable. They don't need to keep up to date on how cash machines or shops work to make sure they are not defrauded; they know exactly what is going to happen without expending any effort. The more a user's experience deviates from the standard, and the more often it deviates, the less likely that user is to continue using the service. Users require predictability extending into the past. Businesses, whose bottom lines often depend on reliable services, also require stability and predictability, but predictability that extends into the future so that they can plan. A business is less likely to use a service it does not know it can depend on in the future (or knows it cannot depend on).
For bitcoin to achieve mass adoption it needs a long-term predictable and stable plan for people to rely on.
 
 
The Proposal
 
This proposal is based on determining a best-fit balance of every factor, with a large enough buffer to allow for our inability to perfectly predict the future. No one can predict the future with absolute certainty, but that does not mean we cannot make educated guesses and plan for it.
 
The first part of the proposal is to spend 2016 implementing all available efficiency improvements (i.e. the ones detailed above) and making sure the move to a scaled bitcoin happens as smoothly as possible. We should set a target of implementing all of the above improvements within the first 6 months of 2016. These improvements should be implemented in the first hardfork of its kind, with full community-wide consensus. A hardfork with this much consensus is the perfect time to test and learn from the hardforking mechanism. Thanks to Seg Wit, this would give us an effective 2-fold capacity increase and set us on our path to scalability.
 
The second part of the proposal is to target the release of a second hardfork at the end of 2016. In line with all the above factors, this would start with a real block size limit increase to 2MB (effectively increasing throughput to 4x today's thanks to Seg Wit) and a doubling of the block size limit every two years thereafter (with linear scaling in between). The scaling would end with an 8GB block size limit in the year 2039.
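The schedule's doubling points can be sketched as follows. This shows the stepwise doublings only (the proposal interpolates linearly between them); counting from a 2MB base at the end of 2016, the cap reaches 8GB at the twelfth doubling, and whether that lands in 2039 or 2040 depends on exactly when each doubling is counted:

```python
def block_limit_mb(year: int) -> float:
    """Proposed cap at the doubling years: 2MB at end of 2016,
    doubling every two years (linear scaling in between is omitted)."""
    if year < 2016:
        raise ValueError("schedule starts in 2016")
    return 2.0 * 2 ** ((year - 2016) // 2)

for y in (2016, 2020, 2030, 2040):
    print(y, block_limit_mb(y), "MB")
```

Explicitly writing the schedule down like this is part of the stability-and-predictability argument made earlier: anyone can compute the cap for any future year.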
 
 
How does the Proposal fit inside the Limits
 
 
Propagation time
If trends for average upload and download bandwidth continue, propagation time for a block to reach >50% of the nodes in the network should never go above 1 second. This is significantly quicker than the propagation times we currently see.
In a worst case scenario we can be wrong in the negative direction (i.e. bandwidth does not increase as quickly as predicted) by 15% absolute and 37.5% relative (i.e. bandwidth improves at 25% per year rather than the predicted 40%) and we would still only ever see propagation times similar to today's, and it would take 20 years before that happened.
 
Orphan Rate
Using our best guess predictions the orphan rate would never go over 0.2%.
In a worst case scenario, where we are wrong in our bandwidth prediction in the negative direction by 37.5% relative, the orphan rate would never go above 2.3%, and it would take over 20 years to get there.
 
Non-Pruned Node Storage Cost
Using our best-guess predictions, the cost of storage for a non-pruned full node would never exceed $40 with blocks consistently 50% full, and would in fact decrease significantly after reaching the peak cost. If blocks were consistently 100% full (which is highly unlikely) then the maximum cost of a non-pruned full node would never exceed $90.
In a worst case scenario, where we are wrong in our bandwidth prediction in the negative direction by 37.5% relative and wrong in our storage cost prediction by 20% relative (storage costs decrease by 25% per year instead of the predicted 37%), we would see a max cost to run a node with 50% full blocks of $100 by 2022 and $300 by 2039. If blocks are always 100% full, this max cost rises to $230 by 2022 and $650 by 2039. It is important to note that for storage costs to be this high, bitcoin would have to be enormously successful, meaning many more people (businesses etc.) would be incentivised to run a full node.
 
Pruned Node Storage Cost
Using our best-guess predictions, the cost of storage for a pruned full node would never exceed $0.60 with blocks consistently 50% full. If blocks were consistently 100% full (which is highly unlikely) then the max cost of a pruned full node would never exceed $1.30.
In a worst case scenario, where we are wrong in our bandwidth prediction in the negative direction by 37.5% relative and wrong in our storage cost prediction by 20% relative (storage costs decrease by 25% per year instead of the predicted 37%), we would see a max cost to run a node with 50% full blocks of $1.40 by 2022 and $5 by 2039. If blocks are always 100% full, this max cost rises to $3.20 by 2022 and $10 by 2039. It is important to note that at this amount of storage the cost would be effectively zero, since users almost always have a large amount of free storage space on computers they already own.
 
Percentage of Downstream Bandwidth Used
Using our best-guess predictions, running a full node will never use more than 0.3% of a user's download bandwidth (on average).
In a worst case scenario we can be wrong in the negative direction by 37.5% relative in our bandwidth predictions and we would still only ever see a max download bandwidth use of 4% (average).
 
Percentage of Upstream Bandwidth Used
Using our best-guess predictions, running a full node will never use more than 1.6% of a user's upload bandwidth (on average).
In a worst case scenario we can be wrong in the negative direction by 37.5% relative in our bandwidth predictions and we would only ever see a max upload bandwidth use of 24% (average), and this would take over 20 years to occur.
 
Time to Bootstrap a New Node
Using our best-guess predictions, bootstrapping a new node onto the network should never take more than just over a day using 50% of downstream bandwidth.
In a worst case scenario we can we wrong in the negative direction by 37.5% relative in our bandwidth predictions and it would take one and 1/4 days to bootstrap the blockchain using 50% of the download bandwidth. By 2039 it would take 16 days to bootstrap the entire blockchain when using 50% bandwidth. I think it is important to note that by this point it is very possible the bootstrapping the blockchain could very well be done by simply buying an SSD with blockchain already bootstrapped. 16 days would be a lot of time to download software but it does not necessarily mean a decrease in centralisation. As you will see in the next section, if bitcoin has reached this level of adoption, there may well be many parties will to spend 16 days downloading the blockchain.
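As a rough illustration of the bootstrap-time arithmetic (the ~60GB chain size and 10Mbps line speed below are my own assumed example figures, not numbers from the spreadsheets):

```python
def bootstrap_days(chain_gb: float, down_mbps: float, fraction: float = 0.5) -> float:
    """Days to download the whole chain using a fraction of downstream bandwidth."""
    chain_megabits = chain_gb * 8 * 1000          # GB -> megabits (decimal units)
    seconds = chain_megabits / (down_mbps * fraction)
    return seconds / 86400

# Assumed example: ~60GB chain, 10Mbps line, using 50% of the bandwidth
print(round(bootstrap_days(60, 10), 2))  # → 1.11, i.e. "just over a day"
```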
 
What if Things Turn Out Worse than the Worst Case?
While it is likely that future trends in the technology required to scale bitcoin will continue much as they have in the past, it is possible that the predictions are completely and utterly wrong. This plan takes that into account by making sure the buffer is large enough to give us time to adjust our course. Even if no technological/cost improvements are made to bandwidth and storage in the future (near zero likelihood), this proposal still gives us years to adjust course.
 
 
What Does This Mean for Bitcoin?
 
Significantly Increased Adoption
For comparison, Paypal handles around 285 transactions per second (tps), VISA handles around 2000tps and the total global non-cash transactions are around 12,400tps.
Currently bitcoin is capable of handling a maximum of around 3.5 transactions per second, which are published to the blockchain roughly every 10 minutes. With Seg Wit implemented via a hardfork, bitcoin will be capable of around 7tps. With this proposal bitcoin will be capable of handling more transactions than Paypal (assuming Paypal experiences growth of around 7% per year) in the year 2027. Bitcoin will overtake VISA's transaction capability by the year 2035 and at the end of the growth cycle in 2039 it will be able to handle close to 50% of the total global non-cash transactions.
When you add second-layer protocols (like the LN), sidechains, altcoins and off-chain transactions on top, there should be more than enough capacity for the whole world and every conceivable use for digital value transfer.
 
Transitioning from a Subsidy to a Transaction Fee Model
Currently mining is mostly incentivised by the subsidy that is given by the network (currently 25btc per block). If bitcoin is to be widely successful it is likely that price increases will continue to outweigh btc-denominated subsidy decreases for some time. This means that currently it is likely impossible to force the network into matching a significant portion of the subsidy with fees. The amount of fees being paid to miners has, on average, increased over time and looks like it will continue to do so. It is likely that the optimal time for fees to start seriously replacing the subsidy is when bitcoin adoption starts to slow. Unless you take a pessimistic view of bitcoin (thinking bitcoin is as big as it ever will be), it is reasonable to assume this will not happen for some time.
With this proposal, using an average fee of just $0.05, total transaction fees per day would be:
  • Year 2020 = $90,720
  • Year 2025 = $483,840
  • Year 2030 = $2,903,040
  • Year 2035 = $15,482,880
  • Year 2041 = $123,863,040 (full 8GB blocks)
Miners currently earn a total of around $2 million per day in revenue, significantly less than the $124 million per day in transaction fee revenue possible using this proposal. That also doesn't include the subsidy, which would still play some role until the year 2140. This would be a yearly revenue of over $45 billion for miners even with transaction fees of only $0.05 on average.
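The 2041 figure can be sanity-checked with a quick sketch (the 28,672tps throughput below is my own back-of-envelope assumption: 8192MB blocks at the current ~3.5tps per MB of block space):

```python
def daily_fee_revenue(tps: float, avg_fee_usd: float) -> float:
    """Total fees paid per day at a given throughput and average fee."""
    return tps * 86400 * avg_fee_usd

# Assumed: full 8GB blocks at ~3.5 tps per MB of block space
tps_8gb = 8192 * 3.5                     # = 28672 tps
daily = daily_fee_revenue(tps_8gb, 0.05)
print(f"${daily:,.0f}/day, ${daily * 365 / 1e9:.1f}B/year")
# → $123,863,040/day, $45.2B/year
```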
 
 
Proposal Data
You can use these two spreadsheets (1 - 2) to see the various metrics at play over time. The first spreadsheet shows the data using the predicted trends and the second shows the data with the worst-case trends.
 
 
Summary
 
It's very clear we are on the edge of (or in the midst of) a community split, and possibly a network split. This is a very dangerous situation for bitcoin. A huge divide has appeared in the community and opinions are becoming more and more entrenched on both sides. If we cannot come together and find a way forward it will be bad for everyone except bitcoin's competition and enemies. While this proposal is born from an attempt at finding a balance based on as many relevant factors as possible, it also fortunately happens to fall in between the two sides of the debate. Hopefully the community can see this proposal as a way of making a compromise, releasing the entrenchment and finding a way forward to scale bitcoin. I have no doubt that if we can do this, bitcoin will have enormous success in the years to come.
 
Let's bring bitcoin out of beta together!!
submitted by ampromoco to Bitcoin


2015.12.26 16:50 ampromoco An attempt at a fully comprehensive look at how to scale bitcoin. Let's bring Bitcoin out of Beta!

 
WARNING THIS IS GOING TO BE A REALLY REALLY LONG POST BUT PLEASE READ IT ALL. SCALING BITCOIN IS A COMPLEX ISSUE! HOPEFULLY HAVING ALL THE INFO IN ONE PLACE SHOULD BE USEFUL
 
Like many people in the community I've spent the past month or so looking deeply into the bitcoin scaling debate. I feel there has never been a fully comprehensive thread on how bitcoin could scale. The closest I have seen is gavinandresen's medium posts back in the summer describing the problem and a solution, and pre-emptively answering supposed problems with the solution. While these posts got to the core of the issue and spawned the debate we have been having, they were quite general and could have used more data in support. This is my research and proposal to scale bitcoin and bring the community back together.
 
 
The Problem
 
There seem to me to be five main fundamental forces at play in finding a balanced solution:
  • 'node distribution',
  • 'mining decentralisation',
  • 'network utility',
  • 'time',
  • 'adoption'.
 
 
Node Distribution
Bandwidth has a relationship to node count and therefore 'node distribution'. This is because if bandwidth requirements become too high then fewer people will be able to run a node. To a lesser extent bandwidth also affects 'mining decentralisation', as miners/pool owners also need to be able to run a node. I would argue, though, that the centralisation pressures relating to bandwidth are negligible in comparison to the centralisation pressure caused by the usefulness of larger pools in reducing variance. The cost of a faster internet connection is negligible in comparison to the turnover of the pools. It is important to note the distinction between the bandwidth required to propagate blocks quickly and the bandwidth required to propagate transactions. The bandwidth required to simply propagate transactions is still low today.
New node time (i.e. the time it takes to start up a new node) also has a relationship with node distribution: if it takes too long to start a new node then fewer people will be willing to spend the time and resources to start one.
Storage space also has a relationship with node distribution. If the blockchain takes up too much space on a computer then fewer people will be willing to store the whole blockchain.
Any suitable solution should look to not decrease node distribution significantly.
 
Mining Decentralisation
Broadcast time (the time it takes to upload a block to a peer) has a relationship with mining centralisation pressures. This is because increasing broadcast time increases the propagation time, which increases the orphan rate. If the orphan rate is too high then individual miners will tend towards larger pools.
Validation time (the time it takes to validate a block) has a relationship with mining centralisation pressures. This is because increasing validation time increases the propagation time, which increases the orphan rate. If the orphan rate is too high then individual miners will tend towards larger pools.
Any suitable solution should look to not increase mining centralisation significantly.
 
Network Utility
Network utility is a force that I find is often overlooked and not well understood, but it is equally important. The network utility force acts as a kind of disclaimer to the other two forces; it has a balancing effect. Increasing the network utility will likely increase user adoption (the more useful something is, the more people will want to use it) and therefore decreasing network utility will likely decrease user adoption. User adoption has a relationship with node count: the more people, companies and organisations know about and use bitcoin, the more people, companies and organisations will run nodes. For example, we could reduce the block size down to 10KB, which would reduce broadcast time and validation time significantly and therefore also reduce mining centralisation pressures significantly. What is very important to realise though is that network utility would also be significantly reduced (fewer people able to use bitcoin) and therefore so would node distribution. Conversely, if we increased the block size (not the limit) right now to 10GB, the network utility would be very high as bitcoin would be able to process a large number of transactions, but node distribution would be low and mining centralisation pressures would be high due to the larger resource requirements.
Any suitable solution should look to increase network utility as time increases.
 
Time
Time is an important force because of how technology improves over time. Technology improves in a semi-predictable fashion (often exponential). As we move through time, the cost of the resources required to run the bitcoin network (if resource requirements remained static) will decrease. This means that we are able to increase resource requirements proportionally to technological improvements/cost reductions without any increase in costs to the network. Technological improvements are not perfectly predictable though, so it could be advantageous to allow some buffer room for when technological improvements do not keep up with predictions. This buffer should not be applied at the expense of the balance between the other forces (i.e. make the buffer too big and network utility will be significantly decreased).
 
 
Adoption
Increasing adoption means more people using the bitcoin/blockchain network. The more people use bitcoin the more utility it has, and the more utility bitcoin has the more people will want to use it (network effect). The more people use bitcoin, the more people there are with an incentive to protect bitcoin.
Any suitable solution should look to increase adoption as time increases.
 
 
The Solution Proposed by some of the bitcoin developers - The Lightning Network
 
The Lightning Network (LN) is an attempt at scaling the number of transactions that can happen between parties by not publishing any transaction onto the blockchain unless it is absolutely necessary. This is achieved by having people pool bitcoin together in a "channel"; these people can then transact instantly within that channel. If any shenanigans happen between any of the parties, the channel can be closed and the transactions will be settled on the blockchain. The second part of the plan is to limit the block size to turn bitcoin into a settlement network. The original 1MB block size limit was put in place by Satoshi as an anti-DoS measure, to make sure a bad actor could not propagate a very large block that would crash nodes and increase the size of the blockchain unnecessarily. Certain developers now want to use this 1MB limit in a different way: to make sure that resource requirements stay low, block space always remains full, fees increase significantly and people use the Lightning Network as their main way of transacting rather than the blockchain. They also say that keeping the resource requirements very low will make sure that bitcoin remains decentralised.
 
Problems with The Lightning Network
The LN works relatively well (in theory) when the cost and time to publish a set of transactions to the network are kept low. Unfortunately, when the cost and time to publish a set of transactions on the blockchain become high, the LN's utility is diminished. The trust you get from a transaction on the LN comes only from the trustless nature of having transactions published to the bitcoin network. What this means is that if a transaction cannot be published on the bitcoin network then the LN transaction is not secured at all. As transaction fees rise on the bitcoin blockchain, the LN's utility is diminished. Let's take an example:
  • Cost of publishing a transaction to the bitcoin network = $20
  • LN transaction between Bob and Alice = $20.
  • Transaction between Bob and Alice has problem therefore we want to publish it to the blockchain.
  • Amount of funds left after transaction is published to the blockchain = $20 - $20 = $0.
This is also not a binary situation. If, for example, the cost to publish the transaction to the blockchain in this scenario was $10, then still only 50% of the transaction would be secure. It is unlikely anyone would really call this a secure transaction.
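The worked example above amounts to this simple calculation (a sketch; the "secured fraction" framing is mine):

```python
def secured_fraction(amount_usd: float, publish_fee_usd: float) -> float:
    """Fraction of an LN transaction's value recoverable on-chain after fees."""
    return max(0.0, (amount_usd - publish_fee_usd) / amount_usd)

print(secured_fraction(20, 20))  # → 0.0  (the fee eats the whole transaction)
print(secured_fraction(20, 10))  # → 0.5  (only half the value is secured)
```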
Will a user make a non-secured/poorly secured transaction on the LN when they could make the same transaction via an altcoin or a non-cryptocurrency system and have it well secured? It's unlikely. What is much more likely is that transactions that are not secured by bitcoin, because of the cost to publish to the blockchain, will simply overflow into altcoins or will not happen on any cryptocurrency network at all. The reality is, though, that we don't know exactly what will happen because there is no precedent for it.
Another problem outside of security is convenience. With a highly oversaturated block space (a very large backlog of transactions) it could take months to have a transaction published to the blockchain. During this time your funds will simply be stuck. If you want to buy a coffee from a shop you don't have a channel open with, instead of simply paying with bitcoin directly you would have to wait months to open a channel by publishing a transaction to the bitcoin blockchain. I think your coffee might be a little cold by then (and mouldy).
I suggest reading this excellent post HERE for other rather significant problems with the LN when people are forced to use it.
The LN is currently not complete and, due to its high complexity, it will take some time to achieve industry-wide implementation. If it is implemented on top of a bitcoin-as-a-settlement-network economy it will likely have very little utility.
 
Uses of The LN
The LN is actually an extremely useful layer-2 technology when it is used to its strengths. When the bitcoin blockchain is fast and cheap to transact on, the LN is also extremely useful. One of the major uses for the LN is trust-based transactions. If you are transacting often between a set of parties you can truly trust then using the LN makes absolute sense, since the trustless model of bitcoin is not necessary. Then, once you require your funds to be unlocked again, it only takes a short time and a small cost to open them up to the full bitcoin network. Another excellent use of the LN would be for layer-3 apps. For example a casino app: anyone can buy into the casino channel and play using real bitcoins instantly, in the knowledge that if anything nefarious happens you can instantly settle and unlock your funds. Another example would be a computer game where you can use real bitcoin in game; the only difference is that you connect to the game's LN channel and can transact instantly and cheaply. Then whenever you want to unlock your funds you can settle on the blockchain and use your bitcoins normally again.
The LN is hugely more powerful the more powerful bitcoin is. The people making the LN need to stick with its strengths rather than sell it as an all-in-one solution to bitcoin's scaling problem. It is just one piece of the puzzle.
 
 
Improving Network Efficiency
 
The more efficient the network, the more we can do with what we already have. There are a number of possible efficiency improvements to the network and each of them has a slightly different effect.
 
Pruning
Pruning allows the stored blockchain size to be reduced significantly by not storing old data. This has the effect of lowering the resource requirements of running a node. A 40GB unpruned blockchain would be reduced in size to 550MB. (It is important to note that a pruned node has lower utility to the network.)
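That 550MB-out-of-40GB ratio is where the ~0.014 storage factor in the table further down comes from; a quick check:

```python
pruned_gb = 0.55   # ~550MB kept by a pruned node
full_gb = 40.0     # full blockchain size used in the post

storage_factor = pruned_gb / full_gb
print(round(storage_factor, 3))  # → 0.014
```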
 
Thin Blocks
Thin blocks use the fact that most of the nodes in the network already have a list of almost all the same transactions ready to be put into the blockchain before a block is found. If all nodes use the same/similar policy for which transactions to include in a block then you only need to broadcast a small amount of information across the network for all nodes to know which transactions have been included (as opposed to broadcasting a list of all transactions included in the block). Thin blocks have the advantage of reducing propagation time, which lowers the mining centralisation pressure due to orphaned blocks.
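A rough sketch of why this saves bandwidth (the average transaction size and the use of a 32-byte txid as the reference are illustrative assumptions, not protocol constants):

```python
# Assumed illustrative figures: ~500-byte average transaction,
# 32-byte txid used to reference a transaction the peer already has.
avg_tx_bytes = 500
txid_bytes = 32
txs_per_1mb_block = 1_000_000 // avg_tx_bytes   # = 2000 transactions

full_block = txs_per_1mb_block * avg_tx_bytes   # ~1MB of full transactions
thin_block = txs_per_1mb_block * txid_bytes     # ~64KB of references

print(thin_block / full_block)  # → 0.064, i.e. ~6% of the full block
```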
 
libsecp256k1
libsecp256k1 allows a more efficient way of validating transactions. This means that propagation time is reduced, which lowers the mining centralisation pressure due to orphaned blocks. It also means reduced time to bootstrap the blockchain for a new node.
 
Serialised Broadcast
Currently, block transmission to peers happens in parallel to all connected peers. For block propagation this is obviously a poor choice in comparison to serial transmission to each peer one by one. Using parallel transmission means that the more peers you have, the slower the propagation, whereas serial transmission does not suffer this problem. The problem that serial transmission does suffer from, though, is variance. If the order in which you send blocks to peers is random, then sometimes you will send blocks first to a peer with a slow/fast connection and/or slow/quick validation. This means the average propagation time would increase with serialised transmission, but depending on your luck you would sometimes have faster propagation and sometimes slower. As this will lower propagation time overall, it will also lower the mining centralisation pressure due to orphaned blocks. (This is just a concept at the moment but I don't see why it couldn't be implemented.)
 
Serialised Broadcast Sorting
This is a fix for the variance that would occur due to serialised broadcast. It sorts the order in which you broadcast a block to your peers: fastest upload + validation speed first, slowest upload + validation speed last. This not only decreases the variance to zero but also allows block propagation to happen much faster. This also has the effect of lowering the mining centralisation pressure due to orphaned blocks. (This is just a concept at the moment but I don't see why it couldn't be implemented.)
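A toy model of the parallel-vs-serial difference (my own simplification: one block of a given size, equal upload bandwidth everywhere, recipients relay as soon as they have the block, validation ignored):

```python
import math

def parallel_broadcast(n_peers: int, block_mbits: float, upload_mbps: float) -> float:
    """Sender splits its upload across all peers at once: every peer
    finishes at the same time, n times slower than a single transfer."""
    return n_peers * block_mbits / upload_mbps

def serial_relay(n_nodes: int, block_mbits: float, upload_mbps: float) -> float:
    """Each node sends serially and recipients relay onwards, so coverage
    roughly doubles every transfer: ~log2(n) transfers to reach everyone."""
    transfers = math.ceil(math.log2(n_nodes)) if n_nodes > 1 else 0
    return transfers * block_mbits / upload_mbps

# 8 nodes, an 8-megabit (1MB) block, 10Mbps upload
print(parallel_broadcast(8, 8, 10))  # → 6.4 (seconds)
print(serial_relay(8, 8, 10))        # → 2.4 (seconds)
```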
 
Here is a table below that shows roughly what the effects these solutions should have.
| Name | Bandwidth | Broadcast Time | Validation Time | New Node Time | Storage Space |
|---|---|---|---|---|---|
| Pruning | 1 | 1 | 1 | 1 | 0.014 |
| Thin Blocks | 0.42 | 0.1 | 0.1 | 1 | 1 |
| libsecp256k1 | 1 | 1 | 0.2 | 0.6 | 1 |
| Serialised Broadcast | 1 | 0.5 | 1 | 1 | 1 |
| KYN | 1 | 0.75 | 1 | 1 | 1 |
| Segregated Witness | 1 | 1 | 1 | 0.4 | 1 |
| TOTAL | 0.42 | 0.0375 | 0.02 | 0.24 | 0.014 |
| Multiplier | 2.38 | 26.7 | 50 | - | 70 |
(The "multiplier" shows how many times higher the block size could be relative to the specific function.)
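The TOTAL row is just the column-wise product of the individual factors, and the multiplier its reciprocal; a quick reproduction:

```python
from math import prod

# Resource-use factors from the table above (1.0 = no change), one list
# per column, in row order: Pruning, Thin Blocks, libsecp256k1,
# Serialised Broadcast, KYN, Segregated Witness.
factors = {
    "Bandwidth":       [1, 0.42, 1, 1, 1, 1],
    "Broadcast Time":  [1, 0.1, 1, 0.5, 0.75, 1],
    "Validation Time": [1, 0.1, 0.2, 1, 1, 1],
    "New Node Time":   [1, 1, 0.6, 1, 1, 0.4],
    "Storage Space":   [0.014, 1, 1, 1, 1, 1],
}

for name, col in factors.items():
    total = prod(col)
    print(f"{name}: total={total:g}, multiplier={1 / total:.3g}")
# e.g. Broadcast Time: total=0.0375, multiplier=26.7
```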
 
 
The Factors in Finding a Balanced Solution
 
At the beginning of this post I detailed a relatively simple framework for finding a solution by describing what the problem is. There seem to me to be five main fundamental forces at play in finding a balanced solution: 'node distribution', 'mining decentralisation', 'network utility', 'time' and 'adoption'. The optimal solution needs to find a balance between all of these forces, taking into account a buffer to offset our inability to predict the future with absolute accuracy.
To find a suitable buffer we need to assign a set of red-line values which certain metrics should not pass if we want to make sure bitcoin continues to function as well as it does today (at a minimum). For example, the percentage of orphans should stay below a certain value. These values can only be a best estimate due to the complexity of bitcoin economics, although I have tried to provide reasoning as sound as possible.
 
Propagation time
It seems a fair limit for this would be roughly what we have now. Bitcoin is still functioning now. Could mining be more decentralised? Yes, of course, but it seems bitcoin is working fine right now and therefore our current propagation time for blocks is a fairly conservative limit to set. Currently 1MB blocks take around 15 seconds to propagate to more than 50% of the network. A 15 second propagation time is what I will be using as a limit in the solution to create a buffer.
 
Orphan Rate
This is obviously a value that is a function of propagation time so the same reasoning should be used. I will use a 3% limit on orphan rate in the solution to create a buffer.
 
Non-Pruned Node Storage Cost
For this I am choosing a limit of $200 in the near-term and $600 in the long-term. I have chosen these values based on what I think is a reasonable (maximum) amount for a business or enthusiast to pay to run a full node. As the number of transactions increases with more people using bitcoin, the number of people willing to pay a higher price to run a node will also increase, although the percentage of users will decrease. These are of course best-guess values, as there is no way of knowing exactly what percentage of users are willing to pay what.
 
Pruned Node Storage Cost
For this I am choosing a limit of $3 in the near-term (next 5 years) and $9 in the long-term (next 25 years). I have chosen these values based on what I think is a reasonable (maximum) amount for a normal bitcoin user to pay. In fact this cost will more likely be zero, as almost all users have some amount of storage free on their computers.
 
Percentage of Downstream Bandwidth Used
This is a best guess at what I think people who run nodes would be willing to use to be connected to the bitcoin network directly. I believe using 10% (maximum) of a user's downstream bandwidth is the limit of what is reasonable for a full node (pruned and non-pruned). Most users would continue to access the blockchain via SPV wallets though. Downstream is generally a much more valuable resource to a user than upstream due to the nature of internet usage.
 
Percentage of Upstream Bandwidth Used
This is a best guess at what I think people who run nodes would be willing to use to be connected to the bitcoin network directly. I believe using 25% (maximum) of a user's upstream bandwidth is the limit of what is reasonable for a full node (pruned and non-pruned). Most users would continue to access the blockchain via SPV wallets though. Upstream is generally a much less valuable resource to a user than downstream due to the nature of internet usage.
 
Time to Bootstrap a New Node
My limit for this value is 5 days using 50% of downstream bandwidth in the near-term and 30 days in the long-term. This seems like a reasonable number to me for someone who wants to start running a full node. Currently opening a new bank account takes at least a week until everything is set up and you have received your cards, so it seems to me people would be willing to wait this long to become connected. Again, this is a best guess on what people would be willing to do to access the blockchain in the future. Most users, requiring less security, will be able to use an SPV wallet.
It is important to note that we only need enough nodes to make sure the blockchain is distributed across many places with many backups of the full blockchain. It is likely that a few thousand is a minimum for this. Increasing this amount to hundreds of thousands or millions of full nodes is not necessarily much of an advantage to node distribution, but could be a significant disadvantage to mining centralisation. This is because the more nodes you have in the network, the longer it takes to propagate a block to >50% of it.
 
Storage Cost Price Reduction Over Time
Storage cost follows a log-linear trend: HDD costs have reduced by 10 times every 5 years, although this has slowed over the past few years. This can be attributed to the flooding in South East Asia and the transition to SSD technology. SSD technology also follows the log-linear trend of costs reducing 10 times every 5 years, or roughly decreasing 37% per year.
 
Average Upload and Download Bandwidth Increases Over Time
Average upload and download bandwidth also increase in a log-linear trend. Both upload and download bandwidth have been doubling roughly every two years, or increasing 40% per year.
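Both per-year rates are consistent with their multi-year versions, which is easy to verify:

```python
# Storage: a 37%/year cost decrease compounds to ~10x cheaper in 5 years
storage_factor_5yr = (1 - 0.37) ** 5
print(storage_factor_5yr)   # ≈ 0.0992, i.e. about 1/10 of the cost

# Bandwidth: a 40%/year increase compounds to ~2x in 2 years
bandwidth_factor_2yr = 1.40 ** 2
print(bandwidth_factor_2yr)  # ≈ 1.96, i.e. roughly a doubling
```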
 
Price
I was hesitant to include this one here but I feel it is unavoidable. Contrary to what people say (often when the price is trending downwards), bitcoin's price is an extremely important metric in the long-term. Depending on its price, bitcoin is useful to: enthusiasts -> some users -> small companies -> large companies -> nations -> the world, in roughly that order. The higher bitcoin's price is, the more liquid the market will be and the more difficult it will be to move the price, therefore increasing bitcoin's utility. Bitcoin's price in the long-term is linked to adoption, which seems to happen in waves, as can be seen in the price bubbles over the years. If we are planning/aiming for bitcoin to at least become a currency with value equal to one of the world's major currencies then we need to plan for a market cap and price that reflect that. I personally think there are two useful targets we should use to reflect our aims. The first, lower target is for bitcoin to have a market cap the size of a major national currency. This would put the market cap at around 2.1 trillion dollars, or $100,000 per bitcoin. The second, higher target is for bitcoin to become the world's major reserve currency. This would give bitcoin a market cap of around 21 trillion dollars and a value of $1,000,000 per bitcoin. A final, and much more difficult, target is bitcoin as the only currency across the world, but I am not sure exactly how this could work so for now I don't think it is worth considering.
 
As price increases, so does the subsidy reward given out to miners who find blocks. This reward is semi-dynamic in that it remains static (in btc terms) until 210,000 blocks are found, at which point the subsidy is cut in half. This continues to happen until all 21,000,000 bitcoins have been mined. If the value of each bitcoin increases faster than the btc-denominated subsidy decreases, then the USD-denominated reward will on average be increasing. Historically the bitcoin price has increased significantly faster than the subsidy decreases. The btc-denominated subsidy halves roughly every 4 years but the price of bitcoin has historically increased roughly 50-fold in the same time.
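Under the post's historical figures (subsidy halves each ~4-year period while price rises ~50-fold), the USD-denominated reward per block would grow 25x per period; a sketch with hypothetical starting numbers:

```python
def usd_reward_per_block(initial_btc: float, initial_price: float, periods: int,
                         price_mult: float = 50.0) -> float:
    """USD block reward after a number of ~4-year halving periods,
    assuming the historical ~50x price rise per period continues."""
    btc_subsidy = initial_btc / (2 ** periods)        # halves each period
    price = initial_price * (price_mult ** periods)   # 50x each period
    return btc_subsidy * price

# Hypothetical example: 25 BTC/block at a $400 price
base = usd_reward_per_block(25, 400, 0)   # $10,000 per block today
one = usd_reward_per_block(25, 400, 1)    # $250,000 after one period
print(one / base)                         # → 25.0
```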
 
Bitcoin adoption should follow a roughly s-curve dynamic like every other technology adoption: exponential adoption until market saturation starts and adoption slows, and finally the market becomes fully saturated and adoption slowly stops (i.e. bitcoin is fully adopted). If we assume the top of this adoption s-curve is at one of the market caps above (i.e. bitcoin is successful) then we can use this assumption to see how we can transition from a subsidy-paid network to a transaction-fee-paid network.
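The s-curve described here is the standard logistic function; a minimal sketch using the $21T reserve-currency target as the ceiling (the midpoint year and steepness are purely illustrative assumptions, not predictions):

```python
import math

def adoption(year: float, cap: float = 21e12, midpoint: float = 2030,
             steepness: float = 0.4) -> float:
    """Logistic adoption curve: hypothetical market cap in USD at a given year.
    cap/midpoint/steepness are illustrative assumptions."""
    return cap / (1 + math.exp(-steepness * (year - midpoint)))

for y in (2016, 2030, 2045):
    print(y, f"${adoption(y) / 1e12:.2f}T")
# Early years grow exponentially, the midpoint sits at cap/2,
# and late years flatten out towards the cap.
```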
 
Adoption
Adoption is the most difficult metric to determine. In fact it is impossible to determine accurately now, let alone in the future. It is also one of the most important factors. There is no point in building software that no one is going to use, after all. Equally, there is no point in achieving a large amount of adoption if bitcoin offers none of its original value propositions. Clearly there is a balance to be had. Some amount of bitcoin's original value proposition is worth losing in favour of adoption, and some amount of adoption is worth losing to keep bitcoin's original value proposition. A suitable solution should find a good balance between the two. It is clear though that any solution must have increased adoption as a basic requirement, otherwise it is not a solution at all.
 
One major factor related to adoption that I rarely see mentioned is stability and predictability. This is relevant to both end users and businesses. End users rely on stability and predictability so that they do not have to constantly check if something has changed. When a person goes to get money from a cash machine or spend money in a shop, their experience is almost identical every single time. It is highly dependable. They don't need to keep up-to-date on how cash machines or shops work to make sure they are not defrauded. They know exactly what is going to happen without having to expend any effort. The more a user's experience deviates from the standard, and the more often it does so, the less likely the user is to continue using that service. Users require predictability extending into the past. Businesses, whose bottom line is often dependent on reliable services, also require stability and predictability. Businesses require predictability that extends into the future so that they can plan. A business is less likely to use a service it does not know it can depend on in the future (or knows it cannot depend on).
For bitcoin to achieve mass adoption it needs a long-term predictable and stable plan for people to rely on.
 
 
The Proposal
 
This proposal is based on determining a best-fit balance of every factor, with a large enough buffer to allow for our inability to perfectly predict the future. No one can predict the future with absolute certainty, but that does not mean we cannot make educated guesses and plan for it.
 
The first part of the proposal is to spend 2016 implementing all available efficiency improvements (i.e. the ones detailed above) and making sure the move to a scaled bitcoin happens as smoothly as possible. It seems we should set a target of implementing all of the above improvements within the first 6 months of 2016. These improvements should be implemented in the first hardfork of its kind, with full community-wide consensus. A hardfork with this much consensus is the perfect time to test and learn from the hardforking mechanism. Thanks to Seg Wit, this would give us an effective 2-fold capacity increase and set us on our path to scalability.
 
The second part of the proposal is to target the release of a second hardfork at the end of 2016. In line with all the above factors, this would start with a real block size limit increase to 2MB (effectively increasing throughput to 4x today's thanks to Seg Wit) and a doubling of the block size limit every two years thereafter (with linear scaling in between). The scaling would end with an 8GB block size limit in the year 2039.
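A sketch of the stated schedule (my own rendering of "2MB in 2016, doubling every two years, linear scaling in between, capped at 8GB"; note this literal rendering reaches the cap around 2040, close to the post's 2039 end point):

```python
def block_size_limit_mb(year: int) -> float:
    """Proposed block size limit in MB for a given year."""
    if year <= 2016:
        return 2.0
    last_doubling = 2016 + (year - 2016) // 2 * 2     # most recent doubling year
    base = 2.0 * 2 ** ((last_doubling - 2016) // 2)   # limit at that doubling
    limit = base + (year - last_doubling) / 2 * base  # linear in between
    return min(limit, 8192.0)                         # cap at 8GB

for y in (2016, 2018, 2022, 2040):
    print(y, block_size_limit_mb(y))  # → 2.0, 4.0, 16.0, 8192.0
```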
 
 
How does the Proposal fit inside the Limits
 
 
Propagation time
If trends for average upload and download bandwidth continue, then the propagation time for a block to reach >50% of the nodes in the network should never go above 1 second. This is significantly quicker than the propagation times we currently see.
In a worst-case scenario we could be wrong in the negative direction (i.e. bandwidth does not increase as quickly as predicted) by 15% absolute and 37.5% relative (i.e. bandwidth improves at a rate of 25% per year rather than the predicted 40%) and we would still only ever see propagation times similar to today's, and it would take 20 years before this would happen.
 
Orphan Rate
Using our best guess predictions, the orphan rate would never go over 0.2%.
In a worst case scenario where we are wrong in our bandwidth prediction in the negative direction by 37.5% relative, the orphan rate would never go above 2.3%, and it would take over 20 years for this to happen.
 
Non-Pruned Node Storage Cost
Using our best guess predictions, the cost of storage for a non-pruned full node would never exceed $40 with blocks consistently 50% full, and would in fact decrease significantly after reaching the peak cost. If blocks were consistently 100% full (which is highly unlikely) then the maximum cost of a non-pruned full node would never exceed $90.
In a worst case scenario, where we are wrong in our bandwidth prediction in the negative direction by 37.5% relative and wrong in our storage cost prediction by 20% relative (storage cost decreases by 25% per year instead of the predicted 37% per year), we would see a max cost to run a node with 50% full blocks of $100 by 2022 and $300 by 2039. If blocks are always 100% full then this max cost rises to $230 by 2022 and $650 in 2039. It is important to note that for storage costs to be this high, bitcoin will have to be enormously successful, meaning many, many more people (businesses etc.) will be incentivised to run a full node.
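A rough sketch of how such storage-cost figures can be derived (my reconstruction; the $30,000-per-PB 2016 price, i.e. $0.03/GB, is taken from the table later in this post, and the decline rates are the ones quoted above):

```python
# Node storage cost = blockchain size on disk x price per GB, with the
# per-GB price falling 37%/yr (base case) or 25%/yr (worst case).
def storage_cost_usd(chain_gb: float, year: int,
                     start_price_per_gb: float = 0.03,  # $30,000/PB in 2016
                     yearly_price_drop: float = 0.37,
                     base_year: int = 2016) -> float:
    price = start_price_per_gb * (1 - yearly_price_drop) ** (year - base_year)
    return chain_gb * price

# A 1,500GB chain in 2021 (the Jan 2021 row of the table later in this post):
print(round(storage_cost_usd(1500, 2021), 2))  # 4.47, close to the table's $4.5
```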
 
Pruned Node Storage Cost
Using our best guess predictions, the cost of storage for a pruned full node would never exceed $0.60 with blocks consistently 50% full. If blocks were consistently 100% full (which is highly unlikely) then the max cost of a pruned full node would never exceed $1.30.
In a worst case scenario, where we are wrong in our bandwidth prediction in the negative direction by 37.5% relative and wrong in our storage cost prediction by 20% relative (storage cost decreases by 25% per year instead of the predicted 37% per year), we would see a max cost to run a node with 50% full blocks of $1.40 by 2022 and $5 by 2039. If blocks are always 100% full then this max cost rises to $3.20 by 2022 and $10 in 2039. It is important to note that at this amount of storage the cost would be effectively zero, since users almost always have a large amount of free storage space on computers they already own.
 
Percentage of Downstream Bandwidth Used
Using our best guess predictions, running a full node will never use more than 0.3% of a user's download bandwidth (on average).
In a worst case scenario we could be wrong in the negative direction by 37.5% relative in our bandwidth predictions, and we would still only ever see a max download bandwidth use of 4% (on average).
 
Percentage of Upstream Bandwidth Used
Using our best guess predictions, running a full node will never use more than 1.6% of a user's upload bandwidth (on average).
In a worst case scenario we could be wrong in the negative direction by 37.5% relative in our bandwidth predictions, and we would only ever see a max upload bandwidth use of 24% (on average), and this would take over 20 years to occur.
 
Time to Bootstrap a New Node
Using our best guess predictions, bootstrapping a new node onto the network should never take more than just over a day using 50% of download bandwidth.
In a worst case scenario, where we are wrong in the negative direction by 37.5% relative in our bandwidth predictions, it would take one and a quarter days to bootstrap the blockchain using 50% of the download bandwidth. By 2039 it would take 16 days to bootstrap the entire blockchain when using 50% bandwidth. It is important to note that by this point, bootstrapping the blockchain could very well be done by simply buying an SSD with the blockchain already on it. 16 days would be a lot of time to download software, but it does not necessarily mean an increase in centralisation. As you will see in the next section, if bitcoin has reached this level of adoption, there may well be many parties willing to spend 16 days downloading the blockchain.
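Bootstrap time is just chain size divided by usable bandwidth. A sketch of the arithmetic (my reconstruction, treating 1TB as 10^6 MB), which also reproduces the NNDL column of the BIP101 table later in this post:

```python
# Days to download the full chain while using only a fraction of the
# available download bandwidth.
def bootstrap_days(chain_tb: float, bandwidth_mb_s: float,
                   fraction: float = 0.5) -> float:
    seconds = (chain_tb * 1e6) / (bandwidth_mb_s * fraction)
    return seconds / 86400

# Jan 2039 row of the table later in this post: 929.3TB chain, 4,344MB/s down.
print(round(bootstrap_days(929.3, 4344), 2))  # 4.95, matching that row's NNDL
```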
 
What if Things Turn Out Worse than the Worst Case?
While it is likely that future trends in the technology required to scale bitcoin will continue much as they have in the past, it is possible that the predictions are completely and utterly wrong. This plan takes that into account by making sure the buffer is large enough to give us time to adjust our course. Even if no technological/cost improvements are made to bandwidth and storage in the future (a near-zero likelihood), this proposal still gives us years to adjust course.
 
 
What Does This Mean for Bitcoin?
 
Significantly Increased Adoption
For comparison, Paypal handles around 285 transactions per second (tps), VISA handles around 2000tps and the total global non-cash transactions are around 12,400tps.
Currently bitcoin is capable of handling a maximum of around 3.5 transactions per second, published to the blockchain in blocks roughly every 10 minutes. With Seg Wit implemented via a hardfork, bitcoin will be capable of around 7tps. With this proposal, bitcoin will be capable of handling more transactions than Paypal (assuming Paypal experiences growth of around 7% per year) in the year 2027. Bitcoin will overtake VISA's transaction capability by the year 2035, and at the end of the growth cycle in 2039 it will be able to handle close to 50% of the total global non-cash transactions.
When you add second layer protocols (like the Lightning Network), sidechains, altcoins and off-chain transactions on top, there should be more than enough capacity for the whole world and every conceivable use for digital value transfer.
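The PayPal comparison above is easy to sanity-check (my arithmetic; the 285tps baseline and the 7%-per-year growth figure come from the text):

```python
# PayPal throughput in 2027, assuming ~285tps in 2016 growing 7% per year.
paypal_2016_tps = 285
paypal_2027_tps = paypal_2016_tps * 1.07 ** (2027 - 2016)
print(round(paypal_2027_tps))  # 600 -> the rough tps bitcoin must beat by 2027
```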
 
Transitioning from a Subsidy to a Transaction Fee Model
Currently, mining is mostly incentivised by the subsidy given by the network (currently 25btc per block). If bitcoin is to be widely successful, it is likely that price increases will continue to outweigh btc-denominated subsidy decreases for some time. This means it is currently likely impossible to force the network into matching a significant portion of the subsidy with fees. The amount of fees paid to miners has on average increased over time and looks like it will continue to do so. The optimal time for fees to start seriously replacing the subsidy is likely when bitcoin adoption starts to slow. Unless you take a pessimistic view of bitcoin (thinking bitcoin is as big as it will ever be), it is reasonable to assume this will not happen for some time.
With this proposal, using an average fee of just $0.05, total transaction fees per day would be:
  • Year 2020 = $90,720
  • Year 2025 = $483,840.00
  • Year 2030 = $2,903,040.00
  • Year 2035 = $15,482,880.00
  • Year 2041 = $123,863,040.00 (full 8GB Blocks)
Miners currently earn a total of around $2 million per day in revenue, significantly less than the $124 million per day in transaction fee revenue possible under this proposal. That also doesn't include the subsidy, which would still play some role until the year 2140. This transaction fee revenue would amount to a yearly revenue of around $45 billion for miners, with transaction fees of only $0.05 on average.
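The fee-revenue figures above fall out of simple multiplication (my reconstruction, using the text's $0.05 average fee and the ~3.5tps-per-MB throughput implied elsewhere in the post):

```python
# Daily fee revenue = transactions per second x seconds per day x avg fee.
AVG_FEE_USD = 0.05

def daily_fee_revenue(tps: float, avg_fee: float = AVG_FEE_USD) -> float:
    return tps * 86_400 * avg_fee

# Full 8GB blocks at ~3.5tps per MB -> 8,192 x 3.5 = 28,672tps:
print(daily_fee_revenue(28_672))              # ~123.9 million dollars per day
print(daily_fee_revenue(28_672) * 365 / 1e9)  # ~45.2 -> ~$45B per year
```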
 
 
Proposal Data
You can use these two spreadsheets (1 - 2 ) to see the various metrics at play over time. The first spreadsheet shows the data using the predicted trends and the second spreadsheet shows the data with the worst case trends.
 
 
Summary
 
It's very clear we are on the edge, or in the midst, of a community (and possibly a network) split. This is a very dangerous situation for bitcoin. A huge divide has appeared in the community, and opinions are becoming more and more entrenched on both sides. If we cannot come together and find a way forward, it will be bad for everyone except bitcoin's competitors and enemies. While this proposal is born from an attempt at finding a balance based on as many relevant factors as possible, it also happens to fall between the two sides of the debate. Hopefully the community can see this proposal as a way of making a compromise, releasing the entrenchment and finding a way forward to scale bitcoin. I have no doubt that if we can do this, bitcoin will have enormous success in the years to come.
 
Let's bring bitcoin out of beta together!!
submitted by ampromoco to bitcoinone [link] [comments]


2015.11.13 20:10 singularity87 Some research on BIP101 starting with 4MB instead of 8MB (Very promising)

| Date | BSL | BS | TPS | MD | BcS | $ per PB | HDD Total | ABDL | ABUL | %DL | %UL | UL per Block | PT | NNDL |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Jan 2016 | 4 | 1 | 7 | 7GB | 54GB | $30,000 | $1.63 | 1.5MB/s | 0.625MB/s | 0.11% | 1.48% | 0.56MB | 0.90s | 0.84 |
| Jan 2017 | 6 | 2.5 | 17.5 | 17GB | 150GB | $18,928 | $2.84 | 2.12MB/s | 0.884MB/s | 0.20% | 2.62% | 1.4MB | 1.58s | 1.64 |
| Jan 2018 | 8 | 4 | 28 | 28GB | 324GB | $11,943 | $3.87 | 3MB/s | 1.25MB/s | 0.22% | 2.97% | 2.24MB | 1.79s | 2.5 |
| Jan 2019 | 12 | 6 | 42 | 42GB | 592GB | $7,535 | $4.46 | 4.24MB/s | 1.77MB/s | 0.24% | 3.15% | 3.36MB | 1.90s | 3.23 |
| Jan 2020 | 16 | 8 | 56 | 56GB | 965GB | $4,755 | $4.59 | 6MB/s | 2.5MB/s | 0.22% | 2.97% | 4.48MB | 1.79s | 3.72 |
| Jan 2021 | 24 | 12 | 84 | 84GB | 1,500GB | $3,000 | $4.5 | 8.5MB/s | 3.5MB/s | 0.24% | 3.15% | 6.72MB | 1.90s | 4.09 |
| Jan 2022 | 32 | 16 | 112 | 111GB | 2,245GB | $1,893 | $4.25 | 12MB/s | 5MB/s | 0.22% | 2.97% | 8.96MB | 1.79s | 4.33 |
| Jan 2023 | 48 | 24 | 168 | 167GB | 3,316GB | $1,194 | $3.96 | 17MB/s | 7MB/s | 0.24% | 3.15% | 13.44MB | 1.90s | 4.52 |
| Jan 2024 | 64 | 32 | 224 | 223GB | 4,806GB | $754 | $3.62 | 24MB/s | 10MB/s | 0.22% | 2.97% | 17.92MB | 1.79s | 4.64 |
| Jan 2025 | 96 | 48 | 336 | 334GB | 6,947GB | $475 | $3.3 | 33.94MB/s | 14.14MB/s | 0.24% | 3.15% | 26.88MB | 0.90s | 4.74 |
| Jan 2026 | 128 | 64 | 448 | 446GB | 9,929GB | $300 | $2.98 | 48MB/s | 20MB/s | 0.22% | 2.97% | 35.84MB | 1.79s | 4.79 |
| Jan 2027 | 192 | 96 | 672 | 669GB | 14.2TB | $189 | $2.69 | 67.88MB/s | 28.28MB/s | 0.24% | 3.15% | 53.76MB | 1.90s | 4.85 |
| Jan 2028 | 256 | 128 | 896 | 891GB | 20.2TB | $119 | $2.31 | 96MB/s | 40MB/s | 0.22% | 2.97% | 71.68MB | 1.79s | 4.86 |
| Jan 2029 | 384 | 192 | 1,344 | 1,337GB | 28.7TB | $75.4 | $2.17 | 135.8MB/s | 56.57MB/s | 0.24% | 3.15% | 107.5MB | 1.90s | 4.90 |
| Jan 2030 | 512 | 256 | 1,792 | 1,783GB | 40.7TB | $47.6 | $1.93 | 192MB/s | 80MB/s | 0.22% | 2.97% | 143.4MB | 1.79s | 4.90 |
| Jan 2031 | 768 | 384 | 2,688 | 2,674GB | 57.8TB | $30 | $1.73 | 271.5MB/s | 113.1MB/s | 0.24% | 3.15% | 215MB | 1.90s | 4.93 |
| Jan 2032 | 1,024 | 512 | 3,584 | 3,565GB | 81.6TB | $18.93 | $1.55 | 384MB/s | 160MB/s | 0.22% | 2.97% | 286.7MB | 1.79s | 4.92 |
| Jan 2033 | 1,536 | 768 | 5,376 | 5,348GB | 115.9TB | $11.94 | $1.38 | 543MB/s | 226MB/s | 0.24% | 3.15% | 430MB | 1.90s | 4.94 |
| Jan 2034 | 2,048 | 1,024 | 7,168 | 7,131GB | 163.6TB | $7.54 | $1.23 | 768MB/s | 320MB/s | 0.22% | 2.97% | 573MB | 1.79s | 4.93 |
| Jan 2035 | 3,072 | 1,536 | 10,752 | 10.7TB | 232.0TB | $4.75 | $1.10 | 1,086MB/s | 453MB/s | 0.24% | 3.15% | 860MB | 1.90s | 4.95 |
| Jan 2036 | 4,096 | 2,048 | 14,336 | 14.3TB | 327.5TB | $3 | $0.98 | 1,536MB/s | 640MB/s | 0.22% | 2.97% | 1,147MB | 1.79s | 4.94 |
| Jan 2037 | 6,144 | 3,072 | 21,504 | 21.4TB | 464.5TB | $1.89 | $0.88 | 2,172MB/s | 905MB/s | 0.24% | 3.15% | 1,720MB | 1.90s | 4.95 |
| Jan 2038 | 8,192 | 4,096 | 28,672 | 28.5TB | 655.3TB | $1.19 | $0.78 | 3,072MB/s | 1,280MB/s | 0.22% | 2.97% | 2,294MB | 1.79s | 4.94 |
| Jan 2039 | 8,192 | 6,144 | 43,008 | 42.8TB | 929.3TB | $0.75 | $0.70 | 4,344MB/s | 1,810MB/s | 0.24% | 3.15% | 3,440MB | 1.90s | 4.95 |
| Jan 2040 | 8,192 | 8,192 | 57,344 | 57.0TB | 1,311TB | $0.48 | $0.62 | 6,144MB/s | 2,560MB/s | 0.22% | 2.97% | 4,587MB | 1.79s | 4.94 |
.
GLOSSARY
.
This is the data for BIP101 with a slight change and fast propagating blocks. The only change is to add another two years to the schedule and start at a 4MB limit instead of 8MB. The reason I did this is that, based on the rate of change of HDD/SSD prices and average available download and upload speeds, starting at 4MB should be enough to keep block propagation under 2 seconds through the entire schedule. This should be enough to mitigate any centralisation pressures.
A key requirement of this data is that full blocks are not uploaded to each peer, but rather just the necessary information. According to Mike Hearn, this would currently mean 70KB of upstream data per peer. I have used this as the basis for the data.
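A sketch of how the "UL per Block" column appears to be derived (my reconstruction; the figure of 8 peers is my assumption, chosen because it matches Bitcoin Core's default outbound connection count and reproduces the table values):

```python
# Upload per block: ~70KB of upstream data per peer per 1MB of block
# (Mike Hearn's figure), scaled by actual block size, sent to 8 peers.
KB_PER_PEER_PER_MB = 70
PEERS = 8  # assumption: default outbound peer count

def upload_per_block_mb(block_size_mb: float) -> float:
    return block_size_mb * KB_PER_PEER_PER_MB * PEERS / 1000

print(upload_per_block_mb(1))    # 0.56 (matches the Jan 2016 row)
print(upload_per_block_mb(2.5))  # 1.4 (matches the Jan 2017 row)
```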
.
KEY INFORMATION
.
DATA REFERENCES
.
CHARTS
Block Size
Storage Cost
Percentage of Bandwidth Use
Propagation Time
Time to download new node
.
NOTES
I'd appreciate input from people who have more useful data, or if you think I have missed anything. I intend to post this to /bitcoin as well, but I want to get people's input first. Would love to hear your thoughts mike_hearn
submitted by singularity87 to bitcoinxt [link] [comments]


2009.12.27 20:37 64-17-5 Snowfall and wind while helping tourists on their dog sledding adventure [PIC]

Link to my blogpost: http://blogg.humle.be/2009/12/26/hundesledetur-i-sn%C3%B8v%C3%A6r
Direct link to the picture: http://blogg.humle.be/wp-content/2009/12/_dsc0171_resized.jpg
C&C please. Did I make the best of the bad weather conditions? What can you tell me about the photo editing?
I have adjusted the tone curves a bit and used the unsharp mask, all done on the RAW file. Camera: a Nikon D90, in heavy snowfall and some wind.
  • Image Size: L (4288 x 2848)
  • Date Shot: 26.12.2009 14:03:03.00 (World Time: UTC+1, DST: ON)
  • Image Quality: Compressed RAW (12-bit)
  • Device: Nikon D90
  • Lens: VR 18-105mm F/3,5-5,6G
  • Focal Length: 105mm
  • Focus Mode: AF-S, AF-Area Mode: Auto, VR: ON
  • Aperture: F/5,6
  • Shutter Speed: 1/200s
  • Exposure Mode: Programmed Auto, Exposure Comp.: +0,3EV
  • Metering: Matrix
  • ISO Sensitivity: ISO 250
  • Color Space: Adobe RGB
  • Active D-Lighting: Auto
  • Sharpening: 6, Contrast: Active D-Lighting, Brightness: Active D-Lighting, Saturation: +1, Hue: +1
Edit: What I did on the tone curves: I toned up the dark tones (shadows), toned down the midtones (details and the background), and toned up the high tones (most of the snow). So I ended up with an S-shaped curve. In other words, I enhanced the contrast?
submitted by 64-17-5 to photocritique [link] [comments]

