Ultimate glossary of cryptocurrency terms, acronyms and abbreviations
Edit: Currently writing a new version of this, don't know when it will be done.

Edit: Since the first post I have updated a few sections with additional information. I recommend reading it all even if it is very long; I might have placed some relevant info in different sections while thinking about what else needed to be added. Most steps remain the same except where I comment on them directly.

It is not necessary to run 100% security all the time. Unless you absolutely need it, combining some high and some lower security ideas for a balance of security and convenience is useful.

I will base this mostly on Windows. Linux users probably know this already, and I have no idea how Apple machines work (though many things in here are just general tips and still relevant for other operating systems).

Disclaimer: There are certainly other steps that can make you more anonymous or safer, but I think for most people this will suffice. Any software I recommend should be independently verified for security, and examples of software are not to be taken as endorsements. I simply use examples and give recommendations when I believe it necessary, or helpful. I will not really differentiate between anonymity and security; they are often the same thing. As such, the word security can mean either more anonymous, less vulnerable, or both.

--------

Everyday Simple Info Sec:
- A password for the device is an obvious one (8+ characters minimum, ideally 12+). If there is sensitive information on any of the drives, either encrypt the entire drive or just the sensitive files, and make encrypted backups on a different storage device. (There are many programs to encrypt files and drives; a quick search will turn them up.)
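If you'd rather script file encryption than trust a random GUI tool, a few lines of Python do the job. A minimal sketch using the third-party cryptography package (an example, not an endorsement; verify it independently like anything else):

```python
# Minimal file encryption with the "cryptography" package (pip install cryptography).
# Fernet is authenticated encryption: AES-128-CBC plus an HMAC-SHA256 tag.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # store this somewhere safe, NOT next to the files
fernet = Fernet(key)

with open("sensitive.txt", "rb") as f:
    token = fernet.encrypt(f.read())

with open("sensitive.txt.enc", "wb") as f:
    f.write(token)

# Later: fernet.decrypt(token) with the same key recovers the plaintext.
```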
- There could be a hidden administrator user on your PC; make sure to change its password.
- Always use the device on a non-admin account.
- Use a VPN that doesn't log (with the kill switch on; this should be enough for everyday use, more on this in the high security section). VPNs that claim they don't log sometimes do, which is bad, but not using a VPN always exposes your traffic to your ISP and forgoes the extra encryption. Even if the VPN tracks you, there is no real downside, because your ISP would track you anyways; a VPN can be more anonymous and also adds extra encryption.
- Disable location tracking. Preferably set all your privacy settings to release minimal info: get rid of Cortana, and change the privacy settings in all of your accounts as well. There's no reason why you should let Facebook target ads at you; use the settings they give you.
- Use Tor, Firefox or a similar browser, and stay the fuck away from Google Chrome.
- Your preferred search engine should be DuckDuckGo (other privacy-focused search engines exist as well).
- Use an adblocker that also prevents the adding of tracking cookies.
- Use PGP with all your friends, or messaging services that implement end-to-end encryption. (Implemented services can still be bypassed, but they are way more convenient, so for everyday use they should suffice; some examples are Telegram, Signal, WhatsApp, etc.) (More info on PGP in the high security section.)
(Snapchat messages, Reddit DMs and Discord messages are just a few examples of messages that are never encrypted.)
- Any info sent even in encrypted messages (and obviously non-encrypted ones) should still be kept plausibly deniable: don't say "I'm gonna do MDMA", say "I'm going out with molly."
- Use software (like CCleaner) that purges cookies and other data after every use, before shutting down your device.
- Run a virus scanner daily (I like Spybot Search and Destroy; many other options also exist).
- Never use the same password/passphrase twice (I will address what passphrases are below). Better yet, use randomized passwords that are stored in a master keychain, and make them as long as possible. (Though 12 is an acceptable minimum, never go below 7; I recommend 15+, depending on how often you have to enter the password manually instead of copying/pasting it.) Don't generate overly long passwords for things you need to access regularly without copy/paste, except your master keyring.
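If your password manager can't generate passwords for some reason, the Python standard library can. A minimal sketch (the length and character set are just example choices):

```python
# Cryptographically secure password generation with the "secrets" module.
import secrets
import string

ALPHABET = string.ascii_letters + string.digits + string.punctuation

def generate_password(length: int = 20) -> str:
    # secrets.choice draws from the OS CSPRNG, unlike random.choice.
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

print(generate_password())  # paste straight into your password manager
```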
- It's ideal to never reuse the same email or username either, especially usernames. Email is obviously tricky and also very annoying, but it would be best to always change the email.
- DO NOT STORE ANY PASSWORDS ON GOOGLE. IF GOOGLE LOGIN IS AUTHENTICATED, IT WILL AUTOFILL ALL PASSWORDS IT HAS SAVED (same with other similar services). (This means if you are logged in to Chrome and someone has access to your machine, they can autofill passwords without entering a single password.)
- Use a memorable passphrase, especially for your master keyring, aka password manager. A long sentence that is memorable makes an okay password (decent example: "I met my wife at Little Caesars for the first time on 07/09/20"; better still if it's something only you know, if it's impersonal, and if you can add special characters or numbers that you won't forget). (A better example of a passphrase is: "There is 0nly 0ne letter that d0esn’t appear in any U.S. state nameQ")
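To put rough numbers on why long passphrases beat short "complex" passwords, here is some back-of-the-envelope entropy math (my own illustration; 7776 is the standard Diceware word-list size, and the word figure assumes randomly chosen words rather than a quoted sentence):

```python
# Approximate entropy in bits = length * log2(number of symbols to choose from).
import math

PRINTABLE = 95         # ASCII printable characters
DICEWARE_WORDS = 7776  # standard Diceware word-list size

print(8 * math.log2(PRINTABLE))       # ~52.6 bits: 8 fully random characters
print(6 * math.log2(DICEWARE_WORDS))  # ~77.5 bits: 6 random words, easier to remember
```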
- For your main password manager (keyring), I highly recommend KeePass 2. Make backups of the database file, saved to separate devices and drives (flash drives, phone, PC, laptop, etc.); if you lose that file, you lose all of your passwords. (Other good password managers exist as well. I don't recommend online password managers, as you lose control over your passwords.)
- Purge your internet activity frequently. There's a reason why only one post and a few comments appear in my account, but thousands of karma. Exposing information needlessly is not good.
- Never post private information publicly, and if you do, do it as vaguely as possible. (Example: not "I'm 15", but "I'm a teenager".) Do not post any vital information ever: no birthdays, mother's maiden name, age, or anything you have ever seen in a security question. Never post your current activities while they are ongoing. Going on a vacation? Don't announce it to the world. Taking pictures there? Post them when you are home.
- Any account that is supposed to remain anonymous and as secure as possible should only be used on secured devices. An unsecured device can link you to the account.
- Always shut down your machine when leaving it (to prevent access, and to remove a possible attack vector).
- Two-factor authentication is not great anymore, unless you can do it over an anonymous source. A cell phone is usually directly connected to you, so it is not an anonymous device. There might still be secure/anonymous 2FA methods that won't expose you, for example over a secure email. (If there is 2FA that doesn't need a device that removes anonymity and is secure, use it.) (Please don't misunderstand: 2FA is great, however it can remove the anonymity that you worked hard to establish.)
- Rethink how you do security questions. Many answers to security questions can be found in your internet history. One could use the first word of the security question as an answer, or a different scheme that means you will always remember it (a minimal sketch of one such scheme is below). (Security questions need to go; the amount of personal info an average person puts on the internet makes it easy to attack anything using security questions.)

--------

High level criminal information security:

The motto here is: "All the Security, All the Time." One fuck up can leave a sliver of traceability, and you could be fucked.

Pre-note: All of your software should always be up to date. Also, even perfect info sec does not guarantee you are completely safe; a new zero day (exploit) can still fuck you. But good info security makes you significantly safer by eliminating as many attack vectors as possible.

- Get a new device (or make an already-owned device seem like you never owned it; do this only if you know how, as there's a lot of stuff that goes into that, like changing your MAC address etc.). Buy with cash, with your face covered, preferably far away from where you live. (Do I need to specify not to bring your phone, or anything else that tracks your location, anywhere you want to go anonymously?) (Be aware that even hardware can have vulnerabilities; many CPUs have known vulnerabilities. I can't list them all, so do some research before buying.)
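A minimal sketch of the security-question scheme mentioned above: derive every answer from one memorized secret plus the question text, so each answer is site-unique and reproducible but never stored anywhere (the secret and the question below are placeholders):

```python
# Deterministic fake answers for security questions via HMAC.
import hashlib
import hmac

SECRET = b"one memorized master secret"  # placeholder - memorize your own

def answer(question: str) -> str:
    digest = hmac.new(SECRET, question.strip().lower().encode(), hashlib.sha256)
    return digest.hexdigest()[:16]       # unguessable, but always reproducible

print(answer("What is your mother's maiden name?"))
```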
- Do not EVER use a high security device at any lower level of security. There are identifiers unique to your device; exposing them once can expose you for everything you do.
- If you know how to use Tails (a Linux distro designed for info sec), use that, preferably from a USB. (Or learn how to use Tails; it's better, but complicated.) Otherwise a clean copy of Windows (make sure it's not in any way associated with you) can do the job too, though not as well. (Using a VM might give extra security, since VMs usually erase all data and RAM they were using on shutdown.)
- Get a non-tracking VPN and enable the kill switch (a setting that disables all traffic that doesn't go through the VPN). Change your firewall settings to only allow traffic from the VPN (there are guides for Windows), or so that only traffic from the Tor application is sent. Edit: (Due to complaints: do not use VPN over Tor, use Tor over VPN. Tor over VPN has no notable downside: if the VPN logs, it makes no difference, since your ISP would always log anyways, and VPNs remove other attack vectors and also provide backup security should Tor fail. Again, even if the VPN tracks you, you only change who is doing the tracking, but now you are further removed, making it more anonymous and leaving fewer vulnerabilities.)
- Remember privacy settings, cookie cleaner, antivirus and passwords. (There could be a hidden administrator user on your PC; make sure to change its password.)
- Always use the device on a non-admin account.
- Ideally use this device only on networks that are not connected to you, such as public networks (try to never use the same public network twice; move around). (A home network should be fine now, as it should never be exposed, but more security is always better. It's just a convenience vs. security trade-off.)
- Never use accounts that have been exposed to lower security on higher security machines.
- Your browser is now Tor (or your preferred security-focused browser, if you don't plan on using onion sites). Make sure you get the standalone version of Tor, not the addon build (the standalone is safer, because there are fewer settings and options to tweak).
- Change your Tor settings to safest mode, enable a bridge (to my knowledge there's no difference in security between the built-in bridges in Tor), enable automatic updates, and set the DuckDuckGo onion as your primary search engine. Set the dark.fail onion page as your home page. (Or your preferred privacy search engine and onion directory.)
- Set up a new PGP key pair (you can't use the same one you use regularly; again, less safe accounts are never used on safer devices). Kleopatra is my choice; it's simple to use. Make sure you back up the private key multiple times, on safe devices (don't let the private key fall into anyone's hands). Give it a generic name like "HighSecurityPGP"; do not give the key pair a name that could identify you (no initials etc.). (Some PGP programs want an associated email for a key pair; you can create a safe email, or, which I recommend, use a program that doesn't require one, like Kleopatra.) (Feds and LEOs are known to copy private keys if they have your machine, so you will need to set up a new key pair if they ever take a device holding a copy of your private key.)
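If you prefer scripting key generation to a GUI, here is a minimal sketch using the python-gnupg package driving a local GnuPG install (the package choice, home directory and passphrase are assumptions on my part; as stated above, verify any tool independently):

```python
# Generate and export a PGP key pair via GnuPG (pip install python-gnupg).
import gnupg

gpg = gnupg.GPG(gnupghome="/path/to/clean/gnupg/home")  # hypothetical path

params = gpg.gen_key_input(
    name_real="HighSecurityPGP",   # generic name, nothing identifying
    name_email="none@invalid",     # throwaway; some tools insist on one
    key_type="RSA",
    key_length=4096,
    passphrase="a strong passphrase",
)
key = gpg.gen_key(params)

public_armor = gpg.export_keys(key.fingerprint)  # this one you share
private_armor = gpg.export_keys(key.fingerprint, True,
                                passphrase="a strong passphrase")  # back up, never share
```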
- A high security machine that facilitates criminal activity cannot use many programs. Many programs collect your device's MAC address, which is a unique identifier, amongst other things. The machine should be used only for the activity you want to do.
--------

How to use dark net markets (DNMs):

If you finished your high security setup, we can dive right in. Otherwise go do that; this is where all of it is essential.

Quick info on Tor and onion sites: there is no search engine. It's all based on directories and addresses you are given by others. Tor will likely not be very quick, as traffic has to pass through multiple relays to get to the destination.

DNMs sometimes exit scam. An exit scam is when a market shuts down completely and takes all the money. This is a risk when using DNMs; it's not too common, but it happens maybe 0-4 times a year. The admins of those servers need to get out at some point, before they get jailed, so they exit the game and scam everyone out of their money.

- A very useful onion directory is dark.fail. It has a lot of links for all kinds of stuff: news, email, DNMs, PsychonautWiki (a harm reduction website), forums, etc. (Other directories also exist.)
- Pick a market, preferably one that handles the secure connection server-side instead of requiring you to establish it. Then create an account. Your account, once created, should include an entry box in your profile for a PGP key; post your PUBLIC key in there. (Verify the link is not a scam; most markets should provide a PGP signature.)
- Next is currency setup. All major cryptocurrency exchanges can be used. I can recommend Coinbase, but there could be better ones out there. Unless you find a small non-U.S. exchange, they will always ask for your identity. So unless you can find a trustworthy exchange that doesn't ID, you will need to give it to them. (Side note: all major crypto exchanges report to the IRS. If the IRS asks you whether you bought cryptocurrency and you bought while having IDed yourself, SAY YES. DO NOT COMMIT TAX FRAUD WHEN THEY KNOW YOU DID.)
- I recommend using Monero. It's hard to track, so it makes your job a lot easier. (If you use Bitcoin you should run it through a scrambler, because BTC is traceable by anyone who knows what they are doing.)
- Transfer to your wallet (Monero you can send directly; BTC you should scramble). There are two options: a cold wallet (physical) or a software wallet. Software wallets usually don't cost anything, so I recommend them, even if they are often less safe. Electrum is easy to use and pretty safe. You can also do your own research and find a wallet that fits your needs.
- Decide where you want to ship it. You can ship to your home, to a PO box, to a PO box that you opened with a fake ID (I don't recommend this), to an abandoned house, via general delivery (sending to a post office instead of a street address) picked up with a fake ID, or use a remailing service. These are some options; sending it to your own home isn't ideal, but it's pretty much the only easy way.
- Now you are ready to buy. Only buy using escrow (meaning the money is held by the market as a middleman until the product is delivered; they will also handle any issues like wrong quantity, cuts, etc.). Judge the reviews for a product, and if available look at the history of the vendor, until you find a product from a vendor you trust. (I recommend buying within your country as much as possible, so the package doesn't go through customs. It's very rare that something is found, but it can happen.)
- Now you get to buy. Depending on the market, you either have cryptocurrency stored in their wallets (not recommended, as you will lose it in an exit scam) or you send it with every order. When you send your delivery address (or the one you want it to go to), encrypt the address using the seller's public key. Make sure the address is correct.
- Wait for the product. Make sure to extend the escrow until the product arrives; if you can't extend it anymore, dispute the order, and a moderator will step in.
- Test the product, use it, and leave a review. PLEASE LEAVE A REVIEW; DNMs only work because of reviews.

Edit: Didn't imagine I would write over 15000 words. Oh well, it was fun. Hope it helps; if you have any questions feel free to ask. No idea how long this will stay up, I might purge it in 7 days, or never.
d down, k up, everybody's a game theorist, titcoin, build wiki on Cardano, (e-)voting, competitive marketing analysis, Goguen product update, Alexa likes Charles, David hates all, Adam in and bros in arms with the scientific counterparts of the major cryptocurrency groups, the latest AMA for all!
Decreasing d parameter

Just signed the latest change management document. I was the last in the chain, so I signed it today, for changing the d parameter from 0.52 to 0.5. That means we are just about to cross the threshold here in a little bit for d to fall below 0.5, which means more than half of all the blocks will be made by the community and not the OBFT nodes. That's a major milestone, and at the current rate of velocity it looks like d will decrement to zero around March, so lots to do, lots to talk about. Product update two days from now; we'll go ahead and talk about that. But it crossed my desk today and I was really happy and excited about it. It seems like yesterday that d was equal to one and people were complaining that we delayed it by an epoch, and now we're almost at 50 percent. For those of you who want parameter-level changes, k-level changes: they are coming, there's an enormous internal conversation about it, and we've written up a PowerPoint presentation and a philosophy document about why things were designed the way that they're designed.

Increasing k parameter, upcoming security video, and everybody's a game theorist

My chief scientist has put an enormous amount of time into this. Aggelos is very passionate about this particular topic, and what I'm going to do is similar to the security video that I did, where I gave an hour and a half discussion about best practices for security: I'm going to do a screencasted video where I talk about this philosophy document, read the entire document with annotations with you guys, and kind of talk through it. It might end up being quite a long video, it could be several hours long, but I think it's really important to talk through the design philosophy of this. It's kind of funny: when people see a cryptographic paper or a math paper, they tend to just say, okay, you guys figure that out. No one's an expert in cryptography or math, and you don't really get strong opinions about it. But with game theory, despite the fact that the topics are as complex, and in some cases more complex, you tend to get a lot of opinions: everybody's a game theorist. So, an enormous amount of thought went into the design of the system and the parameters of the system, everything from the reward functions to other things, and it's very important that we explain that thought process in as detailed a way as possible, at least the philosophy behind it. Then I feel that the community is in a really good position to start working on the change management. It is my position that I'd love to see k largely increased. I do think that the software needs some improvements to get there, especially partial delegation, delegation portfolios, and some enhancements to the operation of staking.

E-voting

I'd love to see the existence of hybrid wallets, where you have a cold part and a hot part; we've had a lot of conversations about that, and we will present some of the progress on that matter at the product updates, if not this October then certainly in November. A lot of commercialization going on, a lot of things flowing around, and, you know, the commercial teams are working hard. As I mentioned, we have a lot of deals in the pipeline. The Wyoming event was half political, half sales. We were really looking into e-voting, and we had very productive conversations along those lines.
It is my goal that Cardano e-voting software is used in political primaries, and my hope is for it eventually to be used in municipal and state and eventually federal elections, and then in national elections for countries like Ethiopia, Mongolia and other places. Now, there is a long road, a long, long road to get there, and many little victories that have to come first, but this event in Wyoming was kind of the opener into that conversation. There were seven independent parties at the independent national convention, and we had a chance to talk to the leadership of many of them. We will also engage in conversation with the Libertarian Party leadership, and at the very least we could talk about e-voting and also blockchain-based voting for primaries; that would be a great start, and we'll also look into the state of Wyoming for that as well. We'll, you know, tell you guys about that in time. We've already gotten a lot of inquiries about e-voting software; we tend to get them along with the (Atala) Prism inquiries. It's actually quite easy to start conversations, but there are a lot of security properties that are very important, like end-to-end verifiability, hybrid ballots where you have both a digital and a paper ballot, delegation mechanics, as well as privacy mechanics that are interesting on a case-by-case basis.

Goguen, voting, future fund3, competitive marketing analysis of Ouroboros vs. EOS, Tezos, Algorand, ETH2 and Polkadot, new creative director

We'll keep chipping away at that. A lot of Goguen stuff to talk about, but I'm going to reserve all of that for the product update two days from now. We're right in the middle of it; Goguen metadata was the very first part, and we already have some commercialization platform as a result of metadata, with more to come, and then obviously lots of smart contract stuff to come. This update and the November update are going to be very Goguen focused, and cover a lot of alternatives as well. We're still on schedule for an HFC event in, I think, November or December, I can't remember, but that's going to be carrying a lot of things related to multisig and token locking. There are some ledger rule changes, so it has to be an HFC event, and that opens up a lot of the windows for Goguen foundations as well as voting on chain, so fund3 will benefit very heavily from that. We're right in the guts of Daedalus right now, building the voting center, the identity center, the QR-code work. It's a lot of stuff, you know; the cell phone app was released last week, kind of an early beta, and it'll go through a lot of rapid iterations every few weeks. We'll update it; Google Play is a great foundation to launch things on because it's so easy to push updates to people automatically, so you can rapidly iterate and be very agile in that framework. We've already had 3500 people involved heavily in the innovation management platform IdeaScale, and we've got numerous bids on everything, from John Buck and the sociocracy movement to others. A lot of people want to help us improve that, and we're going to see steady and systematic growth there. We're still chipping away at product marketing. Liza (Horowitz) is doing a good job; I meet with her two or three times a week, and right now it's Ouroboros, Ouroboros, Ouroboros... We're doing competitive analysis of Ouroboros versus EOS, Tezos, Algorand, ETH2 and Polkadot. We think that's a good set, and we think we have a really good way of explaining it. David (David Likes Crypto, now at IOHK) has already made some great content.
We're going to release that soon alongside some other content, and we'll keep chipping away at that. We also just hired a creative director for IO Global. His name's Adam, an incredibly experienced creative director; he's worked for Mercedes-Benz and dozens of other companies. He does very good work and he's been doing this for well over 20 years, so the very first set of things he's going to do is work with commercial and marketing on product marketing, in addition to building great content, where the hope is to make that content as pretty as possible. We have Rod heavily involved in that as well, to talk about distribution channels and see if we can amplify the distribution message and really get a lot of stuff done. Last thing to mention, oh yeah, iOS for Catalyst. We're working on that; we submitted it to the Apple App Store, but it takes a little longer to get approval there than it does with Google Play. It's been submitted, and now it's a question of whether and when Apple approves it. It takes a little longer for cryptocurrency stuff.

Wiki shizzle and battle for crypto, make crypto articles on wiki great again, Alexa knows Charles, Everipedia meets Charles podcast, holy-grail land of Cardano, wiki on Cardano, titcoin

Wikipedia... kind of rattled the cage a little bit. Through an intermediary we got in contact with Jimmy Wales. Larry Sanger, the other co-founder, also reached out to me, and the Everipedia guys reached out to me. Here's where we stand: we have an article, it has solidified, and it's currently labeled as unreliable, saying you should not believe the things said in it, which is David Gerard's work if you look at the edits. We will work with the community and try to get that article to a fair and balanced representation of Cardano, especially after the product marketing comes through. Once we clearly explain the product, I think the Cardano article can be massively strengthened. I've told Rod to work with some specialized people to try to get that done, but we are also going to work very hard at a systematic improvement campaign for all of the scientific articles related to blockchain technology in the cryptocurrency space. They're just terrible. If you go to the proof of work article, the proof of stake article or any of these things, they're not well written, they're out of date, and they don't reflect an adequate sampling of the science. I did talk to my chief scientist Aggelos, and what we're going to do is reach out to the scientific counterparts at most of the major cryptocurrency groups that are doing research and see if they want to work with us on an industry-wide effort to systematically improve the scientific articles in our industry, so that there is a fair and balanced representation of what the current state of the art is, the criticisms, the trade-offs, as well as the reference space. Obviously we'll do quite well in that respect, because we've done the science; we're the inheritor of it. It's a shame, because when people search proof of stake on Google, the Wikipedia results are usually highly biased. We care about Wikipedia because Google cares about Wikipedia, Amazon cares about Wikipedia. If you ask Alexa who Charles Hoskinson is, the reason why Alexa knows is because it's reading directly from the Wikipedia page.
If I didn't have a Wikipedia page, Alexa wouldn't know that. So if somebody says, Alexa, what is Cardano, it's going to read directly from the Wikipedia page, and, you know, we can either pretend that reality doesn't exist or we can accept it, and we as a community, working with partners in the broader cryptocurrency community, can universally improve the quality of cryptocurrency pages. There's been a pattern of commercial censorship on Wikipedia for cryptocurrencies in general since Bitcoin itself. In fact, I think the Bitcoin article was actually taken down once, back in, it might have been, 2010 or 2009; basically, Wikipedia has not been a friend of cryptocurrencies. That's why Everipedia exists, and actually their founders reached out to me. I talked to them over Twitter through PMs, and we agreed to do a podcast. I'm going to do a StreamYard stream with these guys, and they'll come on and talk all about Everipedia, what they do and how they are, and we'll go through the challenges that they've encountered, how their platform works and so forth. Obviously, if they ever want to leave that terrible EOS ecosystem and come to the holy-grail land of Cardano, we'd be there to help them out. At the very least they can tell the world how amazing their product is and also the challenges they're having to overcome. We've also been in great contact with Larry Sanger. He's going to do an internal seminar at some point with us and talk about some protocols he's been developing since he left Wikipedia, specifically to decentralize knowledge management and have a truly decentralized encyclopedia. I'm really looking forward to that, and I hope that presentation gives us some inspiration as an ecosystem of things we can do. That's a great piece of infrastructure regardless, and after we learn a lot more about it and talk to a lot of people in the ecosystem, if we can't get people to move on over, it would be really good to see, through IdeaScale and the innovation management platform, people utilize the dc fund to build their own variant of Wikipedia on Cardano. In the coming months there will certainly be funding available. If you guys are so passionate about this particular problem that you want to go solve it, then I'd be happy to play Elon Musk with the hyperloop and write a white paper on a protocol design, to really give a good first start, and then you guys can go and try to commercialize that technology as Cardano native assets and Plutus smart contracts, in addition to other pieces of technology that have to be brought in to make it practical. Right now we're just in the let's-talk-to-everybody phase. We'll talk to the Everipedia guys, we're going to talk to Larry, and we're going to see whoever else is in this game, and of course we have to accept the incumbency as it is. So, we're working with the Wikipedia side to improve the quality of not only our article but all of the articles on the scientific side of things, so that there's a fair and accurate representation of information. One of the reasons why I'm so concerned about this is that I am very worried that Cardano projects will get commercially censored like we were commercially censored. So, yes, we do have a page, but it took five years to get there, and we're a multi-billion dollar project with hundreds of thousands of people. If you guys are doing cutting-edge, novel, interesting stuff, I don't want your experience to be the same as ours, where you have to wait five years for your project to get a page, even after governments adopted it.
That's absurd; no one should be censored, ever. This is very much a fight for the entire ecosystem, the entire community, not just Cardano: Bitcoin, Ethereum and Cardano have all faced commercial censorship and article deletions during their tenure, so I don't want you guys to go through that. I'm hoping we can improve that situation, but, you know, you don't put all your eggs in one basket, and frankly the time has come for Wikipedia to be fully decentralized and liberated from a centralized organization and the massively variable quality of its editor base. If Legends of Valor has a page, and titcoin, a pornography coin from 2015 that's deprecated and no one uses, has a page, but Cardano couldn't get one until recently, there's something seriously wrong with the quality control mechanism, and we need to improve that. So it'll get done.
Why Osana takes so long? (Programmer's point of view on current situation)
I decided to write a comment about «Why Osana takes so long?» somewhere, and what can be done to shorten this time. It turned into a long essay. Here's a TL;DR of it:
The cost of never paying down this technical debt is clear; eventually the cost to deliver functionality will become so slow that it is easy for a well-designed competitive software product to overtake the badly-designed software in terms of features. In my experience, badly designed software can also lead to a more stressed engineering workforce, in turn leading higher staff churn (which in turn affects costs and productivity when delivering features). Additionally, due to the complexity in a given codebase, the ability to accurately estimate work will also disappear. Junade Ali, Mastering PHP Design Patterns (2016)
Longer version: I am not sure if people here wanted an explanation from a real developer who works with C and with relatively large projects, but I am going to give one nonetheless. I am not much interested in Yandere Simulator, nor in this genre in general, but this particular development has a lot to teach any fellow programmers and software engineers, to ensure that they never end up in Alex's situation, especially considering that he is definitely not the first one to get himself knee-deep in development hell (do you remember Star Citizen?) and he is definitely not the last one. On the one hand, people see that Alex works incredibly slowly, the equivalent of, like, one hour per day, comparing it with, say, Papers, Please, the game that was developed in nine months from start to finish by one guy. On the other hand, Alex himself most likely thinks that he works until complete exhaustion each day. In fact, I highly suspect that both those sentences are correct! Because of the mistakes made during early development stages, which are highly unlikely to be fixed due to the pressure put on the developer right now and due to his overall approach to coding, the cost to add any relatively large feature (e.g. Osana) can be pretty much comparable to the cost of creating a fan game from start to finish. Trust me, I've seen his leaked source code (don't tell anybody about that) and I know what I am talking about. The largest problem in Yandere Simulator right now is its super slow development. So, without further ado, let's talk about how «implementing the low hanging fruit» crippled the development and, more importantly, what would have been an ideal course of action, from my point of view, to get out. I'll try to explain things in the easiest terms possible.
else if's and the lack of any sort of refactoring in general
Perfection is achieved, not when there is nothing more to add, but when there is nothing left to take away. Antoine de Saint-Exupéry
This is why refactoring — the activity of rewriting your old code so it does the same thing, but does it quicker, in a more generic way, in fewer lines or more simply — is so powerful. In my experience, you can only keep one module/class/whatever in your brain if it does not exceed ~1000 lines, maybe ~1500. Splitting a 17000-line-long class into smaller classes probably won't improve performance at all, but it will make working with parts of this class way easier. Is it too late now to start refactoring? Of course NO: better late than never.
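As a toy illustration of the kind of split I mean (hypothetical names, not Yandere Simulator's actual code):

```python
# Before: one giant class doing everything (imagine 17000 lines of this).
class MonolithicStudent:
    def update(self):
        pass  # pathfinding, gossip, easter eggs... all inlined here

# After: the same behavior, split into pieces a brain can actually hold.
class Pathfinding:
    def update(self, student):
        pass  # movement logic only

class Gossip:
    def update(self, student):
        pass  # social logic only

class Student:
    def __init__(self):
        self.components = [Pathfinding(), Gossip()]

    def update(self):
        for component in self.components:  # each part readable in isolation
            component.update(self)

Student().update()  # same behavior, smaller mental load per file
```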
If you think that because you wrote this code, you'll always easily remember it, I have some bad news for you: you won't. In my experience, one week and that's it. That's why comments are so crucial. It is not necessary to put a ton of comments everywhere, but just a general idea will help you out in the future, even if you think that It Just Works™ and you'll never ever need to fix it. The time spent writing and debugging one line of code almost always exceeds the time needed to write one comment in large-scale projects. Moreover, the best code is the code that is self-evident. In the example above, what the hell does (float) 6 mean? Why not wrap it into a constant with a good, self-descriptive name? Again, it won't affect performance, since the C# compiler is smart enough to silently remove this constant from the real code and place its value into the method invocation directly. Such constants are there for you. I rewrote my code above a little bit to illustrate this. With those comments, you don't have to remember your code at all, since its functionality is outlined in two tiny lines of comments above it. Moreover, even a person with zero knowledge of programming will figure out the purpose of this code. It took me less than half a minute to write those comments, but it'll probably save me quite a lot of time figuring out «what was I thinking back then» one day. Is it too late now to start adding comments? Again, of course NO. Don't be lazy, and redirect all your typing from the «debunk» page (which pretty much does the opposite of debunking, but who am I to judge you here?) into some useful comments.
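The original post illustrated this with a C# snippet that isn't reproduced here; a hypothetical Python analogue of the same before/after idea:

```python
# Before: a bare magic number nobody will remember the meaning of.
def camera_distance_before(zoom):
    return zoom * 6.0  # ...why 6?

# After: named and commented, so the code explains itself.
# (The rationale below is invented purely for illustration.)
CAMERA_CLEARANCE_MULTIPLIER = 6.0  # keeps the camera above the tallest prop

def camera_distance(zoom):
    return zoom * CAMERA_CLEARANCE_MULTIPLIER

assert camera_distance(2.0) == camera_distance_before(2.0)  # behavior unchanged
```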
This is often neglected, but consider the following. You wrote some code, you ran your game, you saw a new bug. Was it introduced right now? Is it a problem in your older code which has only shown up because you had never actually exercised that path until now? Where should you search for it? You have no idea, and you have one painful debugging session ahead. Just imagine how much easier it would be if you had some routines which automatically execute after each build and check that the environment is still sane and nothing broke on a fundamental level. This is called unit testing, and yes, unit tests won't be able to catch all your bugs, but even getting 20% of bugs identified at an earlier stage is a huge boon to development speed. Is it too late now to start adding unit tests? Kinda YES and NO at the same time. Unit testing works best if it covers the majority of a project's code. On the other side, a journey of a thousand miles begins with a single step. If you decide to start refactoring your code, writing a unit test before refactoring will help you prove to yourself that you have not broken anything, without the need to run the game at all.
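Such a routine can be tiny. A sketch in pytest style (the game logic here is a made-up stand-in):

```python
# test_sanity.py - pytest discovers and runs these after every build.

def apply_gossip(reputation: int) -> int:
    # Stand-in for real game logic: gossip costs 10 reputation, floored at 0.
    return max(reputation - 10, 0)

def test_gossip_lowers_reputation():
    assert apply_gossip(100) < 100   # the invariant we never want broken

def test_reputation_never_negative():
    assert apply_gossip(5) >= 0
```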
This is basically self-explanatory: you set this thing up once, you forget about it. A static code analyzer is another piece of «free real estate» to speed up the development process by finding tiny little errors, mostly silly typos (do you think you are good enough at finding them? Well, good luck catching x << 4; in place of x <<= 4; buried deep in C code by eye!). Again, this is not a silver bullet; it is another tool which will help you out with debugging a little bit, along with the debugger, unit tests and other things. You need every little bit of help here. Is it too late now to hook up a static code analyzer? Obviously NO.
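The same class of typo exists in Python, and a linter catches it instantly (a tiny sketch; pylint's pointless-statement check, W0104, is the relevant warning):

```python
def normalize(x):
    x << 4    # bug: the shifted value is computed and thrown away (pylint: W0104)
    return x  # x is returned unchanged

def normalize_fixed(x):
    x <<= 4   # intended: shift and assign
    return x
```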
Say, you want to build Osana, but then you decide to implement some feature, e.g. Snap Mode. By doing this you have maybe made your game a little bit better, but what you have just essentially done is complicate your life, because now you should also write Osana code for Snap Mode. The way the game architecture is done right now, easter egg code is deeply interleaved with game logic, which leads to code «spaghettifying», which in turn slows down the addition of new features, because one has to consider how each feature would work alongside each and every old feature and easter egg. Even if it is just gazing over one line per easter egg, it adds up to the mess, slowly but surely.

A lot of people mention that the developer should have been doing it in an object-oriented way. However, there is no silver bullet in programming. It does not matter that much whether you are doing it the object-oriented way or the usual procedural way; you could theoretically write, say, AI routines in a functional language (e.g. LISP) or even in a logic language if you are brave enough (e.g. Prolog). You could even invent your own tiny programming language! The only thing that matters is code quality and avoiding the so-called shotgun surgery situation, which plagues Yandere Simulator from top to bottom right now. Is there a way of adding a new feature without interfering with your older code (e.g. by creating a child class which will encapsulate all the things you need)? Go for it; this feature is basically «free» for you (a toy sketch of this case follows below). Otherwise you'd better think twice before doing it, because you are going into «technical debt» territory, borrowing time from your future self by saying «I'll maybe optimize it later» and «a thousand more lines probably won't slow me down in the future that much, right?». Technical debt will incur interest of its own that you'll have to pay.

Basically, the entire situation around Osana right now is a huge tale about how just the «interest» incurred by technical debt can control an entire project, like the tail wagging the dog. I won't elaborate further here, since it would take me an even larger post to fully describe what's wrong with Yandere Simulator's code architecture. Is it too late to rebuild the code architecture? Sadly, YES, although it should be possible to split the Student class into descendants by using hooks for individual students. However, the code architecture can be improved by a vast margin if you start removing easter eggs and features like Snap Mode that currently bloat Yandere Simulator. I know it is going to be painful, but it is the only way to improve code quality here and now. This will simplify the code, and this will make it easier for you to add the «real» features, like Osana or whatever else you'd like to accomplish. If you ever want them back, you can track them down in Git history and re-implement them one by one, hopefully without performing the shotgun surgery this time.
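A toy sketch of the «free feature» case, where new behavior lives in a child class instead of being threaded through old code (hypothetical names again):

```python
class Student:
    def update(self):
        self.move()
        self.talk()

    def move(self): pass
    def talk(self): pass

# All rival-specific behavior is encapsulated here; Student stays untouched,
# so adding the feature requires no shotgun surgery across the codebase.
class Osana(Student):
    def update(self):
        super().update()
        self.check_rival_events()

    def check_rival_events(self): pass

Osana().update()  # old code paths unchanged, new feature isolated
```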
Again, I won't be talking about performance, since you can debug your game at 20 FPS as well as at 60 FPS, but that is a very different story. Yandere Simulator is huge. Once you've fixed a bug, you want to test it, right? And your workflow right now probably looks like this:
Fix the code (unavoidable time loss)
Rebuild the project (can take a loooong time)
Load your game (can take a loooong time)
Test it (unavoidable time loss, unless another bug has popped up via unit testing, code analyzer etc.)
And you can fix it. For instance, I know that Yandere Simulator generates all the students' photos during loading. Why should that be done there? Why not move it to the project build stage by adding a build hook so Unity does it for you during a full project rebuild, or, even better, why not disable it completely, or replace the photos with «PLACEHOLDER» text for debug builds? Each second spent watching the loading screen will be rightfully interpreted as «son is not coding» by the community. Is it too late to reduce loading times? Hell NO.
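A sketch of that debug-build shortcut (Unity specifics omitted; the flag and function names are made up):

```python
DEBUG_BUILD = True  # hypothetical flag set by the build pipeline

def render_portrait(student):
    # Stand-in for the slow path: pose the model, snapshot the camera, etc.
    return f"portrait_of_{student}.png"

def load_student_photos(students):
    if DEBUG_BUILD:
        # Debug builds skip all rendering work: the loading screen is instant.
        return {s: "PLACEHOLDER" for s in students}
    return {s: render_portrait(s) for s in students}

print(load_student_photos(["Osana", "Ayano"]))
```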
Or any other continuous integration tool. «Rebuild the project» can take a long time too, and what can we do about that? Let me give you an idea. Buy a new PC. Get a 32-core Threadripper, 32 GB of the fastest RAM you can afford and a cool motherboard which supports all of that (of course, a Ryzen/i5/Celeron/i386/Raspberry Pi is fine too, but the faster, the better). The rest is not necessary; e.g. a barely functional second-hand video card burned out by bitcoin mining is fine. You set up this second PC in your room. You connect it to your network. You set up a ramdisk to speed things up even more. You properly set up Jenkins on this PC. From now on, Jenkins takes care of the rest: tracking your Git repository, the (re)building process, large and time-consuming unit tests, invoking the static code analyzer, profiling, generating reports and whatever else you can and want to hook up. More importantly, you can fix another bug while Jenkins is rebuilding the project for the previous one, et cetera. In general, continuous integration is a great technology for quickly tracking down errors that were introduced in previous versions, helping avoid those kinds of bug hunting sessions. I am highly unsure if continuous integration is needed for projects 10000-20000 source lines long, but things are different as soon as we step into 100k+ territory, and Yandere Simulator by now has approximately 150k+ source lines of code. I think continuous integration might well be worth it for Yandere Simulator. Is it too late to add continuous integration? NO, albeit it is going to take some time and skills to set up.
Stop caring about the criticism
Stop comparing Alex to Scott Cawthon. IMO Alex is very similar to the person known as SgtMarkIV, the developer of Brutal Doom, who is also a notorious edgelord and who, for example, also once told somebody to kill himself, just like… However, while being a horrible person, SgtMarkIV does his job. He simply does not care much about public opinion. That's the difference.
Lition - $8 Million Dollar Market Cap With Real Use Right Now and a New Product They Are Developing Which Has Huge Potential.
I’m not usually one to shill my own coins but I’ve stolen a few good picks from this sub so I thought I’d share a new one I recently stumbled upon. Before I go into more details, I’d like to preface this by saying that I never invest in anything which I don’t think has the fundamentals to last at least 5-10 years and I don’t think this is a project which you will see a few hundred percent gains in a month or two. The hype isn’t there with this project and it’s more of a mid-long term play. If you want overnight gains, gamble on some of the smaller caps posted in this sub which are more like ponzi schemes riding on DeFi hype which you sell to a greater fool.
Lition is a layer 2 blockchain infrastructure on top of Ethereum that enables commercial usage of dApps. The Lition protocol complements the Ethereum mainchain by adding features such as privacy, scalability and deletability for GDPR compliance. Everybody can choose to build on Lition without the need for permission.
In addition to the above, they also have a P2P energy trading platform currently operating, which is supplying green power to customers in over 1000 towns and cities across Germany. Through their power platform, Lition customers are able to save about 20% on their monthly energy bill, while producers generate up to 30% higher profits since they are cutting out the middle men. However, the real moonshot here is not their already successful smart energy platform (which utilises the same token); it is the enterprise layer 2 solution described in the quote above. Their layer 2 enterprise infrastructure, which is still in development, will offer infinite scalability through sidechains and nodes staking LIT tokens on those sidechains. Block times will be fast, at around 3 seconds, and fees will be tiny fractions of a cent. However, the real selling point for enterprises will be that the data on these sidechains can be deleted, and chains can be public or private, with private chains being validated via zero-knowledge proofs to verify that the private data is correct. This is huge and makes Lition a solution for a wide range of enterprise use cases due to these optional features. But it doesn't stop there. Lition is also GDPR compliant, a big deal for Europe-based enterprises, and for the record, very very few blockchain solutions are GDPR compliant (I believe VeChain is one of the few other projects which are).
Important Bullet Points
They have a very close partnership with SAP who if you don’t know are the world’s leading business software company. SAP’s Chief Innovation Officer is even an advisor for the project. As stated in the whitepaper: ”SAP can easily implement this blockchain into their existing products and services for their customer base of >400,000, making them immediately ready for blockchain use cases. It is therefore well positioned to become the standard mainnet for business applications.”
They have a partnership with Microsoft and they are integrated with Microsoft Azure Cloud.
In terms of their energy platform, Lition has a growth target of 235,000 customers by the end of 2022. Three months ago they stated that they were ahead of their goal. Right now there is a "solid 4-figure number of new customers every month, with each new customer bringing in ~€1,000 in annual recurring revenue".
Oh, and did I mention they support staking? Staking returns are currently over 15% for node operators.
Their token has two primary uses. First, it is a utility token, and they plan on making the LIT token the preferred payment method for all of the services on the Lition protocol. Secondly, it is used as collateral for staking, which I can see locking up a large proportion of the supply in the future. Unfortunately, the circulating supply is currently 50% of the max supply, but that said, coins like LINK have just 35% of their total tokens currently circulating, so relative to other projects this isn't too bad, and many of the tokens are still to be earned by staking.
With their existing energy platform seeing real adoption and steady growth in Germany, in my opinion this alone would be enough to justify their current market cap. However, I can see their second layer solution for enterprise being a really big deal in the future, as protocol coins tend to accrue more value than utility tokens. As a versatile L2 solution for Ethereum, LIT gets the best of both worlds: adoption and network effects from Ethereum by helping it to scale, as well as accruing value from the wide range of enterprise use cases which can be built on top of Lition. At just $8 million in market cap, it seems to me that their work-in-progress L2 enterprise solution has not been priced in. However, due to a lack of hype and marketing right now, I don't see LIT exploding in the short term. Rather, I can see it slowly outperforming ETH and climbing up the CMC rankings throughout this bullrun, much like Chainlink did in the bear market. Their building-and-partnerships-over-marketing strategy also reminds me of when I held Chainlink back in 2018, when Sergey was busy building out the project rather than blowing their ICO money on marketing a bunch of vaporware like so many other projects. Personally, I can see LIT becoming a top 100 project (not top 10), as it isn't the first of an important new type of project like Chainlink was/is, but it is an L2 protocol with unique advantages and selling points over the other existing L2 projects which scatter the top 20-200 range. This would put the market cap at just under $120 million, which is a 15x from here. This is of course a valuation which assumes that the total crypto market cap remains where it is right now, at just under $400 billion. However, if BTC makes it to 100K and Ethereum gets to $5K, then that is another 10x from here, which compounds on any LIT/BTC or LIT/ETH ratio gains. In this scenario, a top 100 project would be worth around $1 BILLION by market cap, which is over 100x from here, and probably even more if ETH hits 10K and Bitcoin dominance falls back down to the 30% range or below towards the end of the bullrun. Disclaimer: the above figures are a theoretical best case scenario and are far from financial advice. They are my moonshot estimates which assume all goes well for the project and the wider crypto space. Website: https://www.lition.io/ CoinGecko: https://www.coingecko.com/en/coins/lition Medium: https://medium.com/lition-blog
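Sanity-checking the multiples above against the $8 million figure in the title (my arithmetic, rounded):

$$\frac{\$120\,\text{M}}{\$8\,\text{M}} = 15\times \qquad\qquad \frac{\$1{,}000\,\text{M}}{\$8\,\text{M}} = 125\times$$

So both the 15x and the "over 100x" figures follow directly from the stated market caps; the $1B scenario additionally assumes the total crypto market roughly 10x's from ~$400B.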
TL;DR: LIT has current real world use which is consistently growing with their P2P energy trading platform and has huge potential with their new L2 protocol for enterprise due to its unique features. They have a close partnership with SAP and are also partnered with Microsoft. Currently around #400 on CMC, my target is for LIT to be top 100 by the end of the bullrun. Edit: Sorry 4chan, I didn't mean to shill one of your FUDed coins. Lit is a shitcoin scam, ignore this post.
Scaling Reddit Community Points with Arbitrum Rollup: a piece of cake
Submitted for consideration to The Great Reddit Scaling Bake-Off. Baked by the pastry chefs at Offchain Labs. Please send questions or comments to [email protected]

1. Overview

We're excited to submit Arbitrum Rollup for consideration to The Great Reddit Scaling Bake-Off. Arbitrum Rollup is the only Ethereum scaling solution that supports arbitrary smart contracts without compromising on Ethereum's security or adding points of centralization. For Reddit, this means that Arbitrum can not only scale the minting and transfer of Community Points, but it can foster a creative ecosystem built around Reddit Community Points, enabling points to be used in a wide variety of third party applications. That's right -- you can have your cake and eat it too!

Arbitrum Rollup isn't just Ethereum-style. Its Layer 2 transactions are byte-for-byte identical to Ethereum's, which means Ethereum users can continue to use their existing addresses and wallets, and Ethereum developers can continue to use their favorite toolchains and development environments out-of-the-box with Arbitrum. Coupling Arbitrum's tooling-compatibility with its trustless asset interoperability, Reddit not only can scale but can onboard the entire Ethereum community at no cost by giving them the same experience they already know and love (well, certainly know).

To benchmark how Arbitrum can scale Reddit Community Points, we launched the Reddit contracts on an Arbitrum Rollup chain. Since Arbitrum provides full Solidity support, we didn't have to rewrite the Reddit contracts or try to mimic their functionality using an unfamiliar paradigm. Nope, none of that. We launched the Reddit contracts unmodified on Arbitrum Rollup, complete with support for minting and distributing points. Like every Arbitrum Rollup chain, the chain included a bridge interface with which users can transfer Community Points or any other asset between the L1 and L2 chains. Arbitrum Rollup chains also support dynamic contract loading, which would allow third-party developers to launch custom ecosystem apps that integrate with Community Points on the very same chain that runs the Reddit contracts.

1.1 Why Ethereum

Perhaps the most exciting benefit of distributing Community Points using a blockchain is the ability to seamlessly port points to other applications and use them in a wide variety of contexts. Applications may include simple transfers, such as a restaurant that allows Redditors to spend points on drinks. Or they may include complex smart contracts -- such as placing Community Points as a wager in a multiparty game or as collateral in a financial contract. The common denominator between all of the fun uses of Reddit points is that they need a thriving ecosystem of both users and developers, and the Ethereum blockchain is perhaps the only smart contract platform with significant adoption today. While many Layer 1 blockchains boast lower cost or higher throughput than the Ethereum blockchain, more often than not these attributes mask the reality of little usage, weaker security, or both. Perhaps another platform with significant usage will rise in the future. But today, Ethereum captures the mindshare of the blockchain community, and for Community Points to provide the most utility, the Ethereum blockchain is the natural choice.
1.2 Why Arbitrum

While Ethereum's ecosystem is unmatched, the reality is that fees are high and capacity is too low to support the scale of Reddit Community Points. Enter Arbitrum. Arbitrum Rollup provides all of the ecosystem benefits of Ethereum, but with orders of magnitude more capacity and at a fraction of the cost of native Ethereum smart contracts. And most of all, we don't change the experience for users. They continue to use the same wallets, addresses, languages, and tools.

Arbitrum Rollup is not the only solution that can scale payments, but it is the only developed solution that can scale both payments and arbitrary smart contracts trustlessly, which means that third party users can build highly scalable add-on apps that can be used without withdrawing money from the Rollup chain. If you believe that Reddit users will want to use their Community Points in smart contracts -- and we believe they will -- then it makes the most sense to choose a single scaling solution that can support the entire ecosystem, eliminating friction for users. We view being able to run smart contracts in the same scaling solution as fundamentally critical, since if there's significant demand for running smart contracts from Reddit's ecosystem, this would be a load on Ethereum and would itself require a scaling solution. Moreover, having different scaling solutions for the minting/distribution/spending of points and for third party apps would be burdensome for users, as they'd have to constantly shuffle their Points back and forth.

2. Arbitrum at a glance

Arbitrum Rollup has a unique value proposition, as it offers a combination of features that no other scaling solution achieves. Here we highlight its core attributes.

Decentralized. Arbitrum Rollup is as decentralized as Ethereum. Unlike some other Layer 2 scaling projects, Arbitrum Rollup doesn't have any centralized components or centralized operators who can censor users or delay transactions. Even in non-custodial systems, centralized components pose a risk, as the operators are generally incentivized to increase their profit by extracting rent from users, often in ways that severely degrade user experience. Even if centralized operators are altruistic, centralized components are subject to hacking, coercion, and potential liability.

Massive scaling. Arbitrum achieves orders-of-magnitude scaling over Ethereum's L1 smart contracts. Our software currently supports 453 transactions-per-second for basic transactions (at 1616 Ethereum gas per tx). We have a lot of room left to optimize (e.g. aggregating signatures), and over the next several months capacity will increase significantly. As described in detail below, Arbitrum can easily support and surpass Reddit's anticipated initial load, and its capacity will continue to improve as Reddit's capacity needs grow.

Low cost. The cost of running Arbitrum Rollup is quite low compared to L1 Ethereum and other scaling solutions such as those based on zero-knowledge proofs. Layer 2 fees are low, fixed, and predictable, and should not be overly burdensome for Reddit to cover. Nobody needs to use special equipment or high-end machines. Arbitrum requires validators, which is a permissionless role that can be run on any reasonable on-line machine. Although anybody can act as a validator, in order to protect against a “tragedy of the commons” and make sure reputable validators are participating, we support a notion of “invited validators” that are compensated for their costs.
In general, users pay (low) fees to cover the invited validators' costs, but we imagine that Reddit may cover this cost for its users. See more on the costs and validator options below.
Ethereum Developer Experience. Not only does Arbitrum support EVM smart contracts, but the developer experience is identical to that of L1 Ethereum contracts and fully compatible with Ethereum tooling. Developers can port existing Solidity apps or write new ones using their favorite and familiar toolchains (e.g. Truffle, Buidler). There are no new languages or coding paradigms to learn.
Ethereum wallet compatibility. Just as in Ethereum, Arbitrum users need only hold keys, but do not have to store any coin history or additional data to protect or access their funds. Since Arbitrum transactions are semantically identical to Ethereum L1 transactions, existing Ethereum users can use their existing Ethereum keys with their existing wallet software such as Metamask.
Token interoperability. Users can easily transfer their ETH, ERC-20 and ERC-721 tokens between Ethereum and the Arbitrum Rollup chain. As we explain in detail below, it is possible to mint tokens in L2 that can subsequently be withdrawn and recognized by the L1 token contract.
Fast finality. Transactions complete with the same finality time as Ethereum L1 (and it's possible to get faster finality guarantees by trading away trust assumptions; see the Arbitrum Rollup whitepaper for details).
Non-custodial. Arbitrum Rollup is a non-custodial scaling solution, so users control their funds/points and neither Reddit nor anyone else can ever access or revoke points held by users.
Censorship Resistant. Since it's completely decentralized, and the Arbitrum protocol guarantees progress trustlessly, Arbitrum Rollup is just as censorship-proof as Ethereum.
Block explorer. The Arbitrum Rollup block explorer allows users to view and analyze transactions on the Rollup chain.
Limitations
Although this is a bake-off, we're not going to sugar coat anything. Arbitrum Rollup, like any Optimistic Rollup protocol, does have one limitation, and that's the delay on withdrawals. As for the concrete length of the delay, we've done a good deal of internal modeling and have blogged about this as well. Our current modeling suggests a 3-hour delay is sufficient (but as discussed in the linked post there is a tradeoff space between the length of the challenge period and the size of the validators' deposit). Note that this doesn't mean that the chain is delayed for three hours. Arbitrum Rollup supports pipelining of execution, which means that validators can keep building new states even while previous ones are "in the pipeline" for confirmation. As the challenge delays expire for each update, a new state will be confirmed (read more about this here). So activity and progress on the chain are not delayed by the challenge period. The only thing that's delayed is the consummation of withdrawals. Recall though that any single honest validator knows immediately (at the speed of L1 finality) which state updates are correct and can guarantee that they will eventually be confirmed, so once a valid withdrawal has been requested on-chain, every honest party knows that the withdrawal will definitely happen. There's a natural place here for a liquidity market in which a validator (or someone who trusts a validator) can provide withdrawal loans for a small interest fee.
This is a no-risk business for them, as they know which withdrawals will be confirmed (and can force their confirmation trustlessly no matter what anyone else does) but are just waiting for on-chain finality.
3. The recipe: How Arbitrum Rollup works
For a description of the technical components of Arbitrum Rollup and how they interact to create a highly scalable protocol with a developer experience that is identical to Ethereum, please refer to the following documents:
Arbitrum Rollup Whitepaper
Arbitrum academic paper (describes a previous version of Arbitrum)
4. Developer docs and APIs
For full details about how to set up and interact with an Arbitrum Rollup chain or validator, please refer to our developer docs, which can be found at https://developer.offchainlabs.com/. Note that the Arbitrum version described on that site is older and will soon be replaced by the version we are entering in the Reddit Bake-Off, which is still undergoing internal testing before public release.
5. Who are the validators?
As with any Layer 2 protocol, advancing the protocol correctly requires at least one validator (sometimes called a block producer) that is honest and available. A natural question is: who are the validators? Recall that the validator set for an Arbitrum chain is open and permissionless; anyone can start or stop validating at will. (A useful analogy is to full nodes on an L1 chain.) But we understand that even though anyone can participate, Reddit may want to guarantee that highly reputable nodes are validating their chain. Reddit may choose to validate the chain themselves and/or hire third-party validators. To this end, we have begun building a marketplace for validator-for-hire services so that dapp developers can outsource validation services to reputable nodes with high up-time. We've announced a partnership in which Chainlink nodes will provide Arbitrum validation services, and we expect to announce more partnerships shortly with other blockchain infrastructure providers. Although there is no requirement that validators are paid, Arbitrum's economic model tracks validators' costs (e.g. amount of computation and storage) and can charge small fees on user transactions, using a gas-type system, to cover those costs. Alternatively, a single party such as Reddit can agree to cover the costs of invited validators.
6. Reddit Contract Support
Since Arbitrum contracts and transactions are byte-for-byte compatible with Ethereum, supporting the Reddit contracts is as simple as launching them on an Arbitrum chain.
Minting. Arbitrum Rollup supports hybrid L1/L2 tokens which can be minted in L2 and then withdrawn onto the L1. An L1 contract at address A can make a special call to the EthBridge which deploys a "buddy contract" to the same address A on an Arbitrum chain. Since it's deployed at the same address, users can know that the L2 contract is the authorized "buddy" of the L1 contract on the Arbitrum chain. For minting, the L1 contract is a standard ERC-20 contract which mints and burns tokens when requested by the L2 contract. It is paired with an ERC-20 contract in L2 which mints tokens based on whatever programmer-provided minting facility is desired and burns tokens when they are withdrawn from the rollup chain. Given this base infrastructure, Arbitrum can support any smart-contract-based method for minting tokens in L2, and indeed we directly support Reddit's signature/claim-based minting in L2.
Batch minting. What's better than a mint cookie? A whole batch!
In addition to supporting Reddit's current minting/claiming scheme, we built a second minting design, which we believe outperforms the signature/claim system in many scenarios. In the current system, Reddit periodically issues signed statements to users, who then take those statements to the blockchain to claim their tokens. An alternative approach would have Reddit directly submit the list of users/amounts to the blockchain and distribute the tokens to the users without the signature/claim process. To optimize the cost efficiency of this approach, we designed an application-specific compression scheme to minimize the size of the batch distribution list. We analyzed the data from Reddit's previous distributions and found that the data is highly compressible, since token amounts are small and repeated, and addresses appear multiple times. Our function groups transactions by size and replaces previously-seen addresses with a shorter index value. We wrote client code to compress the data, wrote a Solidity decompressing function, and integrated that function into Reddit's contract running on Arbitrum. (A hypothetical sketch of this compression idea appears after the workload list below.) When we ran the compression function on the previous Reddit distribution data, we found that we could compress batched minting data down to 11.8 bytes per minting event (averaged over a 6-month trace of Reddit's historical token grants), compared with roughly 174 bytes of on-chain data needed for the signature/claim approach to minting (roughly 43 for an RLP-encoded null transaction + 65 for Reddit's signature + 65 for the user's signature + roughly 8 for the number of Points). The relative benefit of the two approaches with respect to on-chain calldata cost depends on the percentage of users that will actually claim their tokens on chain. With the above figures, batch minting will be cheaper if more than roughly 5% of users redeem their claims. We stress that our compression scheme is not Arbitrum-specific and would be beneficial in any general-purpose smart contract platform.
8. Benchmarks and costs
In this section, we give the full costs of operating the Reddit contracts on an Arbitrum Rollup chain, including the L1 gas costs for the Rollup chain, the costs of computation and storage for the L2 validators, as well as the capital lockup requirements for staking. Arbitrum Rollup is still on testnet, so we did not run mainnet benchmarks. Instead, we measured the L1 gas cost and L2 workload for Reddit operations on Arbitrum and calculated the total cost assuming current Ethereum gas prices. As noted below in detail, our measurements do not assume that Arbitrum is consuming the entire capacity of Ethereum. We will present the details of our model now, but for full transparency you can also play around with it yourself and adjust the parameters, by copying the spreadsheet found here. Our cost model is based on measurements of Reddit's contracts, running unmodified (except for the addition of a batch minting function) on Arbitrum Rollup on top of Ethereum. On the distribution of transactions and frequency of assertions: Reddit's instructions specify the following minimum parameters that submissions should support. Over a 5 day period, your scaling PoC should be able to handle:
100,000 point claims (minting & distributing points)
75,000 one-off points burning
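To make the compression idea above concrete, here is a hypothetical Python sketch (this is not Offchain Labs' actual encoder; the grouping key, data layout, and names are illustrative assumptions): mints are grouped by amount, and any address already seen in an earlier distribution is replaced by a short index into a shared address table.

```python
# Hypothetical sketch of the batch-mint compression idea (illustration only,
# not Offchain Labs' real encoder): amounts repeat, so group by amount, and
# repeated addresses shrink to small indices into a persistent table.
def compress_batch(mints, address_index):
    """mints: list of (address, amount); address_index: dict carried across
    distributions, mapping address -> small integer index."""
    by_amount = {}
    for addr, amount in mints:
        by_amount.setdefault(amount, []).append(addr)
    batch = []
    for amount, addrs in sorted(by_amount.items()):
        encoded = []
        for addr in addrs:
            if addr in address_index:
                encoded.append(("idx", address_index[addr]))  # a few bytes
            else:
                address_index[addr] = len(address_index)
                encoded.append(("addr", addr))                # full address
        batch.append((amount, encoded))
    return batch

table = {}
print(compress_batch([("0xaa", 5), ("0xbb", 5), ("0xaa", 7)], table))
```

A matching decompressor (in Solidity, per the text above) would walk the same structure in reverse to recover the full (address, amount) list on-chain.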
We provide the full costs of operating an Arbitrum Rollup chain with this usage under the assumption that tokens are minted or granted to users in batches, but other transactions are uniformly distributed over the 5 day period. Unlike some other submissions, we do not make unrealistic assumptions that all operations can be submitted in enormous batches. We assume that batch minting is done in batches that use only a few percent of an L1 block's gas, and that other operations come in evenly over time and are submitted in batches, with one batch every five minutes to keep latency reasonable. (Users are probably already waiting for L1 finality, which takes at least that long to achieve.) We note that assuming that there are only 300,000 transactions that arrive uniformly over the 5 day period will make our benchmark numbers lower, but we believe that this reflects the true cost of running the system. To see why, say that batches are submitted every five minutes (20 L1 blocks) and there's a fixed overhead of c bytes of calldata per batch, the cost of which gets amortized over all transactions executed in that batch. Assume that each individual transaction adds a marginal cost of t. Lastly, assume the capacity of the scaling system is high enough that it can support all of Reddit's 300,000 transactions within a single 20-block batch (i.e. that there are more than c + 300,000*t bytes of calldata available in 20 blocks). Consider what happens if c, the per-batch overhead, is large (which it is in some systems, but not in Arbitrum). In the scenario that transactions actually arrive at the system's capacity and each batch is full, then c gets amortized over 300,000 transactions. But if we assume that the system is not running at capacity -- and only receives 300,000 transactions arriving uniformly over 5 days -- then each 20-block assertion will contain about 200 transactions, and thus each transaction will pay a nontrivial cost due to c. We are aware that other proposals presented scaling numbers assuming that 300,000 transactions arrived at maximum capacity and were executed in a single mega-transaction, but according to our estimates, for at least one such report, this led to a reported gas price that was 2-3 orders of magnitude lower than it would have been assuming uniform arrival. We make more realistic batching assumptions, and we believe Arbitrum compares well when batch sizes are realistic.
Our model. Our cost model includes several sources of cost (a small numeric sketch of the amortization effect above follows this list):
L1 gas costs: This is the cost of posting transactions as calldata on the L1 chain, as well as the overhead associated with each batch of transactions, and the L1 cost of settling transactions in the Arbitrum protocol.
Validator’s staking costs: In normal operation, one validator will need to be staked. The stake is assumed to be 0.2% of the total value of the chain (which is assumed to be $1 per user who is eligible to claim points). The cost of staking is the interest that could be earned on the money if it were not staked.
Validator computation and storage: Every validator must do computation to track the chain’s processing of transactions, and must maintain storage to keep track of the contracts’ EVM storage. The cost of computation and storage are estimated based on measurements, with the dollar cost of resources based on Amazon Web Services pricing.
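As a quick numeric illustration of the amortization argument above (the per-batch overhead c and marginal cost t here are hypothetical values chosen only to show the effect, not measured Arbitrum figures):

```python
# Illustrative arithmetic only: c and t are hypothetical byte costs.
def calldata_per_tx(c, t, txs_per_batch):
    return c / txs_per_batch + t      # amortized batch overhead + marginal cost

total_txs = 300_000
batches = 5 * 24 * 12                 # one batch every 5 minutes for 5 days
per_batch = total_txs / batches       # ~208 transactions per batch

uniform = calldata_per_tx(c=10_000, t=20, txs_per_batch=per_batch)
one_mega_batch = calldata_per_tx(c=10_000, t=20, txs_per_batch=total_txs)
print(round(uniform, 2), round(one_mega_batch, 2))   # 68.0 vs 20.03
```

With a large c, uniform arrival makes every transaction pay a nontrivial share of the batch overhead, while a single mega-batch hides it almost entirely; the larger c is, the wider the gap, which is the discrepancy the text describes.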
It's clear from our modeling that the predominant cost is for L1 calldata. This will probably be true for any plausible rollup-based system. Our model also shows that Arbitrum can scale to workloads much larger than Reddit's nominal workload, without exhausting L1 or L2 resources. The scaling bottleneck will ultimately be calldata on the L1 chain. We believe that cost could be reduced substantially if necessary by clever encoding of data. (In our design, any compression / decompression of L2 transaction calldata would be done by client software and L2 programs, never by an L1 contract.)
9. Status of Arbitrum Rollup
Arbitrum Rollup is live on Ethereum testnet. All of the code written to date, including everything in the Reddit demo, is open source and permissively licensed under the Apache V2 license. The first testnet version of Arbitrum Rollup was released in February. Our current internal version, which we used to benchmark the Reddit contracts, will be released soon and will be a major upgrade. Both the Arbitrum design and its implementation have been heavily scrutinized by independent third parties: the Arbitrum academic paper was published at USENIX Security, a top-tier peer-reviewed academic venue, and for the Arbitrum software we have engaged Trail of Bits for a security audit, which is currently ongoing. We are committed to having a clean report before launching on Ethereum mainnet.
10. Reddit Universe Arbitrum Rollup Chain
The benchmarks described in this document were all measured using the latest internal build of our software. When we release the new software upgrade publicly, we will launch a Reddit Universe Arbitrum Rollup chain as a public demo, which will contain the Reddit contracts as well as a Uniswap instance and a Connext Hub, demonstrating how Community Points can be integrated into third-party apps. We will also allow members of the public to dynamically launch ecosystem contracts. We at Offchain Labs will cover the validating costs for the Reddit Universe public demo. If the folks at Reddit would like to evaluate our software prior to our public demo, please email us at [email protected] and we'd be more than happy to provide early access.
11. Even more scaling: Arbitrum Sidechains
Rollups are an excellent approach to scaling, and we are excited about Arbitrum Rollup, which far surpasses Reddit's scaling needs. But looking forward to Reddit's eventual goal of supporting hundreds of millions of users, there will likely come a time when Reddit needs more scaling than any Rollup protocol can provide. While Rollups greatly reduce costs, they don't break the linear barrier. That is, all transactions have an on-chain footprint (because all calldata must be posted on-chain), albeit a far smaller one than on native Ethereum, and the L1 limitations end up being the bottleneck for capacity and cost. Since Ethereum has limited capacity, this linear use of on-chain resources means that costs will eventually increase superlinearly with traffic. The good news is that we at Offchain Labs have a solution in our roadmap that can satisfy this extreme-scaling setting as well: Arbitrum AnyTrust Sidechains. Arbitrum Sidechains are similar to Arbitrum Rollup, but deviate in that they name a permissioned set of validators. When a chain's validators agree off-chain, they can greatly reduce the on-chain footprint of the protocol and require almost no data to be put on-chain. When validators can't reach unanimous agreement off-chain, the protocol reverts to Arbitrum Rollup.
Technically, Arbitrum Sidechains can be viewed as a hybrid between state channels and Rollup, switching back and forth as necessary, and combining the performance and cost that state channels can achieve in the optimistic case with the robustness of Rollup in other cases. The core technical challenge is how to switch seamlessly between modes and how to guarantee that security is maintained throughout. Arbitrum Sidechains break through this linear barrier, while still maintaining a high level of security and decentralization. Arbitrum Sidechains provide the AnyTrust guarantee, which says that as long as any one validator is honest and available (even if you don't know which one it will be), the L2 chain is guaranteed to execute correctly according to its code and guaranteed to make progress. Unlike in a state channel, offchain progress does not require unanimous consent, and liveness is preserved as long as there is a single honest validator. Note that the trust model for Arbitrum Sidechains is much stronger than for typical BFT-style chains, which introduce a consensus "voting" protocol among a small permissioned group of validators. BFT-based protocols require a supermajority (more than 2/3) of validators to agree. In Arbitrum Sidechains, by contrast, all you need is a single honest validator to achieve guaranteed correctness and progress. Notice that in Arbitrum, adding validators strictly increases security, since the AnyTrust guarantee provides correctness as long as any one validator is honest and available. By contrast, in BFT-style protocols, adding nodes can be dangerous, as a coalition of dishonest nodes can break the protocol. Like Arbitrum Rollup, the developer and user experiences for Arbitrum Sidechains will be identical to those of Ethereum. Reddit would be able to choose a large and diverse set of validators, and all that they would need to guarantee in order to break through the scaling barrier is that a single one of them will remain honest. We hope to have Arbitrum Sidechains in production in early 2021, so when Reddit reaches the scale that surpasses the capacity of Rollups, Arbitrum Sidechains will be waiting and ready to help. While the idea of switching between channels and Rollup to get the best of both worlds is conceptually simple, getting the details right and making sure that the switch does not introduce any attack vectors is highly non-trivial and has been the subject of years of our research (indeed, we were working on this design for years before the term Rollup was even coined).
12. How Arbitrum compares
We include a comparison to several other categories, as well as specific projects when appropriate, and explain why we believe that Arbitrum is best suited for Reddit's purposes. We focus our attention on other Ethereum projects.
Payment-only Rollups. Compared to Arbitrum Rollup, ZK-Rollups and other Rollups that only support token transfers have several disadvantages:
As outlined throughout the proposal, we believe that the entire draw of Ethereum is in its rich smart contract support, which is simply not achievable with today's zero-knowledge proof technology. Indeed, scaling with a ZK-Rollup will add friction to the deployment of smart contracts that interact with Community Points, as users will have to withdraw their coins from the ZK-Rollup and transfer them to a smart contract system (like Arbitrum). The community will be best served if Reddit builds on a platform that has built-in, frictionless smart-contract support.
All other Rollup protocols of which we are aware employ a centralized operator. While it's true that users retain custody of their coins, the centralized operator can often profit from censoring, reordering, or delaying transactions. A common misconception is that since they're non-custodial protocols, a centralized sequencer does not pose a risk; this is incorrect, as the sequencer can wreak havoc or shake down users for side payments without directly stealing funds.
Sidechain-type protocols can eliminate some of these issues, but they are not trustless. Instead, they require trust in some quorum of a committee, often requiring two-thirds of the committee to be honest, compared to rollup protocols like Arbitrum that require only a single honest party. In addition, not all sidechain-type protocols have committees that are diverse, or even non-centralized, in practice.
Plasma-style protocols have a centralized operator and do not support general smart contracts.
13. Concluding Remarks
While it's ultimately up to the judges' palate, we believe that Arbitrum Rollup is the bake-off choice that Reddit kneads. We far surpass Reddit's specified workload requirement at present, have much room to optimize Arbitrum Rollup in the near term, and have a clear path to get Reddit to hundreds of millions of users. Furthermore, we are the only project that gives developers and users an interface identical to the Ethereum blockchain and is fully interoperable and tooling-compatible, and we do all of this without any new trust assumptions or centralized components. But no matter how the cookie crumbles, we're glad to have participated in this bake-off and we thank you for your consideration.
About Offchain Labs
Offchain Labs, Inc. is a venture-funded New York company that spun out of Princeton University research, and is building the Arbitrum platform to usher in the next generation of scalable, interoperable, and compatible smart contracts. Offchain Labs is backed by Pantera Capital, Compound VC, Coinbase Ventures, and others.
Leadership Team
Ed Felten
Ed Felten is Co-founder and Chief Scientist at Offchain Labs. He is on leave from Princeton University, where he is the Robert E. Kahn Professor of Computer Science and Public Affairs. From 2015 to 2017 he served at the White House as Deputy United States Chief Technology Officer and senior advisor to the President. He is an ACM Fellow and a member of the National Academy of Engineering. Outside of work, he is an avid runner, cook, and L.A. Dodgers fan.
Steven Goldfeder
Steven Goldfeder is Co-founder and Chief Executive Officer at Offchain Labs. He holds a PhD from Princeton University, where he worked at the intersection of cryptography and cryptocurrencies, including threshold cryptography, zero-knowledge proof systems, and post-quantum signatures. He is a co-author of Bitcoin and Cryptocurrency Technologies, the leading textbook on cryptocurrencies, and he has previously worked at Google and Microsoft Research, where he co-invented the Picnic signature algorithm. When not working, you can find Steven spending time with his family, taking a nature walk, or twisting balloons.
Harry Kalodner
Harry Kalodner is Co-founder and Chief Technology Officer at Offchain Labs, where he leads the engineering team. Before the company he attended Princeton as a Ph.D. candidate, where his research explored economics, anonymity, and incentive compatibility of cryptocurrencies; he has also worked at Apple. When not up at 3:00am writing code, Harry occasionally sleeps.
Syscoin Platform’s Great Reddit Scaling Bake-off Proposal
https://preview.redd.it/rqt2dldyg8e51.jpg?width=1044&format=pjpg&auto=webp&s=777ae9d4fbbb54c3540682b72700fc4ba3de0a44
We are excited to participate and present Syscoin Platform's ideal characteristics and capabilities towards a well-rounded Reddit Community Points solution! Our scaling solution for Reddit Community Points involves 2-way peg interoperability with Ethereum. This will provide a scalable token layer built specifically for speed and high volumes of simple value transfers at a very low cost, while providing sovereign ownership and onchain finality. Token transfers scale by taking advantage of a globally sorting mempool that provides for probabilistically secure assumptions of "as good as settled". The opportunity here for token receivers is app-layer interactivity on the speed/security tradeoff (99.9999% assurance within 10 seconds). We call this Z-DAG, and it achieves high throughput across a mesh network topology presently composed of about 2,000 geographically dispersed full nodes. The network is similar to Bitcoin's; however, these nodes are incentivized to run full nodes for the benefit of network security, through a bonded validator scheme. These nodes do not participate in the consensus of transactions or block validation any differently than other nodes, and therefore do not degrade Bitcoin's validate-first-then-trust security model, which holds across every node. Each token transfer settles on-chain. The protocol follows Bitcoin Core policies, so it has adequate code coverage and protocol hardening to qualify as production-quality software. It shares a significant portion of Bitcoin's own hashpower through merged-mining. This platform as a whole can serve token microtransactions, larger settlements, and store-of-value in an ideal fashion, providing probabilistic scalability whilst remaining decentralized according to Bitcoin's design. It is accessible to ERC-20 via a permissionless and trust-minimized bridge that works in both directions. The bridge and token platform are currently available on the Syscoin mainnet. This has been gaining recent attention for use by loyalty point programs and stablecoins such as Binance USD.
Syscoin Foundation identified a few paths for Reddit to leverage this infrastructure, each with trade-offs. The first provides the most cost savings and scaling benefits at some sacrifice of token autonomy. The second offers more preservation of autonomy, with a narrower scope of cost savings than the first option, though savings nonetheless. The third introduces more complexity than the previous two yet provides the most overall benefits. We consider the third the most viable, as it enables Reddit to benefit even while retaining existing smart contract functionality. We will focus on the third option, and include the first two for good measure. The three options are labeled below.
Option 1: Distribution, burns and user-to-user transfers of Reddit Points are entirely carried out on the Syscoin network. This full-on approach to utilizing the Syscoin network provides the most scalability and transaction cost benefits of these scenarios. The tradeoff here is that distribution and subscription handling would likely migrate away from smart contracts into the application layer.
Option 2: The Reddit Community Points ecosystem can continue to use existing smart contracts as they are used today on the Ethereum mainchain. Users migrate a portion of their tokens to Syscoin, the scaling network, to gain much lower fees, scalability, and a proven base layer, without sacrificing sovereign ownership. They would use Syscoin for user-to-user transfers: tips redeemable in ten seconds or less, a high-throughput relay network, and onchain settlement at a block target of 60 seconds.
Option 3: Integration between Matic Network and Syscoin Platform - similar to Syscoin's current integration with Ethereum - will provide Reddit Community Points with EVM scalability (including the Memberships ERC777 operator) on the Matic side, and performant simple value transfers, robust decentralized security, and sovereign store-of-value on the Syscoin side. It's "the best of both worlds". The trade-off is more complex interoperability.
Syscoin + Matic Integration
Matic and Blockchain Foundry Inc, the public company formed by the founders of Syscoin, recently entered a partnership for joint research and business development initiatives. This is ideal for all parties as Matic Network and Syscoin Platform provide complementary utility. Syscoin offers characteristics for sovereign ownership and security based on Bitcoin’s time-tested model, and shares a significant portion of Bitcoin’s own hashpower. Syscoin’s focus is on secure and scalable simple value transfers, trust-minimized interoperability, and opt-in regulatory compliance for tokenized assets rather than scalability for smart contract execution. On the other hand, Matic Network can provide scalable EVM for smart contract execution. Reddit Community Points can benefit from both. Syscoin + Matic integration is actively being explored by both teams, as it is helpful to Reddit, Ethereum, and the industry as a whole.
Total cost for these 100k transactions: $0.63 USD. See the live fee comparison for a savings estimate between transactions on Ethereum and Syscoin. Below is a snapshot at the time of writing:
ETH price: $318.55
ETH gas price: 55.00 Gwei ($0.37)
Syscoin price: $0.11
(Snapshot of live fee comparison chart)
Z-DAG provides a more efficient fee market. A typical Z-DAG transaction costs 0.0000582 SYS. Tokens can be safely redeemed/re-spent within seconds or allowed to settle on-chain beforehand. Costs should remain about this low for microtransactions. Syscoin will achieve further fee reductions and even greater scalability with offchain payment channels for assets, with Z-DAG as a resilience fallback. New payment channel technology is one of the topics under research by the Syscoin development team with our academic partners at TU Delft. In line with the calculation in the Lightning Network white paper, payment channels using assets with Syscoin Core will bring theoretical capacity for each person on Earth (7.8 billion) to have five on-chain transactions per year, per person, without requiring anyone to enter a fee market (aka "wait for a block"). This exceeds the minimum LN expectation of two transactions per person, per year: one to exist on-chain and one to settle aggregated value.
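As a back-of-the-envelope check on the quoted total, multiplying the workload by the typical Z-DAG fee and the snapshot SYS price gives roughly the same figure (the small difference comes from price rounding):

```python
# Sanity check of the quoted total using the snapshot figures above.
txs = 100_000
fee_sys = 0.0000582        # typical Z-DAG transaction fee, in SYS
sys_usd = 0.11             # SYS price from the snapshot
print(f"${txs * fee_sys * sys_usd:.2f}")   # -> $0.64, close to the $0.63 quoted
```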
Tools to simplify using Syscoin Bridge as a service with dapps and wallets will be released some time after implementation of Syscoin Core 4.2. These will be based upon the same processes which are automated in the current live Sysethereum Dapp that is functioning with the Syscoin mainnet.
The Syscoin Ethereum Bridge is secured by Agent nodes participating in a decentralized and incentivized model that involves the roles of Superblock challengers and submitters. This model is open to participation. The benefits here are trust-minimization, permissionlessness, and potentially less legal/regulatory red tape than interop mechanisms that involve liquidity providers and/or trading mechanisms. The trade-off is that, due to the decentralized nature, there are cross-chain settlement times of one hour to cross from Ethereum to Syscoin, and three hours to cross from Syscoin to Ethereum. We are exploring ways to reduce this time while maintaining decentralization via zero-knowledge proofs. Even so, an "instant bridge" experience could be provided by means of a third-party liquidity mechanism. That option exists but is not required for bridge functionality today. Typically bridges are used with batch value, not with high frequencies of smaller values, and generally it is advantageous to keep some value on both chains for maximum availability of utility. Even so, the cross-chain settlement time is worth mentioning here.
Ethereum -> Syscoin: Matic or Ethereum transaction fee for bridge contract interaction; negligible Syscoin transaction fee for minting tokens.
Syscoin -> Ethereum: negligible Syscoin transaction fee for burning tokens; 0.01% transaction fee paid to the Bridge Agent in the form of the ERC-20; Matic or Ethereum transaction fee for contract interaction.
Zero-Confirmation Directed Acyclic Graph (Z-DAG) is an instant settlement protocol that is used as a complementary system to proof-of-work (PoW) in the confirmation of Syscoin service transactions. In essence, a Z-DAG is simply a directed acyclic graph (DAG) in which validating nodes verify the sequential ordering of transactions received in their memory pools. Z-DAG is used by the validating nodes across the network to ensure that there is absolute consensus on the ordering of transactions and that no balances are overflowed (no double-spends). A toy illustration of this mempool conflict check appears at the end of this Z-DAG overview.
Unique fee-market that is more efficient for microtransaction redemption and settlement
Uses decentralized means to enable tokens with value-transfer scalability comparable to or exceeding that of credit card networks
Provides high throughput and secure fulfillment even if blocks are full
Probabilistic and interactive
99.9999% security assurance within 10 seconds
Can serve payment channels as a resilience fallback that is faster and lower-cost than falling-back directly to a blockchain
Each Z-DAG transaction also settles onchain through Syscoin Core at 60-second block target using SHA-256 Proof of Work consensus
Z-DAG enables the ideal speed/security tradeoff to be determined per use-case in the application layer. It minimizes the sacrifice required to accept and redeem fast transfers/payments while providing more-than-ample security for microtransactions. This is supported on the premise that a Reddit user receiving points needs security, yet generally doesn't want or need to wait for the same level of security as a nation-state settling an international trade debt. In any case, each Z-DAG transaction settles onchain at a block target of 60 seconds.
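Here is the toy illustration promised above (a minimal sketch under simplifying assumptions, not Syscoin's implementation): a Z-DAG-style node tracks which outpoints its mempool has already seen spent and flags any conflicting re-spend, which is what lets receivers treat a transfer as "as good as settled" within seconds.

```python
# Toy sketch (not Syscoin code): fast double-spend detection in the mempool.
class ZdagMempool:
    def __init__(self):
        self.spent_by = {}   # outpoint -> txid that first spent it

    def accept(self, txid, inputs):
        # Reject any transaction re-spending an outpoint another tx already used.
        conflicts = [o for o in inputs if self.spent_by.get(o) not in (None, txid)]
        if conflicts:
            return False     # double-spend attempt: reject and alert peers
        for outpoint in inputs:
            self.spent_by[outpoint] = txid
        return True

node = ZdagMempool()
assert node.accept("tx1", ["utxo:0"])        # first spend is accepted
assert not node.accept("tx2", ["utxo:0"])    # conflicting re-spend is flagged
```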
Syscoin 3.0 White Paper (4.0 white paper is pending. For improved scalability and less blockchain bloat, some features of v3 no longer exist in current v4: Specifically Marketplace Offers, Aliases, Escrow, Certificates, Pruning, Encrypted Messaging)
16MB of block bandwidth per minute, assuming segwit witness-carrying transactions of ~200 bytes on average (implied throughput is sketched below)
SHA256 merge mined with Bitcoin
UTXO asset layer, with base Syscoin layer sharing identical security policies as Bitcoin Core
Z-DAG on asset layer, bridge to Ethereum on asset layer
On-chain scaling with prospect of enabling enterprise grade reliable trustless payment processing with on/offchain hybrid solution
Focus only on Simple Value Transfers. The MVP of the blockchain consensus footprint is balances and ownership of them. Everything else can reduce data availability in exchange for scale (the Ethereum 2.0 model). We leave that to other designs; we focus on transfers.
Future integrations of MAST/Taproot to get more complex value transfers without trading off trustlessness or decentralization.
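For a rough sense of what the 16MB-per-minute bandwidth figure above implies for simple transfers (illustrative arithmetic only, not a benchmark claim):

```python
# Implied ceiling from the stated figures: 16MB of block bandwidth per minute
# divided by ~200-byte transactions.
bandwidth_per_min = 16 * 1024 * 1024
avg_tx_bytes = 200
print(round(bandwidth_per_min / avg_tx_bytes / 60))   # ~1398 transfers/second
```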
Zero-knowledge proofs are a new cryptographic frontier. We are dabbling here to generalize the concept of bridging and also to verify the state of a chain efficiently. We also apply them in our digital identity projects at Blockchain Foundry (a publicly traded company which develops Syscoin software for clients). We are also looking to integrate privacy-preserving payment channels for off-chain payments through a zkSNARK hub-and-spoke design which does not suffer from the HTLC attack vectors evident on LN. Many of the issues plaguing Lightning Network can be resolved using a zkSNARK design whilst also providing the ability to do a multi-asset payment channel system. We have found a showstopper attack (an American call option) on LN if we were to use multiple assets; this would not exist in a system such as ours.
Web3 and mobile wallets are under active development by Blockchain Foundry Inc as WebAssembly applications, and are expected for release not long after mainnet deployment of Syscoin Core 4.2. Both of these will be multi-coin wallets that support Syscoin, SPTs, Ethereum, and ERC-20 tokens. The Web3 wallet will provide functionality similar to Metamask. Syscoin Platform and tokens are already integrated with Blockbook. Custom hardware wallet support currently exists via ElectrumSys. First-class HW wallet integration through apps such as Ledger Live will exist after 4.2. Currently supported wallets: Syscoin Spark Desktop, Syscoin-Qt.
For the last few months we've been following new zero-knowledge proof projects in Rust. This month, with Secret Network upgrading their mainnet with secret contracts, it seems like a good opportunity to explore Rust blockchains that are using a completely different privacy-preserving technology: secure enclaves. Secure enclaves are processes whose environment is protected from inspection by other processes, even the kernel, by special hardware. This protection particularly involves the encryption of a process's memory. Software that wants to compute in secret can put those computations inside a secure enclave and, if everything works as expected, neither a local user nor the hosting provider can snoop on the computations being performed. The most notable implementation of secure enclaves is Intel's SGX (Software Guard Extensions). Secure enclaves are an attractive way to perform private computation primarily because they don't impose any limitations on what can be computed: code that runs inside SGX is more-or-less just regular x86 code, running inside a special environment. But depending on SGX for privacy does have some special risks: software that runs in an SGX enclave must be signed (if only transitively) by Intel's own cryptographic keys, which means that Intel must approve of any software running in SGX, that Intel can revoke permission to use SGX, and that there is a risk of the signing keys being compromised. Furthermore, it's not obvious that secure enclaves are actually secure; there have already been a number of attacks against SGX. Regardless, as of now, hardware enclaves provide security features that aren't feasible any other way. There are two prominent Rust blockchains relying on SGX:
Secret Network is a programmable blockchain based on Cosmos / Tendermint that runs smart contracts written in Rust, and compiled to WASM, inside of secure enclaves.
MobileCoin is a private currency that aims to integrate with Signal, and that uses SGX to add additional confidentiality on top of RingCT transactions and its variant of the Stellar Consensus Protocol.
Outside of the blockchain world there are some other Rust projects using SGX, the most notable being:
Teaclave SGX SDK is an SDK for running Rust code inside SGX enclaves, developed at Baidu, and now an Apache project. MobileCoin uses a heavily modified fork.
Fortanix is a provider of various Rust+SGX services, and they provide an SGX SDK, for which mainline Rust has some built-in support.
Thanks so much to our anonymous donors. We don’t often receive donations, so this was a nice surprise! We intend to put all monetary contributions to use funding events or new contributors, and we’ll let you know what we do with the funds when we spend them.
Each month we like to shine a light on a notable Rust blockchain project. This month that project is… Aleo. Aleo is a zero-knowledge blockchain, with its own zero-knowledge programming language, Leo. We don’t have a lot to say about it, but we think it looks cool. We hope they blog more.
Rust blockchain development continued at its typical blistering pace, and again it's impossible to follow everything going on. This month we see continued advancement in zero-knowledge computing, an obvious focus from the entire blockchain industry on the DeFi phenomenon, and some new hackathons with opportunities for Rust developers. Every month seems to bring advancements in zero-knowledge proofs, and new implementations in Rust. It is a research area that will probably impact the general computing industry eventually, and one where the blockchain industry is leading the way, and one where Rust has a huge foothold. Even projects that are not written in Rust we see implementing their zero-knowledge cryptography in Rust. But this stuff is extremely technical, and improving at a rapid pace. We fear we will never understand it. There are several Rust blockchains now in development that are built around zero-knowledge VMs, whose smart contracts create zero-knowledge proofs:
Aleo. A new platform with its own zero-knowledge programming language, Leo.
Each month we like to shine a light on a notable Rust blockchain project. This month that project is… Fluence. This is a blockchain with built-in software license management. We're excited about this because license management is a rare non-currency use case for blockchains that makes a lot of sense. While we might expect to see more blockchain platforms devoted solely to digital licensing, Fluence is actually a complete distributed computing platform, with a unique vision about using license management to generate profit from open source software.
James Waugh shared big news from Secret Network. Privacy-preserving smart contracts are going live on Secret Network Tuesday, September 15! Now developers can build and deploy “secret contracts” with encrypted inputs, outputs, and state.
As interest picks up in crypto again, I want to share this post I made on privacy coins, just to give the basics of their evolution. This is only part 1, and parts 2 and 3 are not available in this format, but this part is informative and basic. If you're looking for a quick and easy way to assess what the best privacy coin in the current space is, which has the best features, or which is most likely to give high returns, then this is not that guide. My goal is to give you the power to make your own decisions, to clearly state my biases, and to educate. I really wanted to understand this niche of the crypto-space due to my background and current loyalties, and grasp the nuances of the features, origins and timelines of technologies used in privacy coins, while not being anything close to a developer myself. This is going to be a 3-part series, starting with an overview and basic review of the technology, then looking at its implications, and ending with why I like a specific project. It might be mildly interesting or delightfully educational. Cryptocurrencies are young, and existing privacy coins are deploying technology that is a work in progress. This series assumes a basic understanding of how blockchains work, specifically as used in cryptocurrencies. If you don't have that understanding, might I suggest that you get it? Because cryptocurrencies have a long way to go before reaching their end-game: when the world relies on the technology without understanding it. So, shall we do a deep dive into the privacy coin space?
FIRST THERE WAS BITCOIN
Cryptocurrencies allow you to tokenize value and track its exchange between hands over time, with transaction information verified by a distributed network of users. The most famous cryptocurrency in use is Bitcoin, defined as peer-to-peer electronic cash. Posted anonymously in 2008, the whitepaper seemed to be in direct response to the global financial meltdown and public distrust of the conventional banking and financing systems. Cryptographic techniques are used in Bitcoin to ensure that (i) only the owner of a specific wallet has the authority to spend funds from that wallet, (ii) the public address is linked to, but cannot be traced by a third party back to, the owner's private key, and (iii) the information is stored via cryptographic hashing in a merkle tree structure to ensure data integrity. Even so, the actual transaction information is publicly visible on the blockchain and can be traced back to the individual through chain analysis. This has raised fears of possible financial censorship or the metaphorical tainting of money due to its origination point, as demonstrated in the Silk Road marketplace disaster. This can happen because fiat money is usually exchanged for cryptocurrency at some point, as crypto-enthusiasts are born in the real world and inevitably cash out. There are already chain analysis firms and software that are increasingly efficient at tracking transactions on the Bitcoin blockchain. This lack of privacy is one of the limitations of Bitcoin that has resulted in the creation of altcoins that experiment with the different features a cryptocurrency can have. Privacy coins are figuring out how to introduce privacy on top of the payment network. The goal is to make the cryptocurrency fungible, each unit able to be exchanged for equal value without knowledge of its transaction history – like cash, while being publicly verifiable on a decentralized network. In other words, anyone can add the math up without being able to see the full details. Some privacy solutions and protocols have popped up as a result:
CRYPTONOTE – RING SIGNATURES AND STEALTH ADDRESSES
Used in: Monero and Particl (as its successor RingCT), Bytecoin.
In December 2012, CryptoNote introduced the use of ring signatures and stealth addresses (along with other notable features such as its own codebase) to improve cryptocurrency privacy. An updated CryptoNote version 2 came in October 2013 (though there is some dispute over this timeline), also authored under the name Nicolas van Saberhagen. Ring signatures hide sender information by having the sender sign a transaction using a signature that could belong to multiple users. This makes a transaction untraceable. Stealth addresses allow a receiver to publish a single address which generates a different public address for funds to be received at each time funds are sent to it. That makes a transaction unlinkable. In terms of privacy, CryptoNote gave us a protocol for untraceable and unlinkable transactions. The first implementation of CryptoNote technology was Bytecoin in March 2014 (timeline disputed), which spawned many children (forks) in subsequent years, a notable example being Monero, based on CryptoNote v2, in April 2014. A toy sketch of the stealth-address idea follows the pros and cons below.
RING SIGNATURES and STEALTH ADDRESSES
Pros:
– Provides sender and receiver privacy
– Privacy can be default
– Mature technology
– Greater scalability with bulletproofs
– Does not require any third-party
Cons:
– Privacy not very effective without high volume
– Does not hide transaction information if not combined with another protocol
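To make "unlinkable" concrete, here is a toy Python sketch of the stealth-address idea, using a simplified integer Diffie-Hellman exchange in place of CryptoNote's actual elliptic-curve math (all constants and names are illustrative assumptions, not the real protocol):

```python
# Toy stealth-address sketch: the receiver publishes one long-term key, yet
# each payment lands at a fresh one-time address only they can link to.
import hashlib
import secrets

P = 2**127 - 1   # toy prime modulus (illustration only)
G = 5            # toy generator

def keygen():
    sk = secrets.randbelow(P - 2) + 1
    return sk, pow(G, sk, P)

def one_time_address(receiver_pub):
    """Sender derives a fresh address via an ephemeral Diffie-Hellman step."""
    r = secrets.randbelow(P - 2) + 1      # ephemeral secret
    R = pow(G, r, P)                      # published alongside the payment
    shared = pow(receiver_pub, r, P)      # shared secret
    return R, hashlib.sha256(str(shared).encode()).hexdigest()

def receiver_recognizes(sk, R, addr):
    """Receiver recomputes the shared secret from R and checks the address."""
    shared = pow(R, sk, P)
    return hashlib.sha256(str(shared).encode()).hexdigest() == addr

sk, pk = keygen()
R, addr = one_time_address(pk)
assert receiver_recognizes(sk, R, addr)   # only the receiver can link it
```

An outside observer sees a different addr for every payment and cannot connect any of them to the receiver's published key, which is the unlinkability property described above.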
Used in: Dash.
Bitcoin developer Gregory Maxwell proposed a set of solutions to bring privacy to Bitcoin and cryptocurrencies, the first being CoinJoin (January 28 – Aug 22, 2013). CoinJoin allows multiple users to combine their transactions into a single transaction, by receiving inputs from multiple users and then sending their outputs to the multiple users, irrespective of who in the group the inputs came from. So the receiver will get whatever output amount they were supposed to, but it cannot be directly traced to its origination input. Similar proposals include CoinShuffle in 2014 and TumbleBit in 2016, building on CoinJoin but not terribly popular. They removed the need for a trusted third party to 'mix' the transactions. There are CoinJoin implementations that are being actively worked on but are not the most popular privacy solutions of today. A notable coin that uses CoinJoin technology is Dash, launched in January 2014, with masternodes in place of a trusted party. A toy sketch of the CoinJoin idea follows the pros and cons below.
COINJOIN
Pros:
– Provides sender and receiver privacy
– Easy to implement on any cryptocurrency
– Lightweight
– Greater scalability with bulletproofs
– Mature technology
Cons:
– Least anonymous privacy solution; transaction amounts can be calculated
– Even without a third-party mixer, depends on wealth centralization of masternodes
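A minimal sketch of the CoinJoin idea described above (toy code under simplifying assumptions, not any wallet's actual implementation):

```python
# Toy CoinJoin: several users' inputs and outputs are merged into a single
# transaction, so an observer cannot pair an input with its output.
import random

def coinjoin(contributions):
    """contributions: list of (input_utxo, output_address, amount) per user."""
    inputs = [(utxo, amt) for utxo, _, amt in contributions]
    outputs = [(addr, amt) for _, addr, amt in contributions]
    random.shuffle(inputs)    # break any positional correspondence
    random.shuffle(outputs)
    assert sum(a for _, a in inputs) == sum(a for _, a in outputs)
    return {"inputs": inputs, "outputs": outputs}

tx = coinjoin([
    ("utxo_alice", "addr_a2", 10),
    ("utxo_bob",   "addr_b2", 10),
    ("utxo_carol", "addr_c2", 10),
])
# Note the equal amounts: as the cons above point out, unequal amounts can
# be matched back up, which is why CoinJoin alone is comparatively weak.
```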
Used in: Zcoin, PIVX.
In May 2013, the Zerocoin protocol was introduced by Johns Hopkins University professor Matthew D. Green and his graduate students Ian Miers and Christina Garman. In response to the need for a third party to do CoinJoin, the Zerocoin proposal allowed for a coin to be destroyed and remade in order to erase its history whenever it is spent. Zero-knowledge cryptography and zero-knowledge proofs are used to prove that the new coins for spending are being appropriately made. A zero-knowledge proof allows one party to prove to another that they know specific information, without revealing any information about it other than the fact that they know it. (A toy zero-knowledge proof follows the pros and cons below.) Zerocoin was not accepted by the Bitcoin community as an implementation to be added to Bitcoin, so a new cryptocurrency had to be formed. Zcoin was the first cryptocurrency to implement the Zerocoin protocol, in 2016.
ZEROCOIN
Pros:
– Provides sender and receiver privacy
– Supply can be audited
– Relatively mature technology
– Does not require a third-party
Cons:
– Requires trusted setup (may not be required with the Sigma protocol)
– Large proof sizes (not lightweight)
– Does not provide full privacy for transaction amounts
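To give a feel for what a zero-knowledge proof is, here is a toy interactive Schnorr proof of knowledge of a discrete logarithm; it is a genuine (honest-verifier) zero-knowledge protocol, though Zerocoin itself relies on different, accumulator-based proofs, and the constants here are illustrative only:

```python
# Toy Schnorr proof: prove knowledge of x with y = g^x mod p, revealing
# nothing about x beyond the fact that the prover knows it.
import secrets

p = 2**127 - 1   # toy prime; real systems use carefully chosen groups
g = 5

x = secrets.randbelow(p - 2) + 1   # prover's secret
y = pow(g, x, p)                   # public statement

k = secrets.randbelow(p - 2) + 1   # prover commits to a random nonce
t = pow(g, k, p)

c = secrets.randbelow(2**64)       # verifier's random challenge

s = (k + c * x) % (p - 1)          # response: x appears only masked by k

# Verifier accepts iff g^s == t * y^c (mod p)
assert pow(g, s, p) == (t * pow(y, c, p)) % p
```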
Used in: Zcash, Horizen, Komodo, Zclassic, Bitcoin Private.
In May 2014, the current successor to the Zerocoin protocol, Zerocash, was created, also by Matthew Green and others (Eli Ben-Sasson, Alessandro Chiesa, Christina Garman, Matthew Green, Ian Miers, Eran Tromer, Madars Virza). It improved upon the Zerocoin concept by taking advantage of zero-knowledge proofs called zk-SNARKs (zero-knowledge succinct non-interactive arguments of knowledge). Unlike Zerocoin, which hid coin origins and payment history, Zerocash is faster, has smaller transaction sizes, and hides transaction information about the sender, the receiver, and the amount. Zcash was the first cryptocurrency to implement the Zerocash protocol, in 2016.
ZEROCASH
Pros:
– Provides full anonymity: sender, receiver and amount hidden
– Privacy can be default?
– Fast, due to small proof sizes
– Payment amount can be optionally disclosed for auditing
– Does not require any third-party
Cons:
– Requires trusted setup (may be improved with zk-STARK technology)
– Supply cannot be audited, and coins can potentially be forged without proper implementation
– Private transactions computationally intensive (improved with the Sapling upgrade)
Used in: Monero and Particl (with ring signatures, as RingCT).
The next proposal from Maxwell was that of confidential transactions, proposed in June 2015 as part of the Sidechain Elements project from Blockstream, where Maxwell was Chief Technology Officer. It proposed to hide the transaction amount and asset type (e.g. deposits, currencies, shares), so that only the sender and receiver are aware of the amount, unless they choose to make the amount public. It uses homomorphic encryption to encrypt the inputs and outputs by using blinding factors, and a kind of ring signature in a commitment scheme, so the amount can be 'committed' to without the amount actually being known. I'm terribly sorry if you now have the urge to go and research exactly what that means. The takeaway is that the transaction amount can be hidden from outsiders while being verifiable; a toy commitment sketch follows the pros and cons below.
CONFIDENTIAL TRANSACTIONS
Pros:
– Hides transaction amounts
– Privacy can be default
– Mature technology
– Does not require any third-party
Cons:
– Only provides transaction amount privacy when used alone
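Here is the toy commitment sketch promised above: a simplified Pedersen-style commitment over integers mod p (real confidential transactions work in elliptic-curve groups and add range proofs; these constants are illustrative assumptions):

```python
# Toy Pedersen-style commitment: C = g^v * h^r mod p hides the value v, yet
# anyone can check that committed inputs and outputs balance.
import secrets

p = 2**127 - 1
g, h = 5, 7   # toy generators; in practice no one may know log_g(h)

def commit(value, blinding):
    return (pow(g, value, p) * pow(h, blinding, p)) % p

r_in = secrets.randbelow(p - 2) + 1
r_out1 = secrets.randbelow(p - 2) + 1
r_out2 = (r_in - r_out1) % (p - 1)   # blinding factors must balance too

c_in = commit(10, r_in)                                # one input of 10
c_out = (commit(7, r_out1) * commit(3, r_out2)) % p    # outputs of 7 and 3

# Homomorphic check: inputs equal outputs, though 10, 7 and 3 stay hidden.
assert c_in == c_out
```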
Used in: Monero, Particl.
Then came Ring Confidential Transactions (RingCT), proposed by Shen Noether of Monero Research Lab in October 2015. RingCT combines the use of ring signatures for hiding sender information with the use of confidential transactions (which also use ring signatures) for hiding amounts. The proposal described a new type of ring signature, a Multilayered Linkable Spontaneous Anonymous Group signature, which "allows for hidden amounts, origins and destinations of transactions with reasonable efficiency and verifiable, trustless coin generation". RingCT was implemented in Monero in January 2017 and made mandatory after September 2017.
RING-CONFIDENTIAL TRANSACTIONS
Pros:
– Provides full anonymity; hides transaction amounts and receiver privacy
– Privacy can be default
– Mature technology
– Greater scalability with bulletproofs
– Does not require any third-party
Cons:
– Privacy not very effective without high volume
Used in: Grin.
Mimblewimble was proposed in July 2016 by pseudonymous contributor Tom Elvis Jedusor and further developed in October 2016 by Andrew Poelstra. Mimblewimble is a "privacy and fungibility focused cryptocoin transaction structure proposal". The key words are transaction structure proposal: the way the blockchain is built is different, in order to accommodate privacy and fungibility features. Mimblewimble uses the concept of confidential transactions to keep amounts hidden, looks at private keys and transaction information to prove ownership of funds rather than using addresses, and bundles transactions together instead of listing them separately on the blockchain. It also introduces a novel method of pruning the blockchain. Grin is a cryptocurrency in development that is applying Mimblewimble. Mimblewimble is early in development and you can understand it more here.
MIMBLEWIMBLE
Pros:
– Hides transaction amounts and receiver privacy
– Privacy is on by default
– Lightweight
– No public addresses?
Cons:
– Privacy not very effective without high volume
– Sender and receiver must both be online
– Relatively new technology
Fresh off the minds of brilliant cryptographers (Sean Bowe, Alessandro Chiesa, Matthew Green, Ian Miers, Pratyush Mishra, Howard Wu), in October 2018 Zexe proposed a new cryptographic primitive called 'decentralized private computation'. It allows users of a decentralized ledger to "execute offline computations that result in transactions", but also keeps transaction amounts hidden and allows transaction validation to happen at any time regardless of computations being done online. This can have far-reaching implications for privacy coins in the future. Consider cases where transactions need to be automatic and private, without both parties being present.
Privacy technologies that address network privacy, as nodes communicate with each other, are important considerations, rather than just looking at privacy on the blockchain itself. Anonymity layers encrypt and/or reroute data as it moves among peers, so it is not obvious who it originates from on the network. They are used to protect against surveillance or censorship from ISPs and governments. The Invisible Internet Project (I2P) is an anonymous network layer that uses end-to-end encryption for peers on a network to communicate with each other. Its history dates back to 2003. Kovri is a Monero-created implementation of I2P. The Onion Router (Tor) is another anonymity layer, used by the privacy cryptocurrency Verge; but Tor's historical link to the US government may be concerning to some. Dandelion transaction relay is an upcoming Bitcoin improvement proposal (BIP) that scrambles IP data, providing network privacy for Bitcoin as transactions and other information are transmitted.
Monero completed bulletproofs protocol updates that reduce RingCT transaction sizes and thus transaction fee costs. (Bulletproofs are a replacement for the range proofs used in confidential transactions, which aid in encrypting inputs and outputs by making sure they add to zero.) The Sigma protocol is being actively researched by the Zcoin team as of 2018 to replace the Zerocoin protocol so that a trusted setup is not required. There is also a possible replacement for zk-SNARKs, called zk-STARKs, another form of zero-knowledge proof technology, that may make a trusted setup unnecessary for zero-knowledge proof coins.
PART 1 CONCLUSION OF THE PRIVACY COIN GUIDE ON THE TECHNOLOGY BEHIND PRIVACY COINS
Although Bitcoin is still a groundbreaking technology that gives us a trustless transaction system, it has failed to live up to its expectations of privacy. Over time, new privacy technologies have arrived, and are arriving, with innovative and exciting solutions for Bitcoin's lack of fungibility. It is important to note that these technologies are built on prior research and application, but here we are considering their use in cryptocurrencies. Protocols are proposed based on cryptographic concepts that show how they would work, and then developers actually implement them. Please note that I did not include the possibility of improper implementation as a disadvantage, and the advantages assume that the technical development is well done. A very important point is that coins can also adopt new privacy technologies as their merits become obvious, even if they started with a specific privacy protocol. Furthermore, I am, unfortunately, positive that this is not an exhaustive overview and I am only covering publicized solutions. Next, we'll talk more about the pros and cons and give an idea of how the coins can be compared. There's a video version that can be watched, and you can find out how to get the second two parts if you want on my website (video link on the page): https://cryptoramble.com/guide-on-privacy-coins/