Hacker News | shanecoin's comments

It has been the goal since day one.

The Ethereum protocol was designed with PoS in mind and has a built-in difficulty bomb[1] to prove it. In short, this difficulty bomb makes it exponentially harder to mine ETH over time. The goal of this feature was to encourage all participants of the ecosystem to transition to PoS as quickly as possible.
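As a hedged sketch of how the bomb grows (my reading of the Homestead-era spec; the exact constants and offsets shifted across later forks), the "ice age" term doubles every 100,000 blocks, so it eventually dominates everything else in the difficulty calculation:

```python
# Hedged sketch of the "ice age" term in Ethereum's difficulty formula,
# per my reading of the Homestead-era spec (constants changed in later
# forks): the term doubles every 100,000 blocks.
def bomb_component(block_number: int) -> int:
    period = block_number // 100_000
    return 2 ** (period - 2) if period >= 2 else 0
```

Doubling per period is what makes mining exponentially harder over time, which is the pressure toward PoS described above.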

That said, the implementation has not worked out entirely as expected, as the difficulty bomb has been pushed back a few times over the years. But to answer your question, the reason they did not move faster is that this transition is hard and enters some uncharted territory.

[1] https://medium.com/fullstacked/the-ice-age-is-coming-ee5ad5f...


I don’t think PoS was a thing when Ethereum was invented.


It's mentioned in the whitepaper - https://ethereum.org/en/whitepaper/

> Note that in the future, it is likely that Ethereum will switch to a proof-of-stake model for security, reducing the issuance requirement to somewhere between zero and 0.05X per year.


Interesting, in any case I don’t think any decision was made to ease the transition at the time.


It absolutely was. Peercoin (PPC) and even DPoS (BTS) predate Ethereum. What are they worth today? PoS is a scam -- and Diem is worse -- it is dystopia.


Mkay


I hear this argument a lot and have a genuine question.

How does Bitcoin electricity usage compare to something that achieves a similar goal?

A good example is gold—many people compare Bitcoin to gold. What are the relative electricity costs of the two, and does that justify the cost of either asset? This would take into account the electricity costs of mining, labor, supply chain, storage, etc.


It's easy to quantify with Bitcoin, less so with gold. Transportation, security, all those things make gold an inefficient store of value. At least with cryptocurrencies, many are being actively worked on that use far less energy.


I think we should consider the possibility that Bitcoin greatly incentivizes saving and disincentivizes consumption, at least while prices are increasing as quickly as they are.


If people are unemployed or underemployed and you save regardless, that wasted productive work doesn't magically reappear in the future. It's just gone.


The difference is that gold has an actual physical use case, so it's hard to compare these assets.


Bitcoin is in the same league as all datacenters in the world combined.

Datacenters use about 205 TWh/year.

https://www.networkworld.com/article/3531316/data-center-pow...

Bitcoin uses about 75 TWh/year.

https://digiconomist.net/bitcoin-energy-consumption

Visa operates out of just two datacenters. They are famously secretive about their operations so I don't know how much power they use. The most I've found is that they have 'multi-megawatt' datacenters, which means they probably use more than three megawatts and less than 100.

https://youtu.be/fpmS0JYyCGs?t=66

Bitcoin uses about 8.5 gigawatts. Clearly it uses at least a few orders of magnitude more electricity than Visa. It can process four transactions per second compared to about 25,000 for Visa, so the energy per transaction is even worse by a few more orders of magnitude.

The inefficiency of Bitcoin is truly mind-boggling. If I were trying to write a hit piece, I wouldn't have the guts to make up numbers as bad as the reality.
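A quick back-of-envelope check of those figures (plugging in the estimates quoted above, not independent measurements of my own):

```python
# Back-of-envelope check using the estimates quoted above:
# 75 TWh/year for Bitcoin and ~4 transactions per second.
TWH_PER_YEAR = 75
HOURS_PER_YEAR = 365.25 * 24

avg_power_gw = TWH_PER_YEAR * 1e12 / HOURS_PER_YEAR / 1e9  # average draw in GW
tx_per_year = 4 * 365.25 * 24 * 3600                       # ~126 million tx/year
kwh_per_tx = TWH_PER_YEAR * 1e9 / tx_per_year              # energy per transaction
```

Under those inputs the average draw works out to roughly 8.5 GW and the per-transaction energy to several hundred kWh, consistent with the comparison above.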


What about all the cascading costs of running the non-crypto financial systems? Not just data centers, but everything related: infrastructure, real estate, credit card readers, point-of-sale systems, office furniture, salaries, etc. The list goes on and on. If crypto were used globally, many of these costs and much of this energy waste would disappear.

Also, some currencies like Ethereum are moving to proof of stake, which requires less energy.


> cryptocurrency has a finite volume of transferrable currency

This is true of Bitcoin. Others, such as Ethereum, have a known inflation schedule, which would help alleviate the issue you mentioned with it being a currency of the future. Furthermore, it is looking like the "currency of the future" may (at least in the short term) look more like a cryptocurrency that is worth $1 and is backed 1:1 by a dollar in a bank account. These are known as stablecoins and are an in-demand topic right now for central banks around the world.

Bitcoin has a property of having a known, fixed supply. This allows it to serve as a store of value. It can and is used for day-to-day transactions, but there may need to be more advancements to make these types of transactions viable long-term (such as the lightning network).


Ethereum doesn't have a known inflation schedule. It has an arbitrary inflation schedule that is decided by the Ethereum developers.

The idea of cryptocurrency is to replace fiat currencies, with an engineered system instead of a political system. There's no benefit in a cryptocurrency that's backed by a fiat currency. It's like building a stone house on a swamp.


Couldn’t a government just destroy a stable coin by seizing the currency backing it? Isn’t the point of a cryptocurrency supposed to be that it’s not controlled by any government?


I don’t know if a government today could just seize crypto, but the USA did so with gold in the past. Imagine owning a whole bunch of gold, and then waking up one morning & finding out it’d been replaced with US Dollars.

https://en.wikipedia.org/wiki/Executive_Order_6102


A stablecoin doesn't necessarily have to be backed by fiat; DAI is backed by crypto assets, and those assets can fluctuate and sometimes need to be increased during volatility.

I think the point of fiat-backed stablecoins is more about getting around regulations (e.g. Tether) and providing a fiat on-ramp into the ecosystem. Certainly if the government seized a large portion of Tether's assets, that could affect other assets, but this has actually already happened with a small portion (not a large one) of Tether's assets, and nothing catastrophic happened.


Hasn't panned out that way for Tether... So far...


> Telegram supports end-to-end encryption ("secret chats") with no logging -- as far as I know there is no proof that these chats are untrustworthy.

The argument I've heard is that Telegram uses their own encryption protocol. The rule of thumb in cryptography is "don't roll your own crypto".

The reason why that statement exists is because there are _countless_ examples of teams coming up with their own, new cryptographic mechanisms that either break (intentionally or not) or were written with a backdoor. People get incredibly clever when it comes to breaking encryption.

AFAIK the only way to be on the right side of this argument is to use a time-tested encryption protocol. Even then, there are instances where a protocol has been live in production for years before someone discovered a backdoor that had been in the code since day one.
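To make the pitfall concrete with a hypothetical example (not from Telegram's code): even one naive choice, like comparing MAC tags with `==` instead of a constant-time comparison, can silently open a timing side channel, while the vetted stdlib helper handles it for you:

```python
import hashlib
import hmac

# Hypothetical illustration of a subtle hand-rolled-crypto bug: comparing
# tags with `==` can leak timing information; the vetted helper
# hmac.compare_digest is designed to avoid exactly that.
def verify_tag(key: bytes, message: bytes, tag: bytes) -> bool:
    expected = hmac.new(key, message, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)
```

The function names here are made up for illustration; the point is that a system can pass every functional test while still being broken in ways only an attacker notices.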


> The rule of thumb in cryptography is "don't roll your own crypto".

This phrase is tiring to hear in this form, and your understanding of it seems incomplete. Signal also rolled its own crypto, but you don't see anyone calling it insecure for that reason. The phrase is meant to tell non-cryptographers not to roll their own crypto, because the chances of introducing vulnerabilities are high. In Telegram's case, the company defends its protocol by saying it was created by people with PhDs in mathematics (which is related to and foundational for, but different from, cryptography). Telegram's encryption protocol (the second version) has not been broken by anyone to date.


>In the case of Telegram, the company defends its protocol saying that it’s been created by people with PhD in mathematics (which is related to and is foundational for, but different from, cryptography).

It was created by Nikolai Durov, who has a PhD in geometry. That's like a gynaecologist performing brain surgery. Specialization matters. Sure, both took a human anatomy 101 class in college, but somewhere along the way they went and spent their ENTIRE careers doing different things. It's easier to get another degree in medical science, sure, but in this case the gynaecologist did not; they just started cutting into the brain with a kitchen knife, and just because their patients haven't died yet doesn't mean they have the credentials to abandon best practices.


MTProto 1 was the problematic protocol that continues to haunt Telegram despite its deprecation in favor of MTProto 2, which is built on standard crypto primitives.


This is a fun phrase that, as a non-crypto person, seems reasonable to me, but I always wonder if there's something of a confirmation bias at work.

> The reason why that statement exists is because there are _countless_ examples of teams coming up with their own, new cryptographic mechanisms that either break...

But aren't there _countless_ examples of this in crypto made by cryptographers?

I'm not playing devil's advocate, I don't really have a stake here. :)


Not a crypto expert either, but from what I've gleaned listening to e.g. Peter Gutmann describe evaluating new crypto mechanisms, you'll see that:

1. Actual cryptographers usually design with a set of constraints that make their crypto work: those might be about compute power, or memory bandwidth, or what have you, that make an algorithm difficult to brute force.

2. The algorithm will typically be peer-reviewed to try to weed out mistakes, either fundamental mathematical ones, or in the assumptions.

3. The implementation then needs to be high quality.

There is certainly no shortage of examples where systems that pass 1 & 2 are undermined by failures in 3. And all algorithms are susceptible to the context around 1 changing (shifts in compute power, and so on).

When you go it alone, you're assuming you won't make any mistakes in any of these. That seems a pretty tall order.


The principle is that cryptography is so hard that even the experts screw up, and a non-expert's chances of screwing up are much greater.


I would phrase it more as "there is no margin of error in cryptography".

In machine learning, a model that works 90%, 95% of the time is pretty good.

A glitchy video game can still be fun as long as the glitches don't happen too often or cause a loss of too much progress.

Even a filesystem that doesn't lose most people's data, most of the time, will have a lot of adherents.

But if your cryptography implementation isn't completely perfect, it's frequently just 100% useless for its intended purpose.


What really sets cryptography apart is that for a non-expert, there is no way to tell whether it's correct or not. Most bad software has bugs that can be found by users. A bad ML model will do poorly in validation.

But a bad crypto implementation will work. For all intents and purposes, it will appear completely fine. Users will get their messages. The bitstream will appear completely random. At least, until somebody with expertise in breaking crypto systems digs into it.


>But a bad crypto implementation will work. For all intents and purposes, it will appear completely fine. Users will get their messages. The bitstream will appear completely random. At least, until somebody with expertise in breaking crypto systems digs into it.

And that applies even if they're using AES but with the wrong mode of operation. It applies even if they're using best practices like AES-GCM but the CPU doesn't support AES-NI and a cache-timing attack allows key exfiltration.

Like Swiftonsecurity wrote:

"Cryptography is nightmare magic math that cares what kind of pen you use. Should math care what kind of pen you use to implement it? No, but Fuck You, this is Cryptography."

The attacks are incredibly subtle even for the best systems, and Telegram is so far from even adequate that it's difficult to overstate, so I'll try with my best restraint:

TELEGRAM FUCKING LEAKS EVERY GROUP MESSAGE TO THE SERVER WHICH IS THE EXACT EQUIVALENT OF A FUCKING BACK DOOR.

I hope that didn't leave anything ambiguous.


Group messages on Telegram and normal messages are explicitly not encrypted in order to allow multi-device operation. That is explicit. I don't see how that has anything to do with the security of MTProto.


Also notable is that it can't be fixed or patched the way you'd expect for any other software: once it's found broken, everything that ever used it is broken unless it's re-encrypted. There's no migration path to the fixed version.


If Telegram crypto actually worked, they'd make far more money licensing the crypto than just running a chat app.


Who would buy the licenses? In a world where the libraries for the Signal protocol are free, lol.


Assuming that money is what they're after. Have you read Durov's channel on Telegram? Also, having invented the Russian Facebook and been forced to sell it to the Kremlin, I don't think he needs any more money. He's playing a totally different game. I don't know which one, though.


Pavel Durov is so mind-bogglingly rich that he had a good shot at being a Russian oligarch, and decided not to. I don't think that's an argument.


The thing about crypto is that it works in pieces. I'm sure it takes 12+ libraries to make end-to-end encryption work. Signal has this same problem.


This seems like a great way for newcomers to learn the basics of `git` through GitHub. A great move by GitHub to try to capture this market of users who may not yet be comfortable with version control.


That being said, I'm a little too annoyed that "GitHub" is synonymous with "Git" in many students' (and professors'!) minds…


Yeah, if a student is taught GitHub as part of their curriculum they're more likely to stick with it.

I can't really knock it, but at the same time, this has been Microsoft's strategy for a long time: students learn Office in school but none of its competitors, so they take it with them into their private and professional lives.

It's a difficult one though, given that both GitHub and Office are pretty standardized / ubiquitous everywhere.


I think this particular gripe has nothing to do with Microsoft as it predates the acquisition by quite some time.


On the other hand, git wouldn't be where it is now without GitHub. Together with the kernel adopting git, GitHub played a key role in git's success.


> Together with the Kernel adopting git,

Uhm, git was created by the people who were working on the kernel because they needed a better VCS.


Yes, and this was a big part of the success. If someone else had built it, they might not have used it, which would have made adoption much rougher.


Aha, I wouldn’t worry too much about it. 20 years ago CVS was the new hotness in my RCS using lab. 10 years ago it was SVN. We’re due for a seismic shift soon.


RegExr [0] does a great job of showing individual highlights even when the matches are in a sequential string. You could implement this instead of showing a callout with a note letting the reader know that the highlights apply to individual characters.

[0] https://regexr.com/
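A toy sketch of the idea (terminal-flavored, not RegExr's actual implementation): alternate between two styles so back-to-back matches on adjacent characters stay visually distinct.

```python
import re

# Toy sketch of RegExr-style highlighting: alternate between two marker
# styles so consecutive matches in a sequential string stay distinct.
# (Bracket markers stand in for the two background colors.)
def highlight(pattern: str, text: str) -> str:
    styles = ("[{}]", "({})")
    out, last = [], 0
    for i, m in enumerate(re.finditer(pattern, text)):
        out.append(text[last:m.start()])
        out.append(styles[i % 2].format(m.group()))
        last = m.end()
    out.append(text[last:])
    return "".join(out)

print(highlight(r"\d", "abc123"))  # abc[1](2)[3]
```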


One of my favorite things about htop is the set of projects modeled after it that focus on information other than system resources.

Cointop [0] is one of these projects that comes to mind.

[0] https://github.com/miguelmota/cointop


intel_gpu_top helped me solve a mysterious performance issue on a MacBook after countless hours of fruitless investigation. Overheating and throttling was an issue but even after I fixed it the system would lag hard - instantly when I used the external 4k display, and after a while on the internal 1440p screen. Turns out cool-retro-term was maxing out the integrated Intel GPU which caused the entire system to stutter and lag.

Unfortunately both the MBP and my current XPS 15 are unable to drive cool-retro-term on a 4k display with the CPU integrated graphics, and they both overheat and throttle if I use the nvidia graphics card :/

It's a really cool terminal though: https://github.com/Swordfish90/cool-retro-term


It's amazing that we think it's a good idea to pack powerful hardware into laptops that are too thin to actually make use of that hardware.


Laptops have very poor cooling. I have a Clevo laptop with a great processor but it will sometimes throttle itself to cool down. Great for small bursts of activity such as compilation but I don't understand how they could market these laptops as gaming machines. Running ffmpeg stabilizes the temperature at a healthy 96 degrees.


It's more amazing to me that this modern powerful hardware can't emulate technology from 1983 without overheating.


Modern powerful hardware has a hard time emulating a glass of water with good fidelity. Reproducing physical effects like ghosting is often harder than it looks.


There are a lot of different workloads that may not heat the GPU as much. Also, Windows might have better thermal management in the drivers.

CPU-wise, Intel defines TDP as the average heat dissipation, but the CPU can boost above it. From what I understand, though, they tell manufacturers to design to the TDP.


Most importantly, nvtop: https://github.com/Syllo/nvtop

"NVIDIA GPUs htop like monitoring tool"



Shameless plug: aria2p. I built an interactive interface, very similar to htop, for watching your aria2 download progress.

https://github.com/pawamoy/aria2p


Curious to see more examples


jnettop

Well, it has "top" in the name. ^__~ I would say that jnettop is more similar to nethogs than htop...


I know iotop exists, but I've never used it.


htop can do all (most?) of what iotop can.

Press S in htop and you can select to show i/o-related data, including number of bytes and number of operations in total and per second.

You will need to be root to look at most i/o related data.
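On Linux, the numbers behind those columns come from `/proc/<pid>/io`, which you can read directly (a sketch; as noted, reading other users' processes typically needs root):

```python
# Sketch: the per-process I/O counters that htop and iotop display are
# read from /proc/<pid>/io on Linux (root needed for other users'
# processes). Keys include rchar, wchar, read_bytes, write_bytes.
def read_proc_io(pid: int) -> dict:
    stats = {}
    with open(f"/proc/{pid}/io") as f:
        for line in f:
            key, _, value = line.partition(":")
            stats[key.strip()] = int(value)
    return stats
```

For your own process, `read_proc_io(os.getpid())` returns counters like `read_bytes` and `write_bytes` without needing root.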


Have any more examples?




Focusing on information does not require ncurses. Try:

elinks http://cmplot.com/accessible-index.html

(the other parts require subscription)


Why is everything in scientific notation? This is about as sensible as the concept of picodollars.


The mantissa changes more from day to day than the exponent does, allowing more information density for the relevant parts.


To me a browser for this seems like overkill, but I can understand the argument that "everyone already has a browser open", even if I don't think that it leads to good places.


It doesn't require, but damn, my life would be so bleak without ncurses.


The best part of this repository is that the authors and the commit dates reflect the true date of the event:

  James Ashley authored and JesseKPhillips committed on Dec 6, 1986

  8th Congress authored and JesseKPhillips committed on Jun 15, 1980


Honestly it seems like one of the worst parts: the dates are limited by UNIX timestamps and aren't flexible enough to store dates from centuries ago.


They do when you use signed 64-bit timestamps (or 128-bit ones if you need the nanos).


Apparently it's technically possible in Git as of a year or two ago, with some conversion bugs probably still lurking, but Github/Gitlab still don't support it. And the frontend tools like "git commit" don't support it. The project in this answer is using git hash-object to create the commit from raw bytes. https://stackoverflow.com/questions/21787872/is-it-possible-...
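A rough sketch of that hash-object trick (untested assumptions: a reasonably modern git, and `--literally` to skip validation of the negative timestamp; support for such dates elsewhere in the toolchain is spotty):

```shell
# Sketch: write a commit object with a pre-1970 (negative) timestamp by
# constructing the raw commit bytes ourselves, since `git commit` and
# GIT_AUTHOR_DATE reject them. --literally skips object validation.
git init -q backdate-demo && cd backdate-demo
tree=$(git mktree < /dev/null)   # hash of the empty tree
commit=$(printf 'tree %s\nauthor A U Thor <a@example.com> -441849600 +0000\ncommitter A U Thor <a@example.com> -441849600 +0000\n\nbackdated commit\n' "$tree" \
  | git hash-object -t commit -w --stdin --literally)
echo "$commit"
```

Whether `git log`, GitHub, or GitLab render such a commit's date sensibly is a separate question, per the Stack Overflow discussion linked above.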


Unfortunately git rejects dates before 1970, at least when I set them using GIT_AUTHOR_DATE


This is especially troublesome when you need to cover up those 50 years of procrastination on a project.


You say 'procrastination' I say 'The means IS the end, yo!'.


No, the dates are wrong, and the conversion used is stupid.


To take advantage of the ICO hype. Telegram could never have raised $1.7B through a different, more traditional medium.


I dunno, someone like Softbank would have happily thrown that their way.


It's unlikely that one Ponzi scheme will invest into another.


Without any equity or preferred shares? No.


As a non-designer, I am finding it easier and easier to get something professional-looking up. Couple this with Material-UI and a solid-looking template and you could have a beautiful, custom landing page up in a day.

There are obviously UX and other issues that I struggle to excel at, but from a simple, consumer-facing perspective, I see UI getting more and more manageable.


> As a non-designer, I am finding it easier and easier to get something professional looking up

The reason you find it getting easier is not to be found in any of these tools, but in your repeated effort to "put something up" and getting better at it.

You are learning how to design by doing design.


That's true, but guidance on how to get better at design is much more available than it used to be. I remember creating web sites 20 years ago and getting feedback that they looked dreadful. I had no idea why people didn't like them; I liked them! It turns out I like lots of things that most people don't and I needed guidance to find out what most people like.


20 years ago we were also much more limited in our options- 216-color web-safe palettes, minimal styling & layout, low-resolution displays, etc.


> That's true, but guidance on how to get better at design is much more available than it used to be.

Absolutely. It's a wonderful time to learn.


Maybe, maybe not.

My design sense remains abysmal. Being able to use something like Bootstrap to get a sensible look and feel is a big boon.


I'm in the same boat.

I struggle hard with the UX, but I'm amazed that with modern tools I can create a rudimentary, but clean design in Sketch, code it, and manage the infrastructure behind it.

Even 10 years ago, these were 3 to 5 completely distinct skillsets (designer, frontend, backend, infrastructure, DBA).


The divide between the roles you mentioned has gotten bigger over time, in my opinion. It's maybe easier nowadays to put together an okay-looking proof-of-concept app, but for large apps and high traffic, the stack has gotten more complicated to do right. You don't just roll out a server-rendered MVC app; you need a front-end with three layouts for different devices and state management more complex than on the back-end. Or maybe that's just my feeling.


UX is pretty much a logical process.


Can you explain how you get a solid looking template and what you do to customize it and where Material UI comes in?


Bulma is a gamechanger for me in this regard. I can get something professional looking up and running with literally no design experience.

There are obviously limitations to it, but for my use cases it's great.


I'm an API guy who uses Python and Hy, and I'm getting started with some front-end programming on my weekends to give some hobby projects a public face. It annoys me that every website template now expects you to install via npm. Bulma is all-out with that.

If you dig down, it's apparently because the Sass DSL is used to generate the CSS. But there are Sass compilers in C and Python, and probably in Prolog at this juncture.


Bulma looks fairly similar to Bootstrap v4 as far as visual output goes.

Any major reasons to use one or the other?


We looked into both when writing an open-source admin interface for NestJS. We went for bootstrap because:

- Nobody likes bootstrap, but everybody knows bootstrap (good for an open-source lib)

- Better browser compatibility

- Better accessibility

- Plenty of UI widgets built on top of bootstrap (we don't use them, but people using the lib could)

- About as big as each other


No solid reasons honestly, other than personal preference. Bulma just looks better out of the box to me.


Seconded. Bulma is the tool of choice for 99% of my projects. I have all the CSS classes almost memorized.


Same here; it makes it easy to impress your clients.

