The article skips over the results. Did the design succeed? Which hardware do miners use, and is it evenly distributed? Can I mine Monero on potato hardware?
Most miners use AMD Ryzens. Couldn't tell you the actual breakdown of CPU types in use. Apple's M series CPUs are quite efficient at it too. Bitmain now sells a "Monero RandomX Mining ASIC" which is just a bunch of RISC-V cores, seemingly based on Sophon SG2042 SOCs. There's nothing special or more cost-effective about their product.
You can mine on old smartphones quite easily. I use a bunch of old Android TV boxes myself. Their hashrates are nothing to crow about, but their hashes/watt are still competitive with faster CPUs.
There is a RandomX V2 that will be deployed soon. Its main improvement is even cheaper verification cost.
> [To lib authors] Nobody is obviously in charge in the way a fast-moving production team would mean "in charge," and that creates understandable hesitation around making breaking changes, even when experience has taught us better ways to design these systems.
> This is not a complaint about volunteer maintainers. It is simply one of the ambient risks of building serious systems on a smaller ecosystem.
And so instead of paying the lib authors who already have domain expertise and know their codebase, they chose to rewrite it from scratch/fork without contributing back. So classic.
Author here: I think you are projecting quite a bit. We do in fact hire a lot of people who maintain things, and even pay quite a lot for OSS development on things like the compiler and libraries we care about. But we still have business objectives to achieve, and sometimes it makes more sense to write things that better suit our needs.
Imagine making your product compliant across 100+ countries while regulations, labor laws, tax rules, insurance requirements, and data privacy laws keep changing.
Imagine integrating dozens of payment methods - many of them highly localized - across emerging and developed markets, while dealing with fraud, chargebacks, KYC, AML, and settlement complexities.
Imagine processing trillions of data points every day - rides, location updates, pricing signals, ETAs, traffic conditions, demand forecasts, payments, support events... storing it efficiently, querying it in near real time, generating reports, and keeping the whole pipeline reliable. I have worked in data engineering, and can tell you confidently that this alone requires an enormous R&D budget.
Then there are the apps - not just customer-facing, but driver-facing, courier-facing, merchant-facing, fleet-management, onboarding, support, operations, compliance, finance, and hundreds of internal tools and dashboards.
Then come the integrations. Companies running at Uber's scale generally have hundreds of these - mapping providers, payment processors, banks, identity verification, tax systems, telecoms, customer support platforms, fraud detection, analytics, ERP, CRM, and more.
... And then there are even more...
Real-time routing and dispatch optimization
Dynamic pricing and marketplace balancing
Fraud detection and account security
Driver/rider safety systems
ML models for ETA, demand forecasting, incentives, and churn prevention
Experimentation infrastructure for thousands of A/B tests
Reliability engineering across globally distributed systems
Data centers / cloud optimization at massive scale
Localization across languages, currencies, addresses, and cultural norms
Customer support automation at global scale
Autonomous vehicle research, mapping, and computer vision
... to be fair, this is all I could think of based on my own work experience in related fields... there are definitely at least as many more systems in reality as mentioned above.
> It’s probably the core reason developers choose GitHub as their main git forge. I get it. It does have its advantages of giving a better experience for reviewing a set of changes. Initially. But what if I told you there was a time when submitting email-based patches was the standard for version control?
The author explains well how you can bear with patches, but not why patches were chosen in the first place. What advantages do they have over PRs? I see none, and I won't waste my precious time working around a process inferior even to GitHub's already subpar PR workflow.
I tried email patches with another person myself. The only reason GH won here is that the git people made one fatal mistake: they forgot to include the tree hash and only show the commit hash in the email patch. But the commit hash is useless. With email patches, commits that people want to treat as "the same" and talk about end up with different hashes: the commit times differ, and there is not only the commit author but also the committer.
We stopped doing email patches because commit hashes became useless for communicating with each other.
GitHub made commit hashes "constant" in a way people care about.
For our purposes, tree hashes would have been much better in practice.
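The distinction can be shown with a throwaway repo (the file name and dates here are just illustrative): re-committing the same content with different metadata changes the commit hash, but the tree hash, which only depends on the content, stays the same.

```shell
# Throwaway demo repo: same content, different commit metadata.
cd "$(mktemp -d)"
git init -q demo && cd demo
git config user.name demo
git config user.email demo@example.com

echo "hello" > file.txt
git add file.txt
git commit -q -m "add file"
git show -s --format="commit %H%ntree   %T"

# Amend with a different author date: the content is unchanged,
# so the tree hash stays identical while the commit hash changes.
git commit -q --amend --no-edit --date="2024-06-01T00:00:00"
git show -s --format="commit %H%ntree   %T"
```

This is why two people applying the same email patch at different times get different commit hashes but the same tree hash for identical results.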
The git user interface is literally "git porcelain". It cuts you for no reason.
I think there is a strong argument that Gerrit is the current evolution of the patches workflow, many prefer it, and there are a lot of good blog posts explaining why.
I don't know what the justification for emailing patches around is though, that seems needlessly painful in the face of alternatives
This is really what LLMs ought to bring in terms of security: being able to break things faster, given that it is now easier for maintainers to fix them.
This has downsides of course, moving further into the "everything rots so fast these days" trope, but we live in an adversarial world where the threat is constantly evolving.
Tomorrow (today) the servers and repos won't be scanned by scripts anymore but by increasingly capable models with knowledge of more security issues than many researchers.
Management problem more than anything else, I feel.
Compilers should not have so much churn. You decide on a set of language features, stick to it, and implement. After that, it should only be bugfixes for the foreseeable future, until someone can make a solid case for that shiny new feature.