Same as with any other requirement: when you can afford it.
Slack's desktop client shouldn't require Electron (making it unusable on low-end devices), but it does. They can get away with that requirement, because there's hardly any alternative client worth your effort/attention.
I was sceptical about BitWarden (versus 1Password), because their UX was quite far behind - but they're slowly catching up, and I might have to reconsider my choice there.
If I wasn't "required" to run a dozen different proprietary/non-portable apps, I'd be using OpenBSD instead of macOS.
I do not "require" X to be free, but all other things being equal, it's my preferred choice. Unfortunately, a lot of the time we do not have the luxury of being picky.
Firmware makes it possible to make use of hardware you physically own.
Client software makes it possible to use services you pay for; operating systems make it possible to use applications.
It's difficult to come up with a more straightforward analogy. I don't even want to try cars; those are rapidly becoming a much bigger issue than firmware.
> They want to make their own lives easier rather than the customer's.
This right here is the crux of the matter and why I have no sympathy for the defenses of Electron. I don't care at all how easy it is for the company's developers to make something. I care that the software I get is high quality, and I'm not willing to sacrifice that so that the company can cut corners on their costs.
Discord is Electron/web-based. And tbh it shows that asking for native just for the sake of being native is weird. Discord has a better UX than most native clients.
Sometimes it feels like it makes sense to keep some software proprietary (e.g. the algorithms running in a Garmin watch have a lot of value, customers don't just buy the hardware).
Other times it feels like it really doesn't. I often give the example of Marshall smart speakers: the software seems to be very bad, it's never updated despite being connected to the internet, and the UX doesn't please me. But I bought it for the hardware in the first place: it looks nice, and the sound is nice.
In my opinion, when I buy hardware as a customer, I should have the right (and be given the tools) to interface with that hardware. So there should be some open source firmware that gives me reasonable access to the hardware I bought. In other words I should be able, legally and without impossible reverse engineering, to write my own system for hardware I buy.
If I managed to write software that competed with Garmin on their own watches, so be it. That's fair competition (and that's probably super hard to do). The fact that Marshall can push me towards throwing away my perfectly fine hardware because they abandoned the bad software they once pushed on it is not fine. And it encourages companies to invent subscription models where my perfectly fine hardware becomes unusable if I stop paying them even though I don't want or need their updates.
So somehow I think there is a middle ground that would help make products better and hardware more sustainable, and we are not there yet.
> If I managed to write software that competed with Garmin on their own watches, so be it. That's fair competition
I worked on a line of products that allowed user modifications of the software. We didn’t explicitly endorse it, but we also had instructions on the website for how to get into the hardware and we designed it so users could modify things and persist their changes.
It was fun to see what people did with the hardware, but the volume of support requests was also out of control after a while.
Companies wouldn’t actually care if you created alternate software for something like a watch if there was no downside to them. It would sell more hardware to people who wanted to customize.
The problem comes later, when many people casually modify their devices and then expect original customer service to get it back to the original state. You can try to place rules on customer support, but customers will lie through their teeth to customer support if they think it will get them what they want. We would get people demanding RMA replacements because their device “mysteriously” stopped working, then when engineering did random sampling of RMAs we would find that they had modified the software and didn’t know they could hard reset it.
It’s a real problem. HN imagines the average person who modifies hardware to be highly knowledgeable and self-sufficient. In the real world it’s usually someone following an outdated guide online, copying and pasting different commands until something works. Those same people are the first to try to return the product or request customer support when it doesn’t work perfectly.
Prusa (IIRC) had an interesting solution where you had to break off a physical piece to void your warranty in order to try different firmware. People were up in arms about it, but I think it was the smartest way to handle this conundrum.
> The problem comes later, when many people casually modify their devices and then expect original customer service to get it back to the original state.
So design the hardware to make that (i.e. factory reset) easy and have the customer try it before any RMA.
> Prusa (IIRC) had an interesting solution where you had to break off a physical piece to void your warranty in order to try different firmware. People were up in arms about it, but I think it was the smartest way to handle this conundrum.
Sort of. What the customer is reasonably going to want is to have a separate warranty for the hardware and the software and void the software warranty. If someone replaces their firmware to fix a software bug and then their screen independently develops bad pixels, the customer is rightfully going to be upset that the company is using it as an excuse to deny a warranty replacement of their faulty hardware.
This is also how you get customers lying to you about replacing the firmware. If you try to deny unrelated warranty claims over that they feel like they're being ripped off.
But if the alternatives are "warranty void if firmware replaced" and "closed firmware" the first one is clearly better.
This is very accurate. Most of the time the motivation behind restrictions and not releasing code is rooted in support costs. People can and will break things inadvertently, but they won't like that and will then go the route of requesting support, RMAs, etc. Additionally in some cases those people that developed random customizations will release them, and now other users will install or use them, and if an issue crops up, the support costs are now exponentially worse.
Is there any evidence that this actually happens? The percentage of customers who install custom firmware would be low to begin with, much less the percentage of those who then have problems with it and try to get the company to support it instead of the community that developed it. The idea that this type of support request is going to dominate support costs seems pretty farfetched.
This is the same sort of middle-ground I increasingly find myself advocating for, with one caveat: once something is no longer being manufactured (hardware) or sold and supported (software), it should be entirely open sourced by law. After all, why shouldn’t (as a random example) the iPhone 4 be FOSS hardware and software in 2024? It’s perfectly usable if a bit sluggish by modern standards, could reduce eWaste (albeit by a small amount), and gives customers control over what they purchase. There’s also the additional disincentive to companies in the yearly release cadence or ending support too early just to drive sales.
I look back at the mounds of perfectly functional hardware amassed over my life, hobbled or bricked solely by software a vendor refuses to fix or otherwise allow end users to repair themselves. A Wii U with bad flash, an Xbox with a corrupted bootloader, a smart TV with a bad OS that crashes the display, a universal remote no longer supported by its manufacturer with new codes or firmware. All of these should’ve been unshackled the moment their manufacturers abandoned them, so that their owners can fix and support them indefinitely if they so choose.
> once something is no longer being manufactured (hardware) or sold and supported (software), it should be entirely open sourced by law.
There are a lot of practical barriers to this being that simple. A couple of common examples:
1. Third-party dependencies are a thing. Many pieces of hardware or software contain pieces of other hardware and software from other organizations, and the end vendor may frequently not even have the right themselves to open source them.
2. Many products are abandoned when businesses fail, in which case, there may be no staff to open-source them.
> Third-party dependencies are a thing. Many pieces of hardware or software contain pieces of other hardware and software from other organizations, and the end vendor may frequently not even have the right themselves to open source them.
That wouldn't be an issue if it was a legal requirement because then they would all be subject to the same rule or be unable to market their software because it doesn't satisfy the requirement.
> Many products are abandoned when businesses fail, in which case, there may be no staff to open-source them.
Which is why they should be open source to begin with. But what excuse does this provide to the companies that still exist?
> That wouldn't be an issue if it was a legal requirement because then they would all be subject to the same rule
Just because a product is discontinued doesn't mean every component inside of it is also discontinued. Nor does it mean the restrictions on those are based in copyright; they may be restricted by legal mechanisms other than copyright.
> or be unable to market their software because it doesn't satisfy the requirement.
> Which is why they should be open source to begin with.
If you want to propose making closed source hardware or software illegal, just propose that.
What you're effectively arguing for here, is that people own the rights to everything in their entire development stack, which is pretty unreasonable for any modern commercial software/hardware development. Even many prominent and celebrated open source hardware projects would not comply with this. For example: every open-source phone project. Not a single one could even possibly comply with this law because open-source baseband firmware does not exist, and cannot exist due to spectrum licensing requirements.
> Just because a product is discontinued doesn't mean every component inside of it is also discontinued.
You're assuming the code doesn't have to be released as long as there exists something using it that isn't discontinued, but it can be the other way. As soon as you discontinue something, everything in it has to be released.
> Nor does it mean the restrictions on those are based in copyright. They may be restricted by other legal mechanisms other than copyright.
The proposal is a consumer protection law that requires source code to be published when a product goes out of support. If the law requires it to be released, you release it or you're in violation of the law.
> If you want to propose making closed source hardware or software illegal, just propose that.
> You're assuming the code doesn't have to be released as long as there exists something using it that isn't discontinued, but it can be the other way. As soon as you discontinue something, everything in it has to be released.
No, I'm interpreting this the way you're suggesting. And I'm saying it's insane. If company A sells a component to company B, and uses it in their product, and company B stops supporting it or goes out of business, now it needs to be open sourced? Every vendor has customers that have gone out of business or discontinued a product, so this is effectively a ban of all B2B closed source hardware or software components.
> If the law requires it to be released, you release it or you're in violation of the law.
So, to comply with your proposed law, companies can no longer sign NDAs with their vendors. They can no longer license patented hardware or software. They can no longer create products which have components with source restricted by trade law or product regulatory law. They can no longer license any closed-source hardware or software from any vendor. Do you realize this is a ban on approximately all commercially available computers, and a ban on many types of radio devices, including cellphones?
That's a much more reasonable argument, although it is unprecedented and unfounded. Copyright doesn't require any other authors of other works to release any foundational sources, and it never has. Authors don't have to release drafts or outlines of their books. Musicians don't have to release the constituent tracks of their final recordings. Artists and photographers don't have to release their PSD files. Patent protection works this way, but copyright never has.
> Every vendor has customers that have gone out of business or discontinued a product, so this is effectively a ban of all B2B closed source hardware or software components.
Not exactly. If you don't have to release it until support ends then you could require the licensee to provide support for a specific number of years and carry bankruptcy insurance that would pay for support to continue for that period of time, if you wanted to.
It would tend to cause the source code to be released after ten or twenty years, but what are you really protecting at that point? The source code for Windows XP and 3G cellular radios?
> Do you realize this is a ban on approximately all commercially available computers, and a ban on many types of radio devices, including cellphones?
It seems pretty likely that the vendors would choose to license the code under terms that would allow the source code to be published in that circumstance, since otherwise they have no customers.
> Copyright doesn't require any other authors of other works to release any foundational sources, and it never has.
Other media is distributed in a human-readable form rather than compiled into an opaque binary. If you want to make a derivative of Alice in Wonderland you can easily do it from a copy of the book as published without needing any of Lewis Carroll's notes. If you want to fix a firmware bug, doing it without the source code is commonly infeasible.
> bankruptcy insurance that would pay for support to continue
1. That isn't a thing
2. If your company doesn't exist, it doesn't really matter if you comply with the law anymore.
3. Insurance writes checks, it doesn't conduct business operations
> It seems pretty likely that the vendors would choose to license the code under terms that would allow the source code to be published in that circumstance, since otherwise they have no customers.
Except for all of the hardware and software that already exists, all of the patents that already exist, all of the different laws for hardware and software in every other country where closed-source software/hardware exists, and all of the laws here and everywhere else that restrict hardware or software for other reasons. I think it is much more likely that companies will comply with all of the rest of the operational and legal obligations they currently have, and instead, avoid legal nexus in your jurisdiction.
> 3G cellular radios?
It's not legal to open source a baseband radio in many countries because of the licensing requirements to use the spectrum. If you make this a requirement, and companies want to sell cellphones in other countries (they do) you have banned the manufacture of these devices in your country.
> It seems pretty likely that the vendors would choose to license the code under terms that would allow the source code to be published in that circumstance, since otherwise they have no customers.
It's more likely they will conduct operations outside of your jurisdiction, and continue their business as they currently do.
> Other media is distributed in a human-readable form rather than compiled into an opaque binary.
You don't "read" a piece of software the same way you don't "read" a song on the radio. Software runs on a computer, and songs play on a radio.
But even if we do interpret this your way -- copyright doesn't require anything at all to be published. You can create something, put it in a locked safe, and copyright still applies. You can't create a derivative work of my nudes on my phone, because I won't give them to you. They're still protected by copyright, though.
Your presumption that it is predicated on sharing is simply not factually correct. Copyright is and has only ever been about protecting creators' rights. Only patent law is predicated on sharing.
> It's not legal to open source a baseband radio in many countries because of the licensing requirements to use the spectrum.
If anything, it may not be legal to operate modified firmware/hardware on public networks. It's perfectly fine to open source this stuff, and you can operate it in a lab given that you don't transmit anything outside of it.
But the certification of baseband radios in multiple countries requires the devices to be restricted to operating within the parameters permitted under the regulatory requirements for the use of spectrum in those respective countries. These parameters are enforced in the firmware on the device. The manufacturer has to ensure that the device complies in order to obtain certification for use in a consumer product, and they do this by locking the firmware.
If it's unlock-able, it may be perfectly legal as an uncertified device for use in a lab environment, but it isn't as a consumer device.
> 2. If your company doesn't exist, it doesn't really matter if you comply with the law anymore.
> 3. Insurance writes checks, it doesn't conduct business operations
Insurance writes checks that you can contract to have fund whatever you want. You can make someone agree to carry insurance against bankruptcy so that if it happens the insurance pays out, and then require the payout to go into a trust whose purpose is to continue to support the hardware for the specified period of time, if that's what you want to do.
> It's not legal to open source a baseband radio in many countries because of the licensing requirements to use the spectrum. If you make this a requirement, and companies want to sell cellphones in other countries (they do) you have banned the manufacture of these devices in your country.
This can't realistically be the actual requirement. So if the source code to your phone firmware leaks on the internet it becomes illegal to sell the phone in these jurisdictions? What does it mean to be released? If you change two lines of code for the phones sold into that market, is that fine because the now-distinct code is unpublished, or is any phone with a Linux kernel on it illegal there -- or the Windows kernel which has BSD-licensed code in it -- because at least part of the code is public?
Any laws even resembling that should be repealed anyway.
> It's more likely they will conduct operations outside of your jurisdiction, and continue their business as they currently do.
The relevant jurisdiction is the one where the products are sold. If this is a large jurisdiction like the US or the EU, the premise that nobody is going to sell computers or phones there anymore seems highly implausible.
> You don't "read" a piece of software the same way you don't "read" a song on the radio. Software runs on a computer, and songs play on a radio.
If you want to create a derivative work of a piece of music, having the recording will be all a musician needs, whereas having only object code is going to frustrate a programmer.
> You can't create a derivative work of my nudes on my phone, because I won't give them to you.
But then you also can't (or have no opportunity to) exercise any of the rights copyright imparts. If your stuff never leaves your phone then you can't sue someone for infringing your copyright because they have nothing from which to make an illicit copy or derivative work before the copyright expires.
That doesn't mean you have to be the one to publish it, but without it somehow getting out, there is nothing to enforce. And obviously in the common case copyright is being used for published works -- the date of first publication is even used to determine the copyright term in the common case. For corporations it's 95 years from first publication.
> Copyright is and has only ever been about protecting creators rights. Only patent law is predicated on sharing.
In the US they both come from the same clause in the constitution:
> [Congress shall have power] To promote the Progress of Science and useful Arts, by securing for limited Times to Authors and Inventors the exclusive Right to their respective Writings and Discoveries.
The goal in both cases is the same (promote the Progress), only the thing protected differs (writings vs. discoveries), and in both cases it's for limited times, implying that enriching the public domain is the mandated result.
> You can make someone agree to carry insurance against bankruptcy
If you can find someone to write it, because these aren't a product that is generally available.
> require the payout to go into a trust whose purpose is to continue to support the hardware for the specified period of time, if that's what you want to do.
Nobody would do this, because funding a requirement that wouldn't ever be enforceable is akin to burning money. You can't penalize a company that doesn't exist.
> This can't realistically be the actual requirement.
It absolutely is, and it's why precisely zero phones (even 'open source' phones) have an open baseband.
> So if the source code to your phone firmware leaks on the internet it becomes illegal to sell the phone in these jurisdictions?
No. Regulations require companies to ensure they design their products to operate within regulatory requirements. Leaks, by definition, are not intentional.
> Any laws even resembling that should be repealed anyway.
No, they really shouldn't. There are good reasons that mass market devices are designed not to interfere with other users of the spectrum.
> The relevant jurisdiction is the one where the products are sold.
Are you suggesting that parties other than OEMs are subject to this law too??? Good lord -- do you want the corner gas station to hire engineers to ensure the microcontroller firmware on the phone chargers they sell is properly open sourced? Basically all other product regulatory law affects the company that makes the product, for good reason.
> If you want to create a derivative work of a piece of music, having the recording will be all a musician needs
That very much depends entirely on what they're trying to do.
> But then you also can't (or have no opportunity to) exercise any of the rights copyright imparts.
The rights legally exist regardless, and the reason I used this example is because it is a very solid way to go after people who steal private works.
> And obviously in the common case copyright is being used for published works -- the date of first publication is even used to determine the copyright term in the common case. For corporations it's 95 years from first publication.
And there are other various durations which apply when the work is not published. Publishing is not a requirement.
> In the US they both come from the same clause in the constitution:
Yep, that's the clause that enumerated the power of the federal government to regulate those things. Which the federal government subsequently, as that clause granted them the power to do, regulated them as two entirely separate categories of intellectual property.
> If you can find someone to write it, because these aren't a product that is generally available.
Trade credit insurance etc. are generally available. The general concept is that you want to insure against your counterparty being unable to fulfill their obligations, which is exactly the concern here, and nothing stops you from requiring them to pay for that and then naming a trust as the beneficiary that would use the money to uphold their obligation to continue to support the device, if you really want to do that.
> Nobody would do this, because funding a requirement that wouldn't ever be enforceable is akin to burning money. You can't penalize a company that doesn't exist.
The money comes from the insurance that pays out because they defaulted. The insurance company still exists and is required to pay the claim.
> It absolutely is, and it's why precisely zero phones (even 'open source' phones) have an open baseband.
This doesn't make any sense. Can you cite the specific statute or regulation that requires this?
What secret is even being protected here? Surely the method of operation for baseband modems is not classified information.
> No. Regulations require companies to ensure they design their products to operate within regulatory requirements.
The devices presumably do operate within regulatory requirements as manufactured. It's ridiculous to expect the manufacturer to prevent the customer from modifying the device. If someone modifies the device it's on them. How is the OEM supposed to stop someone from splicing a signal amplifier into the antenna or having a device from a different regulatory domain drop shipped or similar?
> Leaks, by definition, are not intentional.
Once the code is leaked, intentionally or otherwise, you're now intentionally selling a device with publicly available code. Why should the response to that be any different than selling a device for which the code was intentionally published to comply with the laws of another jurisdiction?
> No, they really shouldn't. There are good reasons that mass market devices are designed not to interfere with other users of the spectrum.
They would still be designed to do that. The difference is that you put the obligation for prosecuting users who modify devices to violate those regulations where it belongs, on law enforcement, instead of compromising security, freedom and transparency by trying to put it on device OEMs.
> Are you suggesting that parties other than OEMs are subject to this law too??? Good lord -- do you want the corner gas station to hire engineers to ensure the microcontroller firmware on the phone chargers they sell is properly open sourced? Basically all other product regulatory law affects the company that makes the product, for good reason.
What do you think currently happens if you go to Walmart and buy a product manufactured in China with faulty wiring and it starts a fire? Or a foreign-made wireless access point that exceeds FCC regulatory limits on transmit power out of the box?
> The rights legally exist regardless, and the reason I used this example is because it is a very solid way to go after people who steal private works.
It's basically a hack the courts used because there wasn't a more convenient existing law to go after people who do that but those people are unsympathetic. It's ancillary to the core purpose of copyright laws.
> And there are other various durations which apply when the work is not published. Publishing is not a requirement.
The other durations are longer (e.g. 120 years from creation vs. 95 years from publication), so for the vast majority of works, which are promptly published, it's the publication date that causes expiration. The other date exists only to ensure that the work can't be withheld from the public domain indefinitely by not publishing it, which is back to the purpose of copyright being to enrich the public domain.
> Yep, that's the clause that enumerated the power of the federal government to regulate those things. Which the federal government subsequently, as that clause granted them the power to do, regulated them as two entirely separate categories of intellectual property.
The constitution itself implies that they're two different things, but the point is that they have the same ultimate purpose and denying the public the source code frustrates that purpose, so it's entirely reasonable to require it in exchange for the benefits of copyright protection.
> The general concept is that you want to insure against your counterparty being unable to fulfill their obligations, which is exactly the concern here
So vendors are going to have to take insurance policies out on each of their customers?
> The money comes from the insurance that pays out because they defaulted. The insurance company still exists and is required to pay the claim.
Pay who? Money is not source code. Money can't hire people to work at a company that doesn't exist.
> This doesn't make any sense. Can you cite the specific statute or regulation that requires this?
47 C.F.R. § 15.5, Prevention of Harmful Interference. Manufacturers must design devices in a way that reasonably prevents their users from operating the device outside of allowable criteria.
And under 47 U.S.C. § 302a and 47 C.F.R. § 2.803, the devices will be tested to ensure that they can't be changed to operate outside of that criteria. If the software allows the testers to modify them to do so, they will fail testing.
> What secret is even being protected here? Surely the method of operation for baseband modems is not classified information.
Nothing. It has nothing to do with keeping secrets -- it is about preventing the user from changing the way it operates.
> Once the code is leaked, intentionally or otherwise, you're now intentionally selling a device with publicly available code. Why should the response to that be any different than selling a device for which the code was intentionally published to comply with the laws of another jurisdiction?
1. Leaks are never part of design intent, there's no transitive property of intent.
2. Devices are certified and tested before they're released, so the timelines are unlikely to line up.
3. I'm not sure what the FCC would do if this did happen (I'm not sure it ever has), but it would be interesting to know. I'd bet it would depend on the severity of the problem.
> What do you think currently happens if you go to Walmart and buy a product manufactured in China with faulty wiring and it starts a fire? Or a foreign-made wireless access point that exceeds FCC regulatory limits on transmit power out of the box?
They might be liable for a recall after the problem is identified, or liable if they knowingly sold products that weren't compliant -- but the actual product design regulations, and UL/ETL/FCC testing requirements all lie on the original manufacturer.
> So vendors are going to have to take insurance policies out on each of their customers?
They don't have to, but they can in any case they want to do business with a company they fear is at significant risk of insolvency.
> Pay who? Money is not source code. Money can't hire people to work at a company that doesn't exist.
Money can pay the people who used to work at the company, or some entirely different people, to continue to support the hardware for a defined period of time even if the company fails to secure any new customers or revenue. Contracts or escrow ahead of time ensures that those people have access to what they need to do it.
> 47 C.F.R. § 15.5, Prevention of Harmful Interference. Manufacturers must design devices in a way that reasonably prevent their users from operating the device outside of allowable criteria.
It appears to say you're not allowed to operate a device that causes harmful interference. I don't see anything in it that says manufacturers are required to prevent the user from modifying the device.
47 U.S.C. § 302a is the generic grant of authority by Congress to the FCC to make this category of rules, not a specific rule.
47 CFR § 2.803 is a multi-part section but I don't see anything in it about manufacturers having an obligation to prevent modification, and it references subpart J which contains 47 CFR § 2.909 "Responsible Party" and seems to imply the opposite, i.e. that if an independent third party modifies the device they're taking the compliance burden onto themselves. What am I missing?
> Devices are certified and tested before they're released, so the timelines are unlikely to line up.
Wouldn't that imply the source code could be released by the manufacturer or someone else after the certification to the same effect?
> I'm not sure what the FCC would do if this did happen, I'm not sure it ever has, but would be interesting to know. I'd bet it would depend on the severity of the problem.
There have been WiFi chips with open source firmware, e.g. Atheros pre-Qualcomm. I'm not aware of any current ones but also not aware of any serious problems occurring when those chips were current. What is someone even supposed to do that they couldn't still do with hardware modifications or SDR anyway?
They have to open source the code which is so old that the laptop it originally came with is discontinued. Yes, that makes sense. Things shouldn't be discontinued four hours after they're released and if you want to do that there can be a cost to it.
Maybe there should be a cost, but you've picked an insane cost to ascribe here.
If some fly-by-night PC integrator sells a PC today, then goes out of business tomorrow, no, Microsoft absolutely shouldn't be required to open source their operating system.
Many thousands of businesses close their doors every day, and when they do, punishing their vendor is silly, illogical, and wrong.
> If some fly-by-night PC integrator sells a PC today, then goes out of business tomorrow, no, Microsoft absolutely shouldn't be required to open source their operating system.
There is another reasonable solution for Microsoft here. The premise is that the device either has to be supported or open sourced (and therefore supportable by the community). If the OEM is gone the firmware has to be open sourced because the company is no longer supporting it.
But my point earlier was that you shouldn't get out of having to release the source for something just because that component is still supported on other devices. It has to be still supported on this device. But as long as a current version of Windows is still supported on the device by Microsoft, that implies that once the firmware source is released the entire device is still supported.
So then all Microsoft would have to do is make sure they're licensing Windows to run on hardware they're willing to have Windows continue to support for as long as they want to keep the source code unpublished. Which is already their longstanding practice, Windows 11 TPM kerfuffle notwithstanding. Currently supported versions of Windows support hardware going back more than 20 years.
> There is another reasonable solution for Microsoft here. The premise is that the device either has to be supported or open sourced (and therefore supportable by the community). If the OEM is gone the firmware has to be open sourced because the company is no longer supporting it.
Alright, lets say we do that. And let's say it's my PC OEM, and I'm completing my chapter 7 filings today, and I am going to ignore your law to open source my firmware. What are you going to do about it?
Why would it be different than ignoring any other law? The plaintiffs get an injunction and if the defendants ignore it they're held in contempt or get whatever criminal penalties are imposed for violating the law.
If the concern is that the source code is going to be lost in dying companies, some people have suggested escrow requirements, e.g. you file the source code with the copyright office on the release date, not to be published until you discontinue support. You might conceivably file gibberish and not be discovered until the publication date but then we're back to personal liability for willful criminal acts, e.g. falsifying official documents.
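The escrow idea above is easy to make tamper-evident. Below is a minimal sketch; everything in it (the registrar, the receipt format) is hypothetical, and note that a content hash only catches substitution *after* filing -- catching gibberish filed on day one would still require the registrar to spot-check builds:

```python
import hashlib
import io
import tarfile

def escrow_receipt(source_files: dict[str, bytes]) -> dict:
    """Hypothetical sketch: at filing time, the registrar records a
    content hash of the escrowed source archive, so the archive that is
    eventually published can be checked against what was filed."""
    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w") as tar:
        for name in sorted(source_files):  # deterministic member order
            data = source_files[name]
            info = tarfile.TarInfo(name)
            info.size = len(data)
            info.mtime = 0                 # normalize timestamps
            tar.addfile(info, io.BytesIO(data))
    return {"sha256": hashlib.sha256(buf.getvalue()).hexdigest()}

# At publication time the same function is run over the archive being
# released; a mismatch means the filing was altered after the fact.
receipt = escrow_receipt({"main.c": b"int main(void){return 0;}\n"})
print(receipt["sha256"])
```

The sorted member order and zeroed timestamps matter: without them the archive bytes (and thus the hash) would vary between runs even for identical source.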
As a product regulatory issue, you're right, it probably wouldn't be any different than other product regulatory law. They're mainly civil (especially in terms of product warranty). In a chapter 7 case, civil penalties typically don't even get priority over creditors. But who cares?! The company is gone anyway.
> If the concern is that the source code is going to be lost in dying companies, some people have suggested escrow requirements, e.g. you file the source code with the copyright office on the release date, not to be published until you discontinue support. You might conceivably file gibberish and not be discovered until the publication date but then we're back to personal liability for willful criminal acts, e.g. falsifying official documents.
Even if you do convince Congress to approve the new billion dollar data center they'll need, I'm not sure the entire world really wants to put this in their build pipeline. It's an interesting thought experiment. I don't really think the public cares about source code as much as you do. And really, if they did, this wouldn't be a discussion.
> As a product regulatory issue, you're right, it probably wouldn't be any different than other product regulatory law. They're mainly civil (especially in terms of product warranty). In a chapter 7 case, civil penalties typically don't even get priority over creditors. But who cares?! The company is gone anyway.
Even if the company is bankrupt, that doesn't mean they don't still have the source code. In general it would be on their computers which in bankruptcy would end up in the possession of the bankruptcy trustee, who would be ordered by the court to release it in accordance with the law. If instead someone walked off with it or something like that, now the bankrupt company isn't the issue and you're dealing with some kind of theft case.
> Even if you do convince Congress to approve the new billion dollar data center they'll need, I'm not sure the entire world really wants to put this in their build pipeline.
Source code generally isn't that big compared to other forms of media, compresses well and diffs well against variants. People regularly download the entire git history of a project by accident when they only want the latest version and don't even notice. Billion dollar data center seems like a stretch.
> I don't really think the public cares about source code as much as you do.
It's one of those things where people don't care about something because they don't understand it, not because it doesn't affect them.
People care about things like "I have to buy a new phone for no good reason because the old one stopped getting updates" but if you ask them how that relates to open source they don't know the answer.
> Even if the company is bankrupt, that doesn't mean they don't still have the source code. In general it would be on their computers
It often is not. Most organizations use source control these days, and SaaS or otherwise third-party hosted offerings are very popular; those may disappear as soon as the bills stop being paid. Most endpoint disks are encrypted by default or by regulatory requirement, so the source code on them isn't accessible without employees present. And the source code of dependencies may be on a computer at a different company, accessible only with the credentials of employees who were laid off last week.
> It's one of those things where people don't care about something because they don't understand it, not because it doesn't affect them.
> People care about things like "I have to buy a new phone for no good reason because the old one stopped getting updates" but if you ask them how that relates to open source they don't know the answer.
No, source availability only affects developers and developer-adjacent people. And that's actually a great example. Nobody's auntie is installing LineageOS on their Pixel 1. It doesn't matter that the bootloader is unlocked. It doesn't matter if the OS on it has security vulnerabilities. Only nerds like the people on this forum care about this stuff. Everyone else either doesn't care and uses the old phone as is, or they care enough to go to the store and buy a new phone.
> Most organizations use source control these days, and SaaS or otherwise third-party hosted offerings are very popular, and those may disappear as soon as bills stop being paid.
Version control systems typically store the code on the server in addition to the local machine, so you only need one. Also, deleting all the source code instead of turning it over to the bankruptcy trustees is typically a big no-no, because it could have value to creditors.
> And most endpoint disks are encrypted by default or by regulatory environment, so the source code on them isn't accessible without any employees present.
Most enterprises use a central management system that can access any endpoint device registered with the management system.
But in general you're just making the argument for requiring source code to be put in escrow.
> Nobody's auntie is installing LineageOS on their Pixel 1. It doesn't matter that the bootloader is unlocked.
We can see exactly the opposite happening on systems where this isn't the case. Ordinary people install Windows 10 on machines that came with Windows 7 etc., long after the hardware OEM declares them out of support. The OS vendor, rather than the hardware vendor, continues to support the device, once there isn't proprietary firmware/drivers locked to a specific kernel version.
You can't start production before work on the product has finished. A lot of e-waste gadgets are therefore effectively discontinued before launch day. They don't ramp up and down like cars; they source parts, make a few batches, and move on, like how you make cookies.
IMO the ideal end state to this problem cannot happen without massive debloating and standardization of software and hardware. It's just not realistic when everyone's writing C-like code in a sandbox atop a dozen layers of C-like environments.
The end date isn't when you stop manufacturing the product, it's when you stop supporting it, e.g. when it has a known security vulnerability and you've gone an unreasonable length of time without releasing a patch.
The original developer team, source code, and build instructions would all be gone by then. Otherwise you should be able to issue a patch. I suppose you could codify a code-archival and code-release enforcement framework in the penal codes -- e.g. making it a felony with long sentences to use components with non-escrowed firmware, or to lose access to the code -- but that's horrible.
The source code gets released at the point when you're discontinuing support. At that point you still have it, and the team, you're just not planning to continue supporting it tomorrow, so you release it now.
If tomorrow comes and for the next vulnerability you can neither patch it nor release the code because you don't have it anymore, now you're in trouble, because you have an obligation to do one or the other. But whose fault is that? You had it a month ago and were supposed to release it before you delete it. Imposing penalties for that kind of pointlessly destructive behavior is hardly unreasonable.
The ordinary penalties for violating this type of requirement in other contexts. That's typically a fine for negligence and more serious penalties for willful violations.
My initial counterargument was going to bring up that Garmin example GP added, since largely the same valuable code is likely shared across many devices.
Reflecting further though, the harm of a manufacturer crippling a physical device does seem worth guarding against regardless, and if the code is really that valuable then such a law might encourage longer support lifetimes (or, at the very least, not crippling a perfectly good piece of hardware).
I think there's a reasonable middle ground here. I would think it's fine for them to keep their algorithms closed-source, but be required to opensource the minimum amount of firmware to allow access to all of the hardware: bootloader, CPU, storage, memory, display, and sensors.
No need to reveal any trade secrets at all. The above stuff is not the secret sauce that makes people buy their watches anyway: they use some of the most basic hardware in the market at their price point.
This is exactly what I meant. I don't want Garmin to give me access to their algorithms. I want to have a way to flash a minimal system on it if I want to play with it.
For something like Marshall smart speakers, it may end up making those products actually good: I could totally imagine an open source system for smart speakers. And that would be infinitely better than the crap Marshall put on the one I own.
It works well for routers: on many routers you can install openwrt or opnsense.
> Sometimes it feels like it makes sense to keep some software proprietary (e.g. the algorithms running in a Garmin watch have a lot of value, customers don't just buy the hardware).
The trouble with this is that it doesn't really work. A normal user of the watch is probably not going to reverse engineer the binary but a competitor will, so concealing the source code is hurting the customers more than the competition.
Not only that, it hurts you more than it hurts competitors. If you publish the source code then customers make improvements to it and fix bugs. But it's still firmware, i.e. specific to the device, so you get the benefit of that rather than competitors. Customers contribute fixes so you need fewer developers yourself or your developers can spend more time adding features that make sales.
Any larger competitor would surely discourage disassembling a competitor's code, because mere exposure to it might open you up to lawsuits if you develop similar features (and you can't trust that the fact you did it won't leak out).
Isn't this the argument for releasing the source code? Looking at the code would be as bad as disassembling it.
You can release source code under a license that gives customers the right to modify it and distribute patches to your other customers who have the license that came with their hardware without giving competitors the right to ship it on their own hardware without paying you.
Sure, and some companies strongly discourage looking over other open source code for exactly that reason.
But most companies don't release their source code for a different reason: they honestly believe that this is where their value is, or they believe the support costs would outweigh any benefits.
For some, it's true, but for most, it's utterly false (or there is a tiny little bit of code somewhere that is valuable, with all the rest being boring code that nobody else could put to use).
"Some companies do this because they're dumb and wrong" is a plausible explanation for why they aren't doing it, but then they should just release it. The future can be better than the past.
One approach might be to see if the software can be replaced with an open-source alternative that allows an equivalent level of access to the hardware functionality, with official support for performing such a replacement (though not necessarily for the replacement software itself). If so, then the manufacturer need not provide the source code. Whereas if there is no supported way of replacing the software with an open-source alternative, then compel the manufacturer to make the software available.
Under this system, Garmin could keep their software closed-source as long as they provide a way to replace the firmware and software with an open-source alternative that can use all of the hardware features, while not necessarily having the software features.
Also I don't really care about having the source code of the firmware if I have an API to access it. If I own Qualcomm hardware, I should not need a damn NDA to flash something on it.
Manufacturers should not be mandated to spend extra money to support our hacking, but spending money/effort/lawyers to prevent that hacking could be prohibited.
I think software and firmware should be free if critical functionality depends on an Internet-based service.
Or, it should at least be free whenever that service gets discontinued.
I've been thinking that perhaps, to be able to market a product where critical functionality depends on a commercially operated connected service, a company should be required by law to provide their source code to a government agency that holds it in escrow.
Then when the company goes out of business or just decides to discontinue the product then that agency will publish the source code under an open source license.
This shouldn't just apply to firmware in connected devices, but to all commercial connected software. There are countless games that can't be played any more only because a company behind it has discontinued its DRM servers.
Of course, there are details that would need to be worked out in practice. The publisher would have to prove to some degree that the provided source code is sufficient. You'd have to prevent the publisher from circumventing the solution in a future software update. And the publisher would have to pay a fee for the privilege, to fund the agency.
> a company should be required by law to provide their source code to a government agency that holds it in escrow
The reality is that a lot of code can be licensed from third parties, and there's no moral principle by which it would be right to expose and open-source their code.
So I'm really not sure how workable or useful this would actually be.
When the end of Spotify Car Thing was announced this is what I expected. I thought these perfectly good devices had been built on license encumbered libraries that would prohibit open sourcing the code, and that all those hardware devices would go to waste. Thankfully that was not the case, but most of the time it is. It's amazing how much licensing controls how software is used, and thus how much systems and software designers should consider the license of the libraries and frameworks they choose.
This objection only applies to past products. If the law in a big enough market required that sort of code escrow and eventual release, companies wouldn't build future products covered by that law on top of code they couldn't release.
You'd have to have some practical "ramp up" times, but anything that has been written can be written again.
It could eventually result in the third-party libraries being fully replaced (good value) or them adapting so that they update their libraries yearly - sure you can get the 3 year old one open-source, but do you want to?
id ran into this with the sound code for Doom, and Carmack regrets it (though I'm not sure they had many other practical options, they could have at least contracted for a distribution license which wouldn't need to be open source).
> The reality is that a lot of code can be licensed from third parties, and there's no moral principle by which it would be right to expose and open-source their code.
This doesn't mean that they would have to expose somebody else's code, they would just have to use something else (or write their own)
Identifying all of the owners of licensed code is a difficult problem all by itself. It's not always clear that third-party code is even used in a product, as any identification or licensing information is lost at compile time. Some licensed code is only delivered as pre-compiled libraries, so a licensee couldn't even comply with a law requiring all source code to be submitted for escrow.
That's part of the issue: it is not illegal for the licensee to sell compiled binaries of such software, but it is illegal for them to redistribute the licensed source/headers/documentation. They can't turn over the source because they don't own it, merely license it. Quite a bit of closed-source software uses closed-source licensed libraries with restrictive redistribution terms. Untold numbers of embedded devices use closed-source licensed dev kits the licensee has no power to redistribute themselves.
> That's part of the issue: it is not illegal for the licensee to sell compiled binaries of such software.
It would be under the proposed law, which is basically the point of the proposed law. Closed software is only closed because the (regulated) market enforces its being closed, even if it's contrary to the interests of everyone but the producer of said software. The point of said regulation would be to give a (regulated) market incentive to provide actually free software. Hardware manufacturers should not rely on software moats to stay afloat!
And of course with a "free market", people would be buying bricks in boxes thinking they're buying routers or cameras.
How does that apply when you're releasing the source code in the jurisdiction where the law requires it? If they sue you there, the law there says they lose. If you're also in the other jurisdiction your subsidiary there doesn't have to release it so they don't. This is like, what happens when a work is in the public domain in one jurisdiction but not another? The answer is different things happen in different places.
Also, obviously companies would then just not license software under terms that would cause legal trouble for them.
>How does that apply when you're releasing the source code in the jurisdiction where the law requires it?
That would depend on what exactly the requirement is and what exactly the third-party code is and is used for. Without the specifics, I'd say it seems plausible that the third party itself is not subject to the requirement but the firmware using the third-party code may be subject for both first- and third-party code.
But then why is that a problem? The entity in the jurisdiction releases the code for the device they sell in the jurisdiction, whether they wrote it or licensed it, and doesn't license it under terms inconsistent with their legal obligation.
Either they follow the law or it is illegal to sell in the covered jurisdiction.
There are already US laws that work in that way. Consider DDT, if fruit was grown with DDT you can't sell it in the US. It doesn't matter that DDT is legal in the grower's country.
Taken as given the statement in the parent that "there's no moral principle by which it would be right to expose and open-source their code.", then that sort of transitive requirement ought not be imposed. Though I don't agree with that statement.
>There are countless games that can't be played any more only because a company behind it has discontinued its DRM servers.
I want those companies (and especially connected device manufacturers in similar positions) to be liable for damages. It's basically destruction of property.
Except software isn't property. You don't even own it. It's all some "limited license to execute the code on your own hardware", because the one thing software companies will never abide is the doctrine of first sale[1].
It's more like renting for an undisclosed period of time than buying, and perhaps labeling should be required to reflect this. I think the first sale doctrine needs holes plugged and the third party doctrine should probably be binned or at least heavily modified. We're well over a decade into an era that many of our laws just are not fit for. Of course, many of those legal deficiencies were bought and paid for, so I'm not holding my breath.
> I think software and firmware should be free if critical functionality depends on an Internet-based service.
Anything that requires the effort or labor of another person should not be free regardless of criticality. Get a private or public entity to pay the providing person or group.
That's fine, people deserve to be paid. Is it fine to be compelled to keep selling the service under the original terms as long as your customers still have your device?
I mean we're talking about hardware that becomes trash the moment the servers stop working. I think there's a legitimate interest in keeping functional products out of landfills.
Perhaps they should have the option of publishing any specifications, APIs, and keys necessary for a third party to build equivalent functionality rather than source code. This stuff usually isn't rocket science, but it's often locked behind some sort of DRM, or requires exponentially more time without simple docs.
> My primary use of non-free firmware is to stop free riding by hardware cloners. It really sucks to put a lot of effort into a hardware and software design, then have your product get cloned using your own firmware.
That's a massive problem indeed. In ham circles, the tinySA spectrum analyzer is one prime example... there's tons of clones of them floating around despite the hardware actually being closed source, but the firmware being FOSS. And people turn up in ham forums asking why their "tinySA" doesn't work, only to find out they have been shipped counterfeits with sub-par components.
Perhaps that would prevent people from selling clones as if they were authentic, but the stated problem is that clones are undercutting the original creator's devices. Changing the name puts an increased burden on the cloners to get their name out there, but in a hobbyist market surrounding a technical subject, that seems pretty small compared to the burden of developing hw/fw that users are willing to pay for.
If you don't want people to copy and improve your software, don't give them permission to copy and improve your software.
If you don't want people to copy and improve your hardware, don't give them permission to copy and improve your hardware. It works the same as software copyright, AFAIK.
If you don't want people to make their own products that are similar to yours... sorry, that's not possible. Imagine what the world would be like if Elon Musk could claim a monopoly on Twitter-like services - if Mark Zuckerberg didn't first claim a monopoly on all social media - if IBM didn't first claim a monopoly on all computing products. You get to differentiate your product based on price and features, just like everyone else does.
> If you don't want people to copy and improve your hardware, don't give them permission to copy and improve your hardware.
As if Chinese cloners would give a shit about permissions. You got everything there from "independents" reverse-engineering clothing or circuits over the "official" manufacturing contractor running "ghost shifts" [1] with original parts that got binned in QA to supposedly trustable brands getting shafted on their end [2].
Regarding electronics, bunnie has also dedicated a chapter about fakes in his book about that world [3]. China simply runs on an entirely different value system than Western countries, one where what we see as theft they see as a completely normal part of doing business [4]. Unfortunately, our politicians never realised that and still continue treating China as an "equal partner".
But that was the original problem. People bought the device thinking it was authentic when it wasn't. If they're still selling crap with your name on it then how does it help you that it's subpar hardware running subpar firmware instead of subpar hardware running better firmware?
If your name's on it it's a trademark violation. If they merely said "our product is similar to tinySA" then nothing you can do - you don't get a special exemption from the laws of market competition, and it seems that customers like their product more.
Sellers in China list counterfeit products on eBay. eBay never has possession of the merchandise. People in the US buy them. They get shipped directly from China to the end customer. Who is the importer? The postal service?
The concern here is that innocent buyers are misled and end up with counterfeit products. Your proposal is for the innocent buyers to be misled, end up with counterfeit products, and then get sued for it?
I get why they want control over this stuff (simplification, segmentation and obsolescence) but the very second they lose interest, stop supporting it, owners of the hardware should have access and a licence to the source and build instructions.
Not even necessarily open source. Just something better than automatic ewaste.
How do you reliably (e.g. in a legally definable way) differentiate between "stopped supporting" and "haven't released an update in a while because it works fine and there are no major bugs"?
One idea I have heard is that you pay $AMOUNT yearly to some registrar to be exempt from that rule, and with that payment you agree to support the product for another year. Stopping payment means you stop supporting it and are therefore required to release the plans. Going bankrupt/out of business does the same.
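The registrar rule described above is simple enough to state as a predicate. A hypothetical sketch (the function and its parameters are made up for illustration, not any real statute):

```python
def must_release_plans(paid_through_year: int, current_year: int,
                       in_business: bool) -> bool:
    """Hypothetical registrar rule: each yearly payment extends the
    support obligation through that year; lapsing (or going out of
    business) triggers the obligation to release the plans/source."""
    return (not in_business) or current_year > paid_through_year

# A vendor that paid through 2024 but not 2025 must release in 2025:
print(must_release_plans(2024, 2025, in_business=True))  # True
```

One nice property of phrasing it this way: "support" stops being a fuzzy judgment about update cadence and becomes a bright-line question of whether the fee was paid.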
Turning off the central servers is a big clue ;) Happened to me with a "Kodak" baby monitor. Still-great hardware left with 10% function.
I accept there is some murky middle ground, so maybe there shouldn't be a hard limit. You buy the hardware, you assume the right to alter what it runs (but lose official support thereafter).
When a consumer can point to a major bug or security vulnerability that the manufacturer has not fixed within a reasonable period of time.
That said - I think the above proposal is "release it immediately for the eventuality where they stop supporting it", not "require it be released when they stop supporting it".
I think even defining "major" here is going to be hard. E.g. a lot of CVSS scores are 8 to 10 because of the _impact_ and not the _exploitability_.
So is a very annoying bug that does not have any impact major, or not major? Like, my internet radio sometimes has connectivity issues. It resolves itself, but takes maybe 10-15s. After that, it works fine for a couple of hours or even a day. I wouldn't consider that major, because the product is usable in its intended way; it's just annoying.
I think the court system is generally capable of resolving whether or not a bug makes a product defective. Courts and the legal system are very experienced at dealing with ambiguity.
Absent marketing to the contrary (prior to sale), I would consider a software-defined radio that cuts out for 10-15s at a time defective. That outright breaks a lot of use cases. If that's a software issue (and not instead the result of something like damage to your particular unit), I would expect it to be fixed in a reasonable period of time for the product to be considered supported.
You don't: firmware should always be available. I have too many repairable devices which are actually dead because I can't replace a blown microcontroller since the firmware isn't available.
No one should be under an obligation to offer services at cost. It's not even a meaningful concept: if I say the cost of an hour of my time is $N dollars, well, then it is.
There is a pretty good argument that this should be required for copyright protection. The point of copyright is supposed to be to ultimately get works into the public domain, when the copyright expires. If the source code has never been released then it's failing at its purpose. Once a work is in the public domain the public is supposed to be able to make derivatives etc. So to have copyright protection for software, you should have to release the source code.
A noble idea, but it should be pointed out that of all of the software ever written on planet Earth, only Ada Lovelace's somewhat notional code for a non-existent machine would be in the public domain in the year 2024. The earliest you would expect to see "modern" code entering the public domain is sometime in the 2060s, and that would be written in Plankalkül.
Obviously, if we’d like to require programmers to release software in source code form then we’re going to need some rules. If programmers are made to release software in some source code form then what languages should we require? Clearly, Brainfuck is out of the question, for example. Though, it would do no good to legislate the programming community into a religious war over FP or OOP. My proposal, however modest, is that the law require software be released in some kind of well-specified, portable language. Something succinct and powerful. Something symbolic, abstract, and much higher-level than the underlying hardware. Hm. How about x86_64 machine code?
That's generally handled with a "preferred form" clause -- it must be released in the form it was written in. If you originally wrote the firmware in Brainfuck, it's fine to release in Brainfuck. But you can't transpile to Brainfuck and release that.
If a vendor wants their software proprietary and secret, I can respect that and would be perfectly fine with full hardware schematics and documentation, since the hardware is something I physically own anyway.
In the ancient times, providing schematics (and even repair manuals) for things like TVs or cars was the norm. Obviously, that's not happening now, but as a fantasy wish, I would love to see this become a thing again.
Not sure if we can go that far, but I'd be happy if manufacturers were forced to put on the label how many years they'll support the product, and at what level.
And perhaps when the product is fully out of support, all the related resources should be published so it can be repaired and supported further on a best effort basis by the community now that the manufacturer no longer loses any money on it.
IMHO, optimally there should be no force (regulation) necessary at all. It'd be much more beneficial if manufacturers saw the benefits because consumers choose according to the benefits they see. Sometimes we have to ask to make a need known - especially if we're willing to pay a premium for it. So I see the ball in our court. If there's demand but no supply, we've found a niche.
> optimally there should be no force (regulation) necessary
Ideally. But realistically, most of the rights we have today exist because of regulation. Warranty isn't something most manufacturers would offer once they're too big to fail, but regulation demands it. Same with certain levels of quality (certifications), etc.
As a purchaser of hardware, I want full control over the stuff I own, including the right to modify firmware. But I'm also sympathetic to hardware manufacturers, especially in this day and age where information travels so fast.
A hardware company can say your warranty is void if you modify your purchase's firmware. But it can't stop you from flashing back the original firmware, saying it was broken when you received it, and still making a bogus warranty claim. Nor can it stop you from writing a terrible Amazon review about how complicated the product is, when all your complaints are self-inflicted from modified firmware you found online that the company didn't even write.
Companies are responsible for the entire relationship between their products and their customers. So I can see why they want to define hard boundaries around it.
I think anyone selling commercial products to the public that include software/firmware components should be required to submit source code and binary artifacts to the Library of Congress. The library can then release it to the public when copyright expires or the vendor approves it.
There's a difference between production cost and sale price. The latter is the one being referred to by the GP. The article itself is about neither: it talks about software freedom, not monetary cost.
> The maximalist position is not to compromise at all - all software on a system, whether it's running at boot or during runtime, and whether it's running on the primary CPU or any other component on the board, should be free.
As stated, I don't think this actually makes for a consistent position. The author throws out the terms "system" and "board", apparently hoping they define a good boundary, but they actually don't. PCIe expansion cards and hard drives are both not "on the board" but still part of the "system". Peripherals and other support devices (keyboard/usb switch/UPS/serial server) are not "on the system" or perhaps not even part of "the system" at all, yet significantly affect the system.
Defining that boundary is important, because without it you end up with a wishy-washy escape hatch like "within which software installation is not intended after the user obtains the product", making the definition completely fail at being a maximalist position. (And when taken as a maximalist/hardline position, it ends up causing strife...)
The original FSF position was based on the copyright license status of what they shipped on distribution media. This is neatly consistent, but completely outmoded in today's globally networked software-pervasive environment. The spirit of this lives on in things like linux-libre.
I'd say that any modern definition of software freedom must pull in a larger scope that focuses on effective individual freedom in the face of pervasive global networking. A mouse, a video card, and an IoT device can all have proprietary updateable firmware, but drastically different effects on your individual freedom - I've never wanted to change my mouse firmware; if/when video card firmware is updated is (almost) completely under your control and the card lacks easy backhaul; whereas a proprietary IoT device is essentially a rogue agent on your network.
My own definition involves something like a much more fine-grained definition of device, analyzing how easy it is to modify/inspect code on each device, as well as analyzing trust/security relationships between devices. This ends up being independent from maximalism, as you can choose how important it is for any given device to be libre or not, and see how any compromise specifically affects your freedom.
For example I recognize that my hard drives themselves are completely and utterly non-free. But also my desire for libre hard drives is pretty low as well. The libre device/system those drives store data for protects itself by the use of full disk encryption, and the drives are unlikely to become active attackers due to lacking any network connections besides SATA/SAS to the libre system. Meanwhile the system they're attached to is a KGPE-D16 running libreboot, which has nearly zero proprietary code running in its main CPU domain. I compromise on CPU microcode because the small binary size means low complexity, I choose if/when to update, and it's similar trust to the unavoidable shipped silicon. But no javascript (I run web browsing on another machine with virt-viewer =). I'd say this is a much healthier way of analyzing software freedom, whereas the traditional way kind of lumps many things together and then finds reasons to ignore inconvenient details.
Modern drives are directly connected to the PCIe bus (over NVMe) and even have DMA access IIRC, which potentially enables them to do much more than just store bits of data.
FWIW, even external drives connected through TB have similar level of access.
That's a great point - modern drives are actual PCIe devices themselves, not the more restricted SATA protocol.
The librebooted machine that I was talking about has a properly situated IOMMU that should protect it against rogue PCIe devices (otherwise this same point applies to things like network card firmware). But it's important to keep in mind that many motherboards do not.
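For anyone wanting to check whether their own Linux machine actually has an IOMMU active (and thus some protection against rogue DMA-capable PCIe devices), a rough sanity check is to look for populated IOMMU groups under sysfs. This is a sketch, not a guarantee: the `/sys/kernel/iommu_groups` path is standard on modern kernels, but the boot parameters needed to enable the IOMMU (e.g. `intel_iommu=on` or `amd_iommu=on`) vary by platform and firmware.

```shell
#!/bin/sh
# Sanity-check whether the kernel has an active IOMMU by looking for
# populated groups under sysfs (standard path on modern Linux kernels).

# iommu_active DIR: succeed if DIR exists and contains at least one entry.
iommu_active() {
    [ -d "$1" ] && [ -n "$(ls -A "$1" 2>/dev/null)" ]
}

if iommu_active /sys/kernel/iommu_groups; then
    echo "IOMMU active: $(ls /sys/kernel/iommu_groups | wc -l) group(s)"
else
    echo "No IOMMU groups found; DMA-capable PCIe devices may be unrestricted"
fi
```

An empty (or missing) `iommu_groups` directory means devices are not isolated, and the "rogue PCIe device" concern above applies in full.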