Age-old lesson: change the tires on the moving vehicle that is your business when it's a Geo Metro, not when it's a freight train.
I'm sure the people with the purse strings didn't care, though, and just wanted to funnel the GH userbase into Azure until the wheels fell off, then write off the BU. Bought for $7.5B, it used to make $250M, but now makes $2B, so they could offload it at a profit. I wonder who'll buy it. Prob Google, Amazon, IBM, Oracle, or a hedge fund. They could choose not to sell it, but it'll end up a writeoff if the userbase jumps ship.
I'm surprised nobody has tried to throw together a commercial alternative to GitHub. 50% of it is available as FOSS, and the other 50% you can vibecode in a month (you can vibecode reliably, Microsoft/Google just suck at it). Afaict, the reason we all keep using GitHub is that it has a million features and isn't as ugly, difficult, and slow as GitLab. (sorry GitLab, I love your handbook, hate your UX)
Remember when people started using WTFPL because it "sounded good", only to later find out it left them and their users legally liable? This is that but for websites.
And the public doesn't have to audit it. The govt already audits/inspects/validates plenty of sensitive physical products, typically through 3rd party industry associations. You don't get to peek inside, but people signing NDAs do.
Even if this wasn't done, at the very least they must publish their software testing procedures, the way UL, ETL, and CSA require to certify devices for the US power grid. (https://www.komaspec.com/about-us/blog/ul-etl-csa-certificat...) They can also do black box testing.
But ideally they would actually inspect the software to ensure its design is correct. Otherwise vibe-coded apps with swiss cheese code will be running critical infrastructure and nobody will know until it's too late.
> What you need is not a government mandate for infallibility, it's updates
So, we don't need an electrical code to enforce correct wiring. We just need a kind soul driving by our house to notice the company who built our house wired it up wrong. Then that kind person can inform the company of the bad wiring.
And if the company agrees it's their wiring at fault, we can wait 3 months for a fix. Then the next month another kind soul finds more bad wiring. And we just have to hope there is an army of kind strangers out there checking every building built by every company. And hope in the meantime that the building doesn't burn down.
Meanwhile, people have to live with bad wiring for years, that could have been completely prevented to begin with, by an electrician following the electrical code we all already agree on.
> So, we don't need an electrical code to enforce correct wiring.
For an analogy to work, its underlying elements should have a relation to the target. Your analogy is not in the same universe. For electrical work, there is a baseline of materials and practices which is known to produce acceptable results if adhered to. For software, there isn't. (Don't tell me about the Space Shuttle. Consumer software doesn't cost tens of millions and isn't written with dedicated teams over the decades.)
The analogy does work. The house is any software provided by any vendor. The kind strangers are white hat security researchers. The people living in the house are the users.
Software absolutely has baseline materials, have you never written software before? Never used a library? Programming language? API? Protocol? Data format or specification? CPU instruction? Sorting algorithm? A standard material is just a material tested to meet a standard. A 10d nail is a 10d nail if it meets the testing specs for 10d nails (ASTM F1667). Software can be tested against a spec. It's not rocket surgery.
No known practices with acceptable results?? Ever heard of OWASP? SBOMs? Artifact management? OIDC? RBAC? Automated security scanning? Version control? Code signing? Provenance? Profiling? Static code analysis? Strict types? Formal proofs? Automated testing? Fuzzing? Strict programming guidelines (ex. NASA/DOD/MISRA/AUTOSAR)? These are things professionals know about and use when they want standard acceptable results.
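To make the "tested against a spec" point concrete, here's a minimal sketch in Python. The spec and function names are invented for illustration; the idea is the same black-box conformance testing a 10d nail gets, applied to a sort routine whose spec is "output is ordered and is a permutation of the input":

```python
# Hypothetical illustration of black-box spec conformance testing.
# The "spec" here is invented for the example: a sort routine must
# return its input reordered into non-decreasing order.
import random

def spec_check(sort_fn, trials=1000):
    """Check sort_fn against the stated spec on randomized inputs."""
    for _ in range(trials):
        data = [random.randint(-1000, 1000) for _ in range(random.randint(0, 50))]
        out = sort_fn(list(data))
        # Spec clause 1: output is in non-decreasing order
        assert all(a <= b for a, b in zip(out, out[1:])), "not ordered"
        # Spec clause 2: output is a permutation of the input
        assert sorted(out) == sorted(data), "not a permutation"
    return True

print(spec_check(sorted))  # True
```

A real conformance suite would be bigger, but the shape is the same: a written spec, a battery of tests, pass/fail.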
What are you talking about re: space shuttle and tens of millions? Have you actually read the coding standards for Air Force or NASA? They're simple, common-sense guidelines that any seasoned programmer would agree are good to follow if you want reliability.
I think the problem here is there's too many armchair experts saying "Can't be done" when they don't know what they're talking about, or jaded old fogeys who were on some horrible government project and decided anything done with rigor will be terrible. That's not the way it is in the trades, in medicine, in law, and those folks actually have more to think about than software engineers, and more restrictions. I think SWEs are just trying to get out of doing work and claiming it's too difficult, and the industry doesn't want to stop the free ride of lack of accountability it's had for decades.
AI is going to introduce 100x more security holes than before, so something will have to be done to improve security and reliability. We need to stop screwing around and create the software building code, before the government does it for us.
> What are you talking about re: space shuttle and tens of millions?
GP was almost certainly referring to "They Write the Right Stuff," an old article that is pretty well known in spaces like this. It discusses a process that (a) works extremely well (the engine control software was ~420 kLoC with a total of 17 bugs found in a window of 11 versions) and (b) is extremely expensive (the on-board shuttle software group had a budget of ~35 million per year in mid-90s dollars).
I mean, this is still a semi-bs response in your case, even if you don't realize it.
Many of these devices have security flaws that are horrific and out of best practices by over a decade.
Just having something like "Have a bonded 3rd party security team review the source code and running router software" would solve around 95% of the stupid things they do.
> Just having something like "Have a bonded 3rd party security team review the source code and running router software" would solve around 95% of the stupid things they do.
It would certainly help, but no economically feasible amount of auditing and best practices could lead to having a warranty on that software. My thesis is that our current understanding of software is fundamentally weaker than that of practical applications of electricity, so it makes no sense to present analogies between the two.
> So, we don't need an electrical code to enforce correct wiring.
Are you familiar with how the actual electrical code works? It's a racket. The code is quite long and most of the inspectors don't know most of it, so only a small subset is ever actually checked, and that only in the places where the person doing the work is actually pulling permits and the local inspector isn't corrupt or lax in the areas the local tradespeople have learned they can skimp on. Then we purposely limit the supply of licensed electricians so that they're expensive enough that ordinary people can't afford one, so that handyman from Craigslist or whatever, who isn't even allowed to pull permits, is the one who ends up doing the work.
It only basically works because no one has the incentive to purposely burn down your house and then it only happens in the cases where the numerous violations turned out to actually matter, which is rare enough for people to not get too upset about it.
But the thing that makes it a racket is making the official process expensive on purpose to milk the wealthy homeowners and corporations who actually use it, which is the same thing that drives common people to someone who charges a price they can afford, even knowing there will then be no inspection.
> Then that kind person can inform the company of the bad wiring.
The point is rather that when the homeowner discovers that their microwave outlet is heating up, they can fix it themselves or hire an independent professional to do it instead of the company that built the house (which may or may not even still exist) being the only one who can feasibly cause it to not stay like that until the house is on fire.
I mean, if you could download an update that would fix the wiring in your house, it would be much less critical that the initial installer got it right. (Still much more important than your router, though; it doesn't stop being an electrocution hazard during the un-updated period.)
Trying to make analogies from software to hardware will always fall down on that point. If you want to argue that there should be stricter security & correctness requirements for routers, maybe look more toward "here is how people actually treat them in practice" with regard to ignoring updates...?
> I mean, if you could download an update that would fix the wiring in your house, it would be much less critical that the initial installer got it right
As in my example, some random stranger needs to first find out your "house" (the vendor's software) is wired wrong. And this needs to happen for every "house" (every piece of software). While waiting for this to be discovered, your house burns down (hackers penetrate millions of devices, or perhaps just the Microsoft SharePoint the govt uses).
> This allowed the threat actor to perform authenticated operations, including force-updating tags
Hey look, infrastructure underpinning the security of thousands of products, being compromised in a way a simple setting could have prevented ("Do not allow overriding tags" is an old GH setting). Yet another reason we need a Software Building Code. I wonder how many more of these reasons we'll find in 2026.
> Users on civilian network can continue downloads through the Advance tab in the error message.
They are literally telling users to click through the browser errors about the bad cert. They don't mention that there is a very specific error they should be looking for (expired cert). This gives any MITMer the opportunity right now to replace downloaded executables with malware-laden ones using nothing more than a self-signed cert and a proxy. You can bet your boots China, NK, Iran, Russia are all having a good laugh. Biggest military in the world and they can't get a web server working.
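The distinction matters because a client that actually validates can tell "cert merely expired" apart from "untrusted/self-signed cert" (the MITM case), whereas a user who clicks through the Advanced tab sees neither. A minimal sketch using Python's stdlib; the timestamp below is made up for the example:

```python
# ssl.cert_time_to_seconds parses the notAfter/notBefore timestamp
# format returned by getpeercert(). Comparing it against the clock
# tells you whether the *only* problem is expiry (the server needs a
# cert rotation) vs. something worse like an untrusted chain, which
# could be a MITM. Click-through users never see this distinction.
import ssl
import time

def cert_expired(not_after, now=None):
    """True if the cert's notAfter timestamp is in the past."""
    expiry = ssl.cert_time_to_seconds(not_after)
    return (now if now is not None else time.time()) > expiry

print(cert_expired("Jan  5 09:30:00 2020 GMT"))  # True: long past
```

Note that `ssl.create_default_context()` fails closed on both cases by default, which is exactly the behavior the DoD guidance is telling users to override.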
Oh wow, they really are telling people to bypass the cert warning! It's a shame that the average layperson won't understand how breathtakingly stupid this is, because more people need to be paying attention to the staggering incompetence of the US military under this administration.
Honestly this isn't even the first time this kind of advice has been given to non-DoD users needing to access a DoD service over commercial means.
The Navy a few years back were experimenting with letting users check basic HR things in their service record (e.g. to request days off) and despite the leadership's stated intent being for Sailors to be able to do this on their actual personal mobile devices, the IT people duly signed all the relevant server certs under the DoD PKI "because policy forces us to", and then cooked up user training guides that patiently explained to Sailors how to bypass security warnings in their browser.
So if nothing else at least there's experience to go by here, ha.
Well it means that you can MITM a user and they won't know the difference (an expired cert is an expired cert, whether it's self-signed or not, the user clicks through anyway). It also means nobody is doing the regular maintenance to rotate keys and do upgrades/patches/etc.
We need a software building code. This wouldn't be allowed to happen with non-software. The fact that anyone can build any product with software, make it work terribly, and when it fails impacts the lives of thousands (if not millions), needs to be stopped. We don't allow this kind of behavior with the electrical or building code. Hell, we don't even allow mattresses to be sold without adding fire resistance. The software that is critical to people's lives needs mandatory minimum specifications, failure resistance, testing, and approval. It is unacceptable to strand 150,000 people for weeks because a software company was lazy (just like it was unacceptable to strand millions when CrowdStrike shit the bed). In addition to approvals, there should be fines to ensure there are consequences to not complying.
It's great to assert "we need" but I implore you to consider the downsides first.
I work for an electrical contractor and I don't think being annoyed by shitty UI is nearly the same problem as electrical fires. Why govern the whole set of software with 1 set of rules?
Software isn't safety critical until it is, but we already have code to regulate software on electrical equipment, planes, etc. Why do you recommend software have a code? I'd much rather each individual thing that's safety critical have regulations around software in place than have to learn a 4000 page manual that changes every time you cross a jurisdiction, where enforcement varies, etc.
Software engineers can't even agree on best practices as is.
Imo, put the code around the safety critical thing (e.g. cars, planes, buildings). Restricting "critical" software will only get abused the way essential workers did during covid.
Also keep in mind the way building code gets enforced: you get an inspection upon completion or at milestones. Software has a tendency to evolve and need maintenance or added features afterward; I don't want to trust this to a bureaucrat. I don't like Google or Apple getting involved on "their platform" and I certainly don't want an incompetent government getting involved.
Before we have a software code, let's make and adopt some guidelines we can agree to. In construction, plenty of builders have their own sets of internal rules that are de facto codes. When one of those gets popular enough for life safety software, let's consider pushing for that.
The solar power industry was born, rolled out products, learned from their failures, and implemented electrical and building code changes, in a third of the time that the software industry has existed.
We already know what the failures are. We already know what the solutions are. We know it because people have been born and died in the span of time we have been dealing with these same problems. There is no need to assemble guidelines (that no company would follow anyway without being forced to).
> Software engineers can't even agree on best practices as is.
I'm not talking about "best practice"; I'm talking about, before you ship a build to customers, you must at least run it once to look for errors. This is kid stuff, yet companies don't do it, and subsequently half the flights in the USA are delayed for weeks. There is no need to argue about this; there is no question that there are basic practices that it should be considered malpractice not to do. We must make this law or they will continue to disregard these basic practices and we will continue to suffer for it.
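"Run it once before you ship" can literally be a three-line gate in CI. A hypothetical sketch; the binary path would be your real build artifact (here the current Python interpreter stands in so the example is runnable):

```python
# Hypothetical pre-release smoke test: launch the freshly built
# program once and fail the release if it can't even start.
import subprocess
import sys

def smoke_test(binary, args=("--version",)):
    """Run the build once; a crash or nonzero exit fails the release."""
    result = subprocess.run([binary, *args], capture_output=True, timeout=30)
    return result.returncode == 0

# Using the current interpreter as a stand-in for the build artifact:
print(smoke_test(sys.executable))  # True
```

Wire the boolean into the pipeline (nonzero exit on False) and the "never even launched it" class of shipping failure is gone.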
> Software has a tendency to evolve and need maintenance or add features after
That is a flaw in business practice, it has nothing to do with software itself. I can run a suite of Perl programs today that I wrote 20 years ago, and they run flawlessly. No need for maintenance, it just works. The reason is, we just happened to treat this one language as something that should not break, and should last a long time. No reason we couldn't treat other software the same way. The fact that other software doesn't is a choice by a lazy industry and uncaring business models, and this choice needs to be challenged, the way every industry has had to be challenged by codes (the reason codes exist is industry cannot be trusted to "do the right thing", they need to be forced).
But despite this, codes change all the time. The electrical code changes as solar progresses. Building codes change as we learn new things or new materials are introduced. The codes do change slowly, precisely so the work is well thought-out, coordinated, and safe, which nobody can say about modern software. The time for move fast and break things needs to end.
>That is a flaw in business practice, it has nothing to do with software itself
I don't think it's a flaw, and throwing this away isn't worth a quarter of the hassle that comes with any enforcement implementation I can conceive of. Please think about what testing, safety, and security is "enough", how you test it, and whether it's worth the tradeoffs.
Who is at fault for code violations? The scope of software is generally so big that prearchitected designs don't work, and you must assign life-safety faults to a PE. Software doesn't work like that; it's not singularly done. You shouldn't need to file a permit for expansion to add a feature or plug a security hole.
You point to solar, but solar is less complicated than the things most of this website would deride as simple in software. And the electrical code has hardly changed on its account: solar has only inspired a handful (<12) of changes since the 2008 NEC. Most jurisdictions only update every 2 cycles or so, so we're talking 3 updates.
Move fast and break things is fine when it's okay to break things; software is fundamentally different than physical infrastructure and you paint with really broad strokes here when you just assert "need" and "right".
I work with building code every day and I fundamentally disagree that writing a "critical software code" would be net beneficial.
> Software doesn't work like that, it's not singularly done.
This is definitely a business practice problem, and not some intrinsic quality of software.
And even if your business decides that software is never done, nothing stops you from getting a particular build inspected and certified. It's physically possible. The problems are mostly cultural and in other areas: Governments are generally hesitant to regulate business. Software companies lobby extensively to keep it that way. Vendors and customers are generally not willing to pay the increased cost of software inspections. Developers don't like the idea of being held to a quality standard.
There are lots of "software building codes" IEC-62304, MISRA, DO-178C, etc. Problem is that the vast majority of software doesn't fit into those categories. And as you mention, since you can build any product with software, you would have to have categorization for any new standards to make sense.
> We need a software building code. This wouldn't be allowed to happen with non-software. The fact that anyone can build any product with software, make it work terribly, and when it fails impacts the lives of thousands (if not millions), needs to be stopped.
Uncle Bob had a bit to say about this a while ago[1], and is largely what his Clean Code philosophy is trying to achieve, despite the detractors.
To the detractors and down-voters...
If you're complaining about over-abstraction, arguing about the 'better language' to use, or that 'Clean Code' doesn't work - you're missing the point by a few hundred miles.
The whole point of 'Clean Code' is to establish a body of ethics software 'engineers' collectively agree to, and rigorously follow to avoid "killing 10,000 people with a little software failure".
If you think you can make a better 'Clean Code', do so! That's the whole point about ethics debates. But! You need to proffer an alternative and prosecute why your version is better than Clean Code.
I think a better idea would be that software should not have disclaimers. Authors should assume full responsibility in court if their work misbehaves.
Software is a bit unique in that in almost all cases there are no consequences to responsible parties for malpractice. Shipping defects is the standard of care. Not so for doctors, plumbers, and hair stylists. It’s why I’m so surprised by the AI slop criticism. Humans had no code standards to begin with, at least, not compared to standards of care in licensed professional services.
I have no idea why you'd been downvoted. Everything you said is common sense. I guess this is a case of "it's hard to get a man to understand something if his paycheck depends upon him not understanding it."
EU has the NIS2 directive, the CRA (cybersecurity resiliency act), and a few sector specific ones (DORA for financial, MDR/IVDR for medical/diagnostical, and there's probably a bunch more)
these are slowly but surely pushing manufacturers/sellers/distributors to try to do the right things
it requires transparency about support period commitment, a bug tracker program, issuing updates (I guess in case there's a CVE), doing risk assessment during development, etc., and requirements kick in based on turnover (or headcount).
and it seems like the correct approach, these are already things good products come with
Or maybe it's "the NFPA doesn't need to prevent against your wires suddenly becoming aluminum because somebody discovered new math" like "DSA encryption has been broken" affects software.