Obviously. Hardware designers absolutely love to think that hardware design is totally different to software design and only they have the skills, but in reality it's barely different. Stuff runs in parallel. You occasionally have to know about really hardware things like timing and metastability. But the Venn diagram of hardware/software design skills is pretty much two identical circles.
The reason for the "talent shortage" (aka "talent more expensive than we'd like") is really just because hardware design is a niche field that most people a) don't need to do, b) can't access because almost all the tools are proprietary and c) can't afford, outside of tiny FPGAs.
If Intel or AMD ever release a CPU range that comes with an eFPGA as standard that's fully documented with free tooling then you'll suddenly see a lot more talent appear as if by magic.
> The reason for the "talent shortage" (aka "talent more expensive than we'd like") is really just because hardware design is a niche field that most people a) don't need to do, b) can't access because almost all the tools are proprietary and c) can't afford, outside of tiny FPGAs.
Mostly B. Even if you work at a company that does both, you'll rarely get a chance to touch the hardware as a software developer because all the EDA tools are seat-licensed, making it an expensive gamble to let someone without domain experience take a crack at it. If you work at a Verilog shop you can sneak in Verilator, but the digital designers tend to push back in favor of vendor tools.
> digital designers tend to push back in favor of vendor tools.
Which is fair in my experience, because Verilator has serious limitations compared to the other three big commercial simulators: no 4-state simulation (though that is apparently coming!), no GUI, no coverage, no UVM, etc. UVM is utter shite tbf, and I think they are working on support for it anyway.
Also it's much slower than the commercial simulators in my experience. Much slower to compile designs, and runtime is on the order of 3x slower. Kind of weird because it has a reputation for being faster but I've seen this same result in at least two different companies with totally different designs.
I gave up on Verilator support in a previous company when we ran into a plain miscompilation. There was some boolean expression that it simply compiled incorrectly. Difficult to trust with your $10m silicon order after that!
It's definitely nice that it doesn't require any ludicrously expensive licenses though.
In fact I'll go further: in my experience people with a software background make much better hardware designers than people with an EE background because they are aware of modern software best practices. Many hardware designers are happy to hack things together with duct tape and glue. As a result, most of the hardware industry is decades behind the software industry in many ways, e.g. still relying on hacky Perl and Tcl scripts to cobble things together.
The notable exceptions are:
* Formal verification, which is very widely used in hardware and barely used in software (not software's fault really - there are good reasons for it).
* What the software guys now call "deterministic system testing", which is just called "testing" in the hardware world because that's how it has always been done.
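To make the formal-verification point concrete: formal tools reason symbolically about every possible input, rather than sampling a few. As a toy software illustration of the same idea, here is an exhaustive equivalence check of a tiny combinational block against its spec (the function names are my own, purely for illustration; real equivalence checkers prove this symbolically instead of by enumeration):

```python
from itertools import product

def mux_rtl(sel: bool, a: bool, b: bool) -> bool:
    """'Implementation': a 2:1 mux written as a boolean expression."""
    return (sel and a) or (not sel and b)

def mux_spec(sel: bool, a: bool, b: bool) -> bool:
    """'Specification': the intended behaviour, stated directly."""
    return a if sel else b

# Check the whole input space. For small combinational logic this is
# exactly the guarantee an equivalence checker gives you, and it is a
# guarantee no finite set of hand-written test cases can match.
for sel, a, b in product([False, True], repeat=3):
    assert mux_rtl(sel, a, b) == mux_spec(sel, a, b), (sel, a, b)
print("equivalent on all 8 input combinations")
```

The point of the sketch is the quantifier: "for all inputs", not "for the inputs I thought of".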
> in my experience people with a software background make much better hardware designers than people with an EE background because they are aware of modern software best practices.
I know them. Especially the older folks. Cramming all the parts onto one huge sheet instead of separating them by function. Refusing to use buses. Refusing to put part numbers into the schematic so the BoM could be exported directly, writing the BoM out by hand instead.
Watching these guys is like watching an office worker typing values from an Excel sheet into a calculator so he can write the result back into the same Excel table.
Age has an effect, no matter whether it's software or electronics. These types learned their trade once, decades ago, and have kept driving the same way ever since.
If you want old dogs to learn new tricks, teach them. No company has the money to spend nor the inclination to even suggest education to their workers. Companies usually consider that a waste of time and money. I don't know why. Probably because "investing" in your work force is considered stupid because they'll fire you the moment a quarterly earnings call looks less than stellar.
> If you want old dogs to learn new tricks, teach them
These guys are the epitome of arrogance: "I have been doing this for N years, you have nothing to teach me!" Then the same guy will spend several hours straight staring at a prototype board that is hard-shorted because he accidentally created a junction in his schematic. ERC (the electrical rules check) would have caught it, if he had bothered to run it...
>* Formal verification, which is very widely used in hardware and barely used in software (not software's fault really - there are good reasons for it).
When developing with C, model checking or at least fuzzing is practically mandatory, otherwise it is negligent.
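For C code the standard tools are libFuzzer or AFL, but the shape of a fuzz harness is simple enough to sketch in a few lines. Here is a minimal one in Python (the parser, seed, and input format are all made up for illustration); note that the fixed seed makes every run reproducible, so any crash can be replayed exactly:

```python
import random

def parse_length_prefixed(data: bytes) -> bytes:
    """Toy parser under test: first byte is a length, rest is payload."""
    if not data:
        raise ValueError("empty input")
    n = data[0]
    if n > len(data) - 1:
        raise ValueError("length exceeds payload")
    return data[1 : 1 + n]

# Seeded RNG: the whole campaign is deterministic and replayable.
rng = random.Random(1234)
for _ in range(10_000):
    blob = bytes(rng.randrange(256) for _ in range(rng.randrange(0, 32)))
    try:
        parse_length_prefixed(blob)
    except ValueError:
        pass  # rejecting malformed input is fine; crashing is not
print("10000 fuzz cases, no unexpected exceptions")
```

Coverage-guided fuzzers add feedback (mutating inputs that reach new code paths), but even a dumb loop like this catches a surprising number of out-of-bounds reads in hand-rolled parsers.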
>> Stuff runs in parallel. You occasionally have to know about really hardware things like timing and metastability. But the venn diagram of hardware/software design skills is pretty much two identical circles.
I don't know your background, but this feels like it comes from someone who hasn't worked on both aspects of a non-trivial industry project. The thing is, software spans a huge range: web FE/BE, GUI, databases, networking, OS, compilers, HPC, embedded, etc. Not all of those give you the background to be a good HW designer. Sure, you can design HW as if you were writing software, but it won't be production worthy, not when you are pushing the boundaries.
My work straddles both HW architecture and SW. I design processors and custom ISAs optimized for SW application algorithms, and I ensure the micro-architecture implementation on the HW side is optimized to meet PPA targets. I sit at the intersection of HW, SW, and verification. People like me are rare, not just in my company but in the industry. Things fall through the gap if you don't have someone to bridge it, and then you end up with a sub-optimal design.
I'm not saying SW people can't learn HW design; there is nothing magical about it, after all, just hard work and practice. But to say that the Venn diagram is two identical circles is plain wrong. The cognitive load of shuttling up and down both the HW and SW stacks is a lot higher than staying within either one.
> someone who hasn't worked on both aspects of a non-trivial industry project
I have.
When I say software I mean e.g. proficient C++/Rust developers. There's absolutely no reason any of them would struggle with silicon design. Yet silicon designers treat it as if it's some fundamentally different skill, like the difference between playing a piano and a trombone, rather than something more like the difference between programming GPUs and CPUs.
>> rather than something more like the difference between programming GPUs and CPUs.
Again, I get your point, but you are really trivializing HW design here, and I don't want anyone starting out or migrating from SW to get the wrong impression that they can just pick it up. Sure, with enough thought, patience, skill, and hard work anyone can do it, but that applies to anything. Don't expect that just because you know Scala or are a good parallel programmer you can design HW that is PPA competitive. You have a better shot than others, but that's it.
Exactly, money is the problem. I am a hardware designer by trade. I have no problem sitting down, creating a PCB in KiCad, and having it come out perfect on the first try. But I only do this as a hobby because it does not pay much. SWE simply pays better, even with the AI scarecrow looming behind it.
At least in the US, yes. Check out general1465's reply to me.
The problem, I think, is that there are many competent hardware design engineers available abroad and since hardware is usually designed with very rigorous specs, tests, etc. it's easy to outsource. You can test if the hardware design engineer(s) came up with an adequate design and, if not, refuse payment or demand reimbursement, depending on how the contract is written. It's all very clear-cut and measurable.
Software is still the "Wild West", even with LLMs. It's nebulous, fast-moving, and requires a lot of communication to get close to reaching the maintenance stage.
I'm talking about chip design: Verilog, VHDL, et al.
Very specifications-driven and easily tested. Very easy to outsource if you have a domestic engineer write the spec and test suite.
Mind you, I am not talking about IP-sensitive chip design or anything novel. I am talking about iterative improvements to well-known and solved problems e.g., a next generation ADC with slightly less output ripple.
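The "spec and test suite" contract works because acceptance can be mechanized: compare the deliverable against an executable golden model over vectors the domestic engineer wrote. A minimal sketch of that pattern (both functions and the vector set are invented here for illustration; a real flow would compare RTL simulation output against the model):

```python
def reference_model(x: int) -> int:
    """Spec author's executable reference: saturate to 8 bits."""
    return max(0, min(255, x))

def delivered_design(x: int) -> int:
    """Stand-in for the outsourced implementation under test."""
    if x < 0:
        return 0
    return x if x < 256 else 255

# Acceptance criterion: bit-exact agreement on every vector in the
# contractually agreed test set, with boundary values included.
test_vectors = [-1000, -1, 0, 1, 127, 128, 255, 256, 5000]
mismatches = [x for x in test_vectors
              if delivered_design(x) != reference_model(x)]
assert not mismatches, f"deliverable rejected, mismatches at {mismatches}"
print("all vectors match the reference model: deliverable accepted")
```

Pass/fail is unambiguous, which is what makes "refuse payment or demand reimbursement" enforceable.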
Sure, so, yeah "general1465" seemed to be talking about PCB Design.
And from what I know of SemiEngineering's focus, they're talking about chip design in the sense of processor design (like Tenstorrent, Ampere, Ventana, SiFive, Rivos, Graphcore, Arm, Intel, AMD, Nvidia, etc.) rather than the kind of IP you're referring to. Although, I think there's still an argument to be made for the skill shortage in the broader semiconductor design areas.
Anyway, I agree with you that the commoditized IP that's incrementally improving, while very important, isn't going to pay as well as the "novel stuff" in processor design, or even in things like photonics.
Definitely not. You do normally have pretty good specifications, but the level of testing required is much higher than software.
> Very easy to outsource
The previous company I was in tried to outsource some directed C tests. It did not go well. It's easy to outsource but it's even easier to get worthless tests back.
> the level of testing required is much higher than software
No dispute there. I suppose I meant "simply" instead of "easily".
Outside of aeronautics software (specifically, aviation and spaceships/NASA), the topology of the software solution space can change dramatically during development.
Stated differently: the cyclomatic complexity of a codebase is absurdly volatile, especially during the exploratory development stage, but even later on... things can very abruptly change.
AFAICT, this is not really the case with chip design. That is, the sheer amount of testing you have to do is high, but the very nature of *what you're testing* isn't changing under your feet all the time.
This means that the construction of a test suite can largely be front-loaded which I think of as "simple", I suppose...