Can a Computer Science Student Be Taught to Design Hardware?

(semiengineering.com)

85 points | by stn8188 1 day ago

29 comments

  • EdNutting 1 day ago
    * The fact that there are comments misunderstanding the article, that are talking about PCB Design rather than (Silicon) Chip Design, speaks to the problem facing the chip industry. Total lack of wider awareness and many misunderstandings.

    * Chip design pays better than software in many cases and many places (US and UK included; but excluding comparisons to Finance/FinTech software, unless you happen to be in hardware for those two sectors)

    * Software engineers make great digital logic verification engineers. They can also gradually be trained to do design too. There are significant and valuable skill and knowledge crossovers.

    * Software engineers lack the knowledge to learn analogue design / verification, and there’s little to no knowledge-crossover.

    * We have a shortage of engineers in the chip industry, particularly in chip design and verification, but also architecture, modelling/simulation, and low-level software. Unfortunately, the decline in hardware courses in academia is very long standing, and AI Software is just the latest fuel on the fire. AI Hardware has inspired some new people to join the industry but nothing like the tidal wave of new software engineers.

    * The lack of open source hardware tools, workflows, high-quality examples, relative to the gross abundance of open source software, doesn’t help the situation, but I think it is more a symptom than it is a cause.

    • Tharre 1 day ago
      > * Chip design pays better than software in many cases and many places (US and UK included;

      Where are these companies? All you ever hear from the hardware side of things is that the tools suck, everyone makes you sign NDAs for everything, and that the pay is around 30% less. You can come up with counterexamples like Nvidia I suppose, but that's a bit like saying to just work for a startup that becomes a billion-dollar unicorn.

      If these well paying jobs truly exist (which I'm going to be honest I doubt quite a bit) the companies offering them seem to be doing a horrendous job advertising that fact.

      The same seems to apply to software jobs in the embedded world as well, which seem to be consistently paid less than web developers despite arguably having a more difficult job.

      • EdNutting 1 day ago
        Oh by the way, I agree, NDAs all the time, and many of the tools are super user-unfriendly. There's quite a bit of money being made in developing better tools.

        As for a list of companies, in the UK or with a UK presence, the following come to mind: Graphcore, Fractile, Olix, Axelera, Codasip, Secqai, PQShield, Vaire, SCI Semiconductor, and probably also look at Imagination Tech, AMD and Arm. There are many other companies of different sizes in the UK; these are just the ones that popped into my head in the moment tonight.

        [Please note: I am not commenting on actual salaries paid by any of these companies, but if you went looking, I think you'd find roles that offer competitive compensation. My other comments mentioning salaries are based on salary guides I read at the end of last year, as well as my own experience paying people in my previous hardware startup up to May 2025 (VyperCore).]

      • EdNutting 1 day ago
        Depends if you're looking at startups/scaleups or the big companies. Arm, Imagination Tech, etc. for a very long time did not pay anything like as well (even if you were doing software work for them). That's shifted a lot in the UK in recent years (can't speak for the rest of the world). Even so, I hear Intel and AMD still pay lower base salary than you might get at a rival startup.

        As for startups/scaleups, I can testify from experience that you'll get the following kind of base salaries in the UK outside of hardware-for-finance companies (not including options/benefits/etc.). Note that my experience is around CPU, GPU, AI accelerators, etc. - novel stuff, not just incrementing the version number of a microcontroller design:

        * Graduate modelling engineer (software): £50k - £55k
        * Graduate hardware design engineer: £45k - £55k

        * Junior software engineer: £60k - £70k
        * Junior hardware engineer: £60k - £70k

        * Senior/lead software engineer (generalist; 3+ yoe): £75k - £90k
        * Senior compiler engineer (3+ yoe): £100k - £120k
        * Senior/lead hardware design engineer: £90k - £110k
        * Senior/lead hardware verification engineer: £100k - £115k

        * Staff engineering salaries (software, hardware, computer architecture): £100k - £130k and beyond
        * Principal, director, VP, etc. engineering salaries: £130k+ (and £200k to £250k is not an unreasonable expectation for people with 10+ years' experience).

        If you happen to be in physical design with experience on a cutting edge node: £250k - £350k (except at very early stage ventures)

        Can you find software roles that pay more? Sure, of course you can. AI and Data Science roles can sometimes pay incredible salaries. But are there that many of those kinds of roles? I don't know - I think demand in hardware design outstrips availability in top-end AI roles, but maybe I'm wrong.

        From personal experience, I've been paid a double-digit percentage more as a computer architect in hardware startups than I have in senior software engineering roles in (complex) SaaS startups (across virtual conferencing, carbon accounting, and autonomous vehicle simulations). That's very much a personal journey and experience, so I appreciate it's not a reflection of the general market (unlike the figures I quoted above), and of course others will have found the opposite.

        To get a sense of the UK markets for a wide range of roles across sectors and company sizes, I recommend looking at salary guides from the likes of:

        * IC Resources
        * SoCode
        * Microtech
        * Client-Server

        • mobiuscog 1 day ago
          > Senior/lead software engineer (generalist; 3+ yoe): £75k - £90k

          For London. Maybe higher for Remote US.

          For the rest of the country, it's a fair amount lower, typically around the £60k region.

          • EdNutting 1 day ago
            I was quoting salaries for people in Bristol and Cambridge ;)
    • AshamedCaptain 1 day ago
      > The fact that there are comments misunderstanding the article, that are talking about PCB Design rather than (Silicon) Chip Design, speaks to the problem facing the chip industry. Total lack of wider awareness and many misunderstandings.

      No, there is no misunderstanding. Even the US companies mentioned _in the very article_ that have both software and "chip design" roles (however you call it) will pay more to their software engineers. I have almost never heard of anyone moving from software to the design side, but rather most people move from design side to software which seems like the more natural path.

      • EdNutting 1 day ago
        You've taken two separate points I made and rolled them into one, resulting in you arguing against a point I didn't make.

        The "misunderstandings and lack of awareness" I was referring to is in regards to many people outside the semiconductor industry. These aspects are hurting our industry, by putting people off joining it. I was not referring to people inside the industry, nor the SemiEngineering article.

        As for salaries: See my other comments. In addition, I think it's worth acknowledging that neither hardware nor software salaries are a flat hierarchy. Senior people in different branches of software or hardware are paid vastly different amounts (e.g. foundational AI models versus programming language runtimes...). For someone looking at whether to go into software or hardware roles, I would advise them that there's plenty of money to be made in either, so pursue the one which is more interesting to them. If people are purely money-motivated, they should disappear off into the finance sector - they'll make far more money there.

        As for movement from software into hardware: I've primarily seen this with people moving into hardware verification - successfully so, and in line with what the article says too. The transfer of skills is effective, and verification roles at the kind of processor companies I've been in or adjacent to pay well, and such engineers are in high demand. I'm speaking from a UK perspective. Other territories, well, I hear EU countries and the US are in a similar situation but I don't have that data.

        Do more hardware engineers transition into software than the other way around? Yeah, for sure, but that's not the point I think anyone is arguing over. It's not "do people do this transition" (some do, most don't), rather it's:

        "We would like more people to be making this transition from SW into HW. How do we achieve that?"

        And to that I say: Let's dispel a few myths, have a data-driven conversation about compensation, and figure out what's really going to motivate people to shift. If it only came down to salary, everyone would go into finance/fintech (and an awful lot of engineering grads do...) but clearly there's more to the decision than just salary, and more to it than just market demand.

    • georgeburdell 19 hours ago
      Software pays better, which is why so many hardware people switched, including myself. In my group, which is mixed between the two, my software job classification nets me a higher bonus and easier promotions

      Edit: also never have to stay late to rework components on dozens of eval boards, and also never have to talk with manufacturers 10 timezones away

      • butterbomb 19 hours ago
        > Software pays better, which is why so many hardware people switched

        Something I noticed years ago browsing jobs in random large companies:

        hardware or anything close to hardware (firmware, driver dev, etc.) was all outsourced. Every single job I saw in that domain was in India or China/Taiwan.

        High level software jobs (e.g. node.js developer to develop the web front end for some hardware device) were still in the US.

        I’ve wondered if thats impacted why so many hardware people ran off to software.

      • waynesonfire 19 hours ago
        Maybe the grass-is-greener-on-the-other-side effect applies here, but I would find it a privilege to be in a position where I could take a pay cut and work on hardware.

        Also, I'm not convinced hardware pays less, I would just do it for less pay.

    • butterbomb 19 hours ago
      I knew a guy who was a digital verification engineer for intel. He was unceremoniously laid off and ended up taking work doing some sort of compliance at a very low paying state agency I was a developer at.

      Pretty sharp guy; we worked together a few times on problems far outside both our responsibilities/domains. I always wondered why he ended up taking that gig. Must have been horrible doing compliance work on what was likely at least a 50% pay cut.

    • RealityVoid 1 day ago
      > * The lack of open source hardware tools, workflows, high-quality examples, relative to the gross abundance of open source software, doesn’t help the situation, but I think it is more a symptom than it is a cause.

      To this, I would point to librelane/yosys/TinyTapeout/waferspace and say there are quite a few opportunities to learn, and there are OSS initiatives trying to _do stuff_ in this field. I wouldn't know how it applies to the wider industry, but the ecosystem definitely piqued my interest. I do write quite a bit of embedded systems code in my day to day though, so I've got a rough idea of what is in a chip. Would love to have the time to dive deeper.

      • throawayonthe 1 day ago
        that's all digital though right?
        • EdNutting 1 day ago
          Of the things mentioned, yes. But there’s opensource analogue stuff too. Still, even with the open source stuff that there is, it’s a hard hobby to get into from scratch. The barriers to entry are still relatively high compared to just whipping up a website or toying with a Raspberry Pi.
        • jononor 1 day ago
          You can submit analog to TinyTapeout now!
    • jvanderbot 1 day ago
      How does one pivot? It seems to me the job market demand is probably even more concentrated than the software market?
      • MrMorden 20 hours ago
        Getting an EE degree is always an option — but since CS isn’t an engineering degree getting a second bachelor’s will take four years part-time.

        I’m doing that now at ASU and the total requirement for me is 71 semester credits. Maybe I could have found a program for which I only needed 60ish, but that’s the only program in the country with part-time remote classes that will cover what I need (antennas and RF). Someone who is interested in digital design will have more options. (And I haven’t really looked at other countries so YMMV considerably outside the US.)

      • EdNutting 1 day ago
        From Software into Hardware? Your fastest route in is to learn Python and find one of the many startups hiring for CocoTB-based verification roles. Depends a bit on what country you're in - I'm happy to give recommendations for the UK!
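
        As a purely illustrative sketch of what that style of verification looks like (assuming a hypothetical DUT with clk, a, b and sum ports - not any particular company's flow), a CocoTB test is just ordinary Python:

        ```python
        # Minimal cocotb test for a hypothetical registered adder (ports: clk, a, b, sum).
        import random

        import cocotb
        from cocotb.clock import Clock
        from cocotb.triggers import RisingEdge

        @cocotb.test()
        async def adder_random_test(dut):
            # Drive a free-running 10 ns clock on the DUT's clk port.
            cocotb.start_soon(Clock(dut.clk, 10, units="ns").start())

            for _ in range(100):
                a, b = random.randint(0, 255), random.randint(0, 255)
                dut.a.value = a
                dut.b.value = b
                await RisingEdge(dut.clk)   # inputs sampled on this edge
                await RisingEdge(dut.clk)   # assume the output is registered, valid a cycle later
                assert dut.sum.value == a + b, f"{a} + {b} != {dut.sum.value}"
        ```

        The point is that the stimulus and checking all live on the software side, which is exactly why software engineers slot into verification quickly.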

        If you're feeling like learning SystemVerilog, then learn Universal Verification Methodology (UVM), to get into the verification end.

        If you want to stay in software but be involved in chip design, then you need to learn C, C++ or Rust (though really C and C++ still dominate!). Then dabble in some particular application of those languages, such as embedded software (think: Arduino), firmware (play with any microcontroller or RPi - maybe even write your own bootloader), compiler (GCC/LLVM), etc.

        The other route into software end of chip design is entry-level roles in functional or performance modelling teams, or via creating and running benchmarks. One, the other, or both. This is largely all C/C++ (and some Python, some Rust) software that models how a chip works at some abstract level. At one level, it's just high-performance software. At another, you have to start to learn something of how a chip is designed to create a realistic model.
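
        To make "functional model" concrete, here's a toy sketch (illustrative only - a made-up three-instruction ISA, nothing like a production model, and written in Python rather than the C++ you'd more likely see):

        ```python
        # Toy instruction-level functional model: executes a tiny made-up ISA.
        # Production functional/performance models are vastly larger, but the
        # core fetch/decode/execute loop has the same shape.

        def run(program, max_steps=1000):
            regs = [0] * 4              # four general-purpose registers
            pc = 0                      # program counter
            for _ in range(max_steps):
                if pc >= len(program):
                    break
                op, a, b, c = program[pc]
                if op == "addi":        # regs[a] = regs[b] + immediate c
                    regs[a] = regs[b] + c
                elif op == "add":       # regs[a] = regs[b] + regs[c]
                    regs[a] = regs[b] + regs[c]
                elif op == "beq":       # if regs[a] == regs[b], jump to instruction c
                    if regs[a] == regs[b]:
                        pc = c
                        continue
                pc += 1
            return regs

        # r1 = 1, r2 = 2, then r0 = r1 + r2 = 3
        print(run([("addi", 1, 1, 1), ("addi", 2, 2, 2), ("add", 0, 1, 2)]))
        ```

        A performance model adds timing on top of this (pipelines, caches, queues), and that's where the "learn something of how a chip is designed" part comes in.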

        And if you're really really stuck for "How on earth does a computer actually work", then feel free to check out my YouTube series that teaches 1st-year undergraduate computer architecture, along with building the same processor design in Minecraft (ye know, just for fun. All the taught material is the same!). [Shameless plug ;) ]

    • pjc50 19 hours ago
      I was going to see if I could quote some job postings from my employer to compare this, and then discovered that even the intranet jobs board does not have salary ranges posted. Sigh. Going to have to feed that back to someone.

      > Software engineers make great digital logic verification engineers. They can also gradually be trained to do design too. There are significant and valuable skill and knowledge crossovers.

      > Software engineers lack the knowledge to learn analogue design / verification, and there’s little to no knowledge-crossover.

      Yes. These are much more specific skills than HN expects: you need an EE degree or equivalent to do analogue IC design, while you do not for software.

      However I think the very specific-ness is a problem. If you train yourself in React you might not have the highest possible salary but you'll never be short of job postings. There are really not a lot of analogue designers, they have fairly low turnover, and you would need to work in specific locations. If the industry contracts you are in trouble.

    • culopatin 21 hours ago
      If there is a shortage, and engineers are trainable, are there apprenticeships available? I’d gladly move to this field.
      • EdNutting 19 hours ago
        In the UK, yes there are apprenticeships available (generally at the bigger companies like Arm) but not a huge number of them.

        The new UK Semiconductor Centre has recently been asking (among many other questions) why the industry hasn't taken up the govt apprenticeship schemes more given the lack of engineers. The answers as to why are ultimately "it's complicated".

        Your view on the salary during an apprenticeship will depend a lot on where you're coming from and your expectations. They're generally lower than UK Median Salary (for any type of job; April 2025 it was £39k) at around £30kpa, but you're being paid to learn (rather than university studies, where you spend money to learn). Also, god knows why, but the apprenticeships aren't always in the most in-demand areas (though if I had to guess, it would be because there already aren't enough employees to do the in-demand work, let alone spend some of that time training new people... which in the long-term is a disaster but we're in a short-term-thinking kind of world).

        • culopatin 15 hours ago
          I'm coming from an OK-paid job in the US, but like you said, any pay is better than paying a school, and you get real on-the-job experience, not some textbook version of reality.
      • pxtail 21 hours ago
        Nah, we don't do that here, instead ideal entry level applicant should have 5y of experience when applying.
    • epolanski 1 day ago
      Are all positions onsite for these kind of jobs?
      • EdNutting 19 hours ago
        It varies a lot by company and by role.

        Most jobs in the architecture/modelling/design/verification roles are basically like software roles (in terms of working patterns / work environment). So, fully remote, hybrid and fully on-site are all possibilities. Hybrid (1-3 days per week in-office) is probably the most common arrangement I've come across in the UK.

        If you're moving into stuff like physical design then you start to get involved in chip bring-up, in which case you need to be in a lab which you're unlikely to be able to build at home. That's when on-site starts to become a requirement.

    • hearsathought 18 hours ago
      > * Chip design pays better than software in many cases

      You are comparing the narrowest niche of hardware engineering to the broad software profession overall?

      > (US and UK included; but excluding comparisons to Finance/FinTech software, unless you happen to be in hardware for those two sectors)

      How many hardware jobs are in the finance/fintech sector? I've never met anyone working on hardware in finance, nor have I seen a job posting for one. And I doubt the highest paid hardware engineer is making remotely close to what the highest paid software engineer in finance is making.

      > but I think it is more a symptom than it is a cause.

      Or parents, industry professionals, college professors/advisors, etc advise students on future job prospects and students choose accordingly.

      • EdNutting 13 hours ago
        > You are comparing

        Following the lead of everyone else... But if you like I could compare chip design to a selected narrow niche of software that helps me make my point? It doesn't matter. The point is that "hardware doesn't pay" isn't universally true, in the same way "software pays well" is also an untrue universal statement. See my other comments for more nuance or dive into salary guides. One of my comments listed some starting points.

        > How many hardware jobs are in the finance/fintech sector?

        Quite a few in absolute terms. Not many in relative terms. Pick the view that matches what you wanted to hear.

        High Frequency Trading uses FPGAs and custom-ASICs extensively. They're even building their own fully custom data centres (from the soil testing to the chips to the software - in some cases, all done in-house). It's a secretive industry though, by nature, so you'd have to go digging to find out what Jump Trading, XTX Markets, Optiver, etc. are up to -- in London, Bristol, Cambridge, Amsterdam - to name but a few cities with these jobs. I know because I have friends doing them :-).

        > Or [...] advise students

        Yeah I would love that! But it hasn't really been working as a mechanism for a long time now. Most such people I come across have no awareness of semiconductors. At UK universities, we don't have department-specific expert career advice services, so they're useless. Parents are rarely familiar with the field (as per the general population). Professionals have minimal to no contact with students, especially anyone under 18. Professors/advisors are the best bet but that's really only going to capture students that were _already_ showing an interest.

        To be honest, I think having a few more popular US/UK/EU YouTube channels doing any kind of FPGA-based or silicon-based hardware design (i.e. not just RPi or PCB stuff) would help hugely. I've not worked out a content strategy in this space that I think will work - yet!

  • mikewarot 1 day ago
    I think there are two separate areas of concern here: hardware, and computation. I strongly believe that a Computer Science program that only includes variants of the von Neumann model of computation is severely lacking. While it's interesting to think about Turing Machines and Church numerals, etc... the practical use of FPGAs and other non-CPU-based logic should definitely be part of the modern CS education.

    The vagaries of analog electronics, RF, noise, and the rest is another matter. While it's possible that a CS graduate might have a hint of how much they don't know, it's unreasonable to expect them to cover that territory as well.

    Simple example: did you know that it's possible for two otherwise identical resistors to have more than a 20 dB difference in their noise generation? [1] I've been messing with electronics and ham radio for 50+ years, and it was news to me. I'm not sure even an EE graduate would be aware of that.

    [1] https://www.youtube.com/watch?v=omn_Lh0MLA4&t=445s

    • blibble 1 day ago
      the design of FPGAs was certainly part of my CS degree!

      they even made us use them in practical labs, and connect them up to an ARM chip

      • mikewarot 1 day ago
        I'm glad to hear that. Did you have to learn Verilog, VHDL, or something else in the FPGA programming?
  • Glyptodon 1 day ago
    I have trouble believing there's a talent shortage in the chip industry. Lots of ECE grads I know never really found jobs and moved on to other things (including SWE). Others took major detours to eventually get jobs at places like Intel.
    • KRAKRISMOTT 1 day ago
      No shortage of talent. It's just that the big players are used to cheap, near-minimum-wage Taiwanese salaries and refuse to pay the full price of an EE.
  • skoocda 9 hours ago
    I have a degree in EE (2016) and am doing mostly ML engineering with a considerable amount of SWE tasks in my day-to-day.

    Of my graduating class, very few are designing hardware. Most are writing code in one form or another. There were very few jobs available in EE that didn't underpay and lock you into an antiquated skillset, whether in renewables/MRI/nuclear/control etc.

    We had enough exposure to emerging growth areas (computer vision, reinforcement learning, GPUs) to learn useful skills, and those all had free and open source systems to study after graduation, unlike chip design.

    The company sponsoring this article is a contributor to that status quo. The complete lack of grassroots support for custom chips in North America, including a dearth of open source design tools or a community around them, has made it a complete non-starter for upskilling. Nobody graduates from an EE undergrad with real capability in the chip design field, so unless you did graduate studies, you probably just ended up learning more and more software skills.

    But the relentless off-shoring of hardware manufacturing is likely the ultimate cause. These days, most interesting EE roles I see require fluency in Mandarin.

  • realo 1 day ago
    Not all hardware is digital.

    RF design, radars, etc... are more an art than a science, in many aspects.

    I would expect a Physics-trained student to be more adaptable to that type of EE work than a CS student...

    • bob1029 1 day ago
      Not all hardware is digital but we can solve most of the hard parts in the digital domain. There's no reason to do everything analog just because it starts or ends that way.

      A lot of RF design has been reduced to arrays of dumb antennas that are wired together in software. Starlink is probably the best example of this right now.

      You still need people who can build the analog systems and engineer the nasty parts of the signal chain, but you don't need a lot of them.

    • cracki 1 day ago
      Someone with a physics background might be better prepared for the analog world than someone with a digital background.
  • david-gpu 22 hours ago
    I have an MSc in CS. While I spent half of my career writing device drivers, the other half was doing computer architecture. You could say I had a foot on the low level software side, and the other foot on the high-level hardware side. I found them to be two sides of the same coin. Understanding how hardware folks see the world took a few years, but it was very doable.

    My biggest gripe with the semiconductor industry as a career, compared to software, is twofold.

    First, it is very concentrated. If you want to make good money there are only a handful of potential employers, and thus only a handful of cities/neighborhoods where you will have to live; remote work is theoretically possible but not all employers make it effective. I found this the most frustrating. The upside is that people know this and thus tend to stay at the same employer for a long time, so you get to learn from people with a deep understanding of the product, and people are mindful to keep a pleasant work environment.

    Second, the pay isn't as good at the top end. If you have FAANG-level skills, you will typically do much better financially there than in the semiconductor industry —with the notable exception of NVidia for the past decade or so.

  • OhMeadhbh 4 hours ago
    After having reviewed multiple RISC-V core generators, I suspect it is easier to teach a computer science student to design hardware than it is to teach an electrical engineering student to design software.

    (But I am not saying either task is easy.)

  • stn8188 1 day ago
    The subheading to this article seems a little extreme: "To fill the talent gap, CS majors could be taught to design hardware, and the EE curriculum could be adapted or even shortened."

    The article is more in the area of chip design and verification than PCB hardware, so I kinda understand where it's coming from.

    • mixmastamyk 1 day ago
      Weird article, came to it hoping to see if I could train into a new job. But instead it went on and on about AI for almost the entire piece. Never learned what classes I might need to take or what the job prospects are.
    • em3rgent0rdr 1 day ago
      > "CS majors could be taught to design hardware, and the EE curriculum"

      "Electrical and Computer Engineering" (ECE) departments already exist and already have such a major: "Computer Engineering".

  • AshamedCaptain 1 day ago
    Why would they? Pay is just much lower, despite the fact that there's way more responsibility. I personally know more people who switch from hardware to software than vice versa.
    • harimau777 1 day ago
      I'd do anything short of murder to get out of software. If I could find a career that paid enough to live somewhere nice and didn't have the horrible working conditions that software does (stack rank, fake agile, unrealistic deadlines, stack rank, etc.) I'd do it in a heartbeat.
      • deaux 23 hours ago
        >stack rank, fake agile

        95% of software jobs in the world never come into contact with stack ranking or fake agile.

        > live somewhere nice

        If only the remaining 5% of software jobs allows you to "live somewhere nice", then your issue lies with your definition of "somewhere nice".

      • anon291 5 hours ago
        There is so much more to software than SaaS apps. I do compilers now for new chips. The work environment is so much better. Go for the hard problems always.
  • NoiseBert69 1 day ago
    As a computer engineer I usually copy reference schematics and board layouts from the datasheets the vendors offer. 95% of my hardware problems can be solved with it.

    Learning KiCad took me a few evenings with YT videos (greetings to Phil!).

    Soldering needs much more exercise. Soldering QFN with a stencil, paste and oven (or only pre-heater) can only be learned by failing many times.

    Having a huge stock of good components (sorted nicely with PartsDB!) lowers the barrier for starting projects dramatically.

    But as always: the better your gear gets - the more fun it becomes.

    • throwup238 1 day ago
      Even as a professional EE working on high speed digital and mixed signal designs (smartphones and motherboards), I used reference designs all the time, for almost every major part in a design. We had to rip up the schematics to fit our needs and follow manufacturer routing guidelines rather than copying the layout wholesale, but unless simulations told us otherwise we followed them religiously. When I started I was surprised how much of the industry is just doing the tedious work of footprint verification and PCB routing after copying existing designs and using calculators like the Saturn toolkit.

      The exception was cutting edge motherboards that had to be released alongside a new Intel chipset but that project had at least a dozen engineers working in shifts.

  • joezydeco 1 day ago
    UIUC CS grad from the late 80s. CS students had to take a track of electrical engineering courses. Physics E&M, intro EE, digital circuits, microprocessor/ALU design, microprocessor interfacing.... It paid off immensely in my embedded development career.

    I'm guessing this isn't part of most curricula anymore?

    • saltcured 1 day ago
      At UC Berkeley in the early-mid 90s, I think I had two digital design courses. The first was low-level basics like understanding logic gates, flip flops, Gray coding, PROM, ALUs, multiplexers, etc., with a physical project using 7400-series chips on a breadboard. The second was the whole 32-bit MIPS/SPIM pipelined CPU design and simulation project based on the Patterson and Hennessy textbook.

      But, I seem to recall there were ways to bypass most hardware background knowledge for a CS degree. You had to do intro math and physics that did classical mechanics, but you could stop short of most of the electromagnetic stuff or multivariate calculus. You could get your breadth credits in other areas like statistics, philosophy, and biology. I think you could also bypass digital design with a mix of other CS intro courses like algorithms, operating systems, compilers, graphics, database systems, and maybe AI?

    • wmichelin 1 day ago
      I had to take computer architecture. We made a 4 bit CPU... or maybe it was 8 bit. I can't remember. But it was all in a software breadboard simulator thing. LogicWorks.
    • em3rgent0rdr 1 day ago
      That curriculum is often more specifically called "Computer Engineering". CS students meanwhile usually aren't bothered by anything below the compiler.
      • joezydeco 1 day ago
        I actually started Illinois as a Computer Engineering major and switched to Computer Science because I thought I'd get to use all the cool supercomputers at the Beckman Institute. Those electrical courses were all part of my CS requirements. Illinois CS was big on architecture, having designed Illiac and all of that stuff. Hennessy/Patterson for life.

        The supercomputer thing... never happened. And I turned out to have a CE career anyway.

    • cracki 1 day ago
      Where I studied, they reduced that, at least the workload and class time, in favor of more math and informatics.

      Definitely no ALU design on the curriculum, no interfacing or busses, very little physics. They don't even put a multimeter in your hand.

      Informatics is considered a branch of logic. If you want to know how to design a computer, you should have studied EE, is their thinking.

    • alephnerd 1 day ago
      > I'm guessing this isn't part of most curricula anymore

      My sibling is a CS@UIUC grad, and they, as well as the CS+X students, were still required to do that.

      In other universities such as Cal it's a different story. Systems programming and computer architecture course requirements have either been significantly reduced or eliminated entirely in CS programs over the past decade.

      I've documented this change before on HN [0][1][2]. The CS major has been increasingly deskilled in the US.

      [0] - https://news.ycombinator.com/item?id=45413516

      [1] - https://news.ycombinator.com/item?id=45404647

      [2] - https://news.ycombinator.com/item?id=45397327

  • agg23 1 day ago
    I wasn't taught directly (and don't know what I'm doing still), but I've had a lot of fun learning about retro hardware design as a software engineer. I've made a few of my own reverse engineered designs, trying to synthesize how the real designers would have built the chip at the time, and ported others for the Analogue Pocket and MiSTer project.

    Here's an example of my implementation of the original Tamagotchi: https://news.ycombinator.com/item?id=45737872 (https://github.com/agg23/fpga-tamagotchi)

  • peteforde 1 day ago
    I had written a whole big thing that could be summarized as "yes, of course" but then I read the article and realized that it is very specifically about designing silicon, not devices.

    I understand that it makes sense for a blog called Semiconductor Engineering to be focused on semiconductor engineering, but I was caught off guard because I have been working on the reasonable assumption that "hardware designer" could be someone who... designs hardware, as in devices containing PCBs.

    In the same way that not all software developers want to build libraries and/or compilers, surely not all hardware designers want to get hired at [big chip company] to design chips.

    • EdNutting 1 day ago
      It is funny how "hardware design" is commonly used in the chip industry to describe what semiconductor design/verification engineers do. And then there's PCB designers using those same chips in their _hardware designs_.

      Also there's computer architects being like "So, are we hardware design? Interface design? Software? Something else?"...

      Meanwhile, all the mechanical engineers are looking from the outside saying "The slightest scratch and your 'hard'ware is dead. Not so 'hard' really, eh?" ;) ;)

      Every sector has its nomenclature and sometimes sectors bump into each other. SemiEngineering is very much in the chip design space.

  • assimpleaspossi 1 day ago
    I'm a hardware designer. An EE. But over the last umpteen years I've gradually switched over to software because that's where I was needed. What I've found is that I became a very good software programmer but I still lack all the fundamentals of software engineering. There are things I won't or can't use because it would require too much study for me to get good at it or even understand it.

    I would bet that a CS guy would have similar problems switching to hardware engineering.

    • mixmastamyk 1 day ago
      > There are things I won't or can't use

      Curious as to what that is?

  • anonymousiam 1 day ago
    I lived in both worlds (hardware/software) throughout my career. In school, I learned (in order): Analog electronics (including RF), Digital electronics, Microprocessors, Software, Systems. I've always thought that it's strange how few software people know hardware, and vice versa. In the software domain, when I began referencing hardware elements while explaining something, the software audience would usually just glaze over and act like they were incapable of understanding. Same goes for the hardware people when I would reference software elements.

    I learned Ada sometime around 1991. Counting assembly for various platforms, I had already learned about a dozen other languages by then, and would later learn many more.

    Sometime around 2000 I learned VHDL. In all of the material (two textbooks and numerous handouts) there was no mention of the obvious similarities to Ada. I wish somebody had just produced a textbook describing the additional features and nomenclatures that VHDL added to Ada -- That would have made learning it even easier. The obvious reason that nobody had done that is that I was among a very small minority of hardware people who already knew Ada, and it just wouldn't be useful to most people.

    In all of my work, but especially in systems integration work, I've found that my knowledge of multiple domains has really helped me outperform my peers. Having an understanding of what the computer is doing at the machine level, as well as what the software is doing (or trying to do) can make the integration work easy.

    More on-topic: I think it would be a great improvement to add some basic hardware elements to CS software courses, and to add some basic CS elements to EE courses. It would benefit everyone.

    • musicale 1 day ago
      > In all of the material (two textbooks and numerous handouts) there was no mention of the obvious similarities to Ada

      Really? That's kind of the point of VHDL, isn't it? (vs. Verilog's unholy combination of C-like syntax with begin/end blocks, etc.)

      VHDL also inherits Ada's module style, designed to have different implementations of the same thing (and verbosity, where it seems like you often have to say the same thing repeatedly, for better or for worse - more type checking at the expense of more typing at the keyboard.)

  • bsoles 1 day ago
    >> "Either we hire good CS people who have the basic understanding of EE, and we train them to become good engineers, or we hire good engineers who are good in CS, and we try to upskill them on the CS side."

    The former (CS -> EE) is much less likely to happen at a large scale than the latter (EE -> CS). It is much easier to teach EEs to become (albeit often bad) software engineers than to teach CS students to be good engineers.

    Also, the former (CS -> EE) will not happen in academia because of (1) turf wars, and (2) CS faculty not having any understanding of, or interest in, electronics/hardware/engineering.

    I once proposed to teach an IoT class in the CS department of a major university in the US; the proposal basically fell on deaf ears.

  • contubernio 1 day ago
    Is this not what electrical engineers are for?
    • anthk 1 day ago
      EE engineers design components and new materials, maybe for computers (or not); CS engineers should be able to design CPUs.
  • bee_rider 1 day ago
    Is the idea here that the code-generation apocalypse will leave us with a huge surplus of software folks? Enabling software people to go over to hardware seems to be putting the cart before the horse, otherwise.

    Hardware people go to software because it is lower-stress and can pay better (well, at least you have a higher chance of getting rich, start-ups and all that).

  • SomaticPirate 1 day ago
    Hilarious to see Cadence and Synopsys in this article. They are arguably the cause. The complete lack of open source tooling and their aggressive tooling prices are the exact reason this ecosystem continues to be an absolute dumpster fire.

    I used Vivado (from Xilinx) a bit during my undergrad in computer engineering and was constantly surprised at how much of a complete disaster the tooling chain was. Crashes that would erase all your work. Strange errors.

    I briefly worked at a few hardware companies and I was always taken aback by the poor state of the tooling, which was highly correlated with the license terms dictated by EDA tools. Software dev seemed much more interesting and portable. Working in hardware meant you would almost always be searching between Intel, Arm, AMD and maybe Nvidia if you were a rockstar.

    Software by comparison offered plentiful opportunities and a skill set that could be used at an insurance firm or any of the Fortune 100s. I've always loved hardware but the opaque datasheets and IP rules kill my interest every time.

    Also, I would argue software devs make better hardware engineers. Look at Oxide Computer. They have fixed bugs in AMD's hardware datasheets because of their insane attention to detail. Software has eaten the world and EEs should not be writing the software that brings up UEFI. We would have much more powerful hardware systems if we were able to shine a light on the inner workings of most hardware.

    • cfd456 16 hours ago
      Designing silicon requires a much greater attention to detail than designing software since the penalty for bugs or a bad design is higher
  • dilawar 1 day ago
    EE folks should design languages because they understand hardware better?!

    And CS folks should design hardware because they understand concurrency better?!

    • rbanffy 1 day ago
      I know you said it in jest, but there is a strong justification for cross-feeding the two disciplines - on one side, we might get hardware that’s easier to program and, on the other end, we might get software that’s better tuned to the hardware it runs on.
    • lawstkawz 1 day ago
      Working in EE post-BSc in EE from 99-06, it's pretty much CS + I know how to breadboard and solder if absolutely necessary.

      A whole lot of my coursework could be described as UML diagramming but using glyphs for resistors and ground.

      Robots handle much of the assembly work these days. Most of the human work is jotting down arbitrary notation to represent a loop or when to cache state (use a capacitor).

      Software engineers have come up with a whole lot of euphemistic notations for "store this value and transform it when these signals/events occur". It's more of a psychosis that long ago quit serving humanity and became a fetish for screen addicts.

  • Tade0 1 day ago
    My degree is in computer science but I studied at the faculty of electrical engineering.

    My courses didn't get into the details of semiconductor design (particularly manufacturing), but we had one on the physical principles behind this whole thing - bandgaps and all.

    We also had to design analog circuits using the Ebers-Moll transistor model, so pretty basic, but still not exactly linear.

    Overall these are very different fields but at the end of the day they both have models and systems, so you could make a student of one of them learn the other and vice versa.

    It just has to be worth the effort.

  • tensility 20 hours ago
    SICP should probably be required reading in the CS curriculum. It's a great start at understanding register hardware simulation.
  • anthk 1 day ago
    In Europe, in order to get a CS degree and be an actual "Engineer" you must be able to do so, at least at a basic level.
  • IshKebab 1 day ago
    Obviously. Hardware designers absolutely love to think that hardware design is totally different to software design and only they have the skills, but in reality it's barely different. Stuff runs in parallel. You occasionally have to know about really hardware things like timing and metastability. But the venn diagram of hardware/software design skills is pretty much two identical circles.

    The reason for the "talent shortage" (aka "talent more expensive than we'd like") is really just because hardware design is a niche field that most people a) don't need to do, b) can't access because almost all the tools are proprietary and c) can't afford, outside of tiny FPGAs.

    If Intel or AMD ever release a CPU range that comes with an eFPGA as standard that's fully documented with free tooling then you'll suddenly see a lot more talent appear as if by magic.

    • elektronika 1 day ago
      > The reason for the "talent shortage" (aka "talent more expensive than we'd like") is really just because hardware design is a niche field that most people a) don't need to do, b) can't access because almost all the tools are proprietary and c) can't afford, outside of tiny FPGAs.

      Mostly B. Even if you work in a company that does both, you'll rarely get a chance to touch the hardware as a software developer because all the EDA tools are seat-licensed, making it an expensive gamble to let someone who doesn't have domain experience take a crack at it. If you work at a verilog shop you can sneak in verilator, but the digital designers tend to push back in favor of vendor tools.

      • IshKebab 1 day ago
        > digital designers tend to push back in favor of vendor tools.

        Which is fair in my experience because Verilator has serious limitations compared to the other three - no 4-state simulation (though that is apparently coming!), no GUI, no coverage, UVM etc. UVM is utter shite tbf, and I think they are working on support for it.

        Also it's much slower than the commercial simulators in my experience. Much slower to compile designs, and runtime is on the order of 3x slower. Kind of weird because it has a reputation for being faster but I've seen this same result in at least two different companies with totally different designs.

        I gave up on Verilator support in a previous company when we ran into a plain miscompilation. There was some boolean expression that it simply compiled incorrectly. Difficult to trust with your $10m silicon order after that!

        It's definitely nice that it doesn't require any ludicrously expensive licenses though.

    • sifar 1 day ago
      >> Stuff runs in parallel. You occasionally have to know about really hardware things like timing and metastability. But the venn diagram of hardware/software design skills is pretty much two identical circles.

      I don't know your background, but this feels like from someone who hasn't worked on both the aspects for a non-trivial industry project. The thing is software spans a huge range - web FE/BE, GUI, Database, networking, os, compiler, hpc, embedded etc. Not all of them have the same background to be a good HW designer. Sure you can design HW as if you are writing software, but it won't be production worthy - not when you are pushing the boundaries.

      My work straddles both HW architecture and SW. I design processors, custom ISA optimized for SW application algorithms, and ensuring optimized micro-architecture implementation on the HW side to meet the PPA. I sit at the intersection of HW, SW and verification. People like me are rare, not just in my company but in the industry. Things fall through the gap, if you don't have someone to bridge it and then you have a sub-optimal design.

      I don't deny that SW people can learn HW design; there is nothing magical after all, just hard work and practice. But to say that the venn diagram is two identical circles is plain wrong. The cognitive load of shuttling up and down the two HW/SW stacks is a lot higher than staying within either of them.

      • IshKebab 1 day ago
        > someone who hasn't worked on both the aspects for a non-trivial industry project

        I have.

        When I say software I mean e.g. proficient C++/Rust developers. There's absolutely no reason any of them would struggle with silicon design. Yet silicon designers treat it as if it's some fundamentally different skill, like the difference between playing a piano and a trombone, rather than something more like the difference between programming GPUs and CPUs.

        • sifar 15 hours ago
          >> rather than something more like the difference between programming GPUs and CPUs.

          Again, I get your point, but you are really trivializing HW design here and I don't want anyone starting or migrating from SW to get a wrong impression that you can just pick it up. Sure, with enough thought, patience, skill and hard work anyone can do it - but that applies to anything. But don't expect that just because you know Scala or are a good parallel programmer, you can design good HW that is PPA competitive. You have a better shot than others, but that's it.

    • acuozzo 1 day ago
      > is really just because hardware design is a niche field

      Which doesn't pay as well as jobs in software do, unfortunately.

      • general1465 1 day ago
        Exactly, money is the problem. I am a hardware designer by trade. I have no problem sitting down, creating a PCB in KiCad and having it come out perfect on the first try. But I am doing this just as a hobby because it does not pay much. SWE just pays better, even with the AI scarecrow behind it.
      • IshKebab 1 day ago
        Really? In my experience in the UK it pays ~20% better. We're talking about silicon hardware design. Not PCBs.
        • acuozzo 1 day ago
          At least in the US, yes. Check out general1465's reply to me.

          The problem, I think, is that there are many competent hardware design engineers available abroad and since hardware is usually designed with very rigorous specs, tests, etc. it's easy to outsource. You can test if the hardware design engineer(s) came up with an adequate design and, if not, refuse payment or demand reimbursement, depending on how the contract is written. It's all very clear-cut and measurable.

          Software is still the "Wild West", even with LLMs. It's nebulous, fast-moving, and requires a lot of communication to get close to reaching the maintenance stage.

          • EdNutting 1 day ago
            PCB Design != Chip Design.

            The article was about chip design.

            Not trying to stop you debating the merits and shortcomings of PCB Design roles, just pointing out you may be discussing very very different jobs.

            • acuozzo 1 day ago
              I'm talking about chip design: Verilog, VHDL, et al.

              Very specifications-driven and easily tested. Very easy to outsource if you have a domestic engineer write the spec and test suite.

              Mind you, I am not talking about IP-sensitive chip design or anything novel. I am talking about iterative improvements to well-known and solved problems e.g., a next generation ADC with slightly less output ripple.

              • EdNutting 1 day ago
                Sure, so, yeah "general1465" seemed to be talking about PCB Design.

                And from what I know of SemiEngineering's focus, they're talking about chip design in the sense of processor design (like Tenstorrent, Ampere, Ventana, SiFive, Rivos, Graphcore, Arm, Intel, AMD, Nvidia, etc.) rather than the kind of IP you're referring to. Although, I think there's still an argument to be made for the skill shortage in the broader semiconductor design areas.

                Anyway, I agree with you that the commoditized IP that's incrementally improving, while very important, isn't going to pay as well as the "novel stuff" in processor design, or even in things like photonics.

              • IshKebab 19 hours ago
                > easily tested.

                Definitely not. You do normally have pretty good specifications, but the level of testing required is much higher than software.

                > Very easy to outsource

                The previous company I was in tried to outsource some directed C tests. It did not go well. It's easy to outsource but it's even easier to get worthless tests back.

                • acuozzo 6 hours ago
                  > the level of testing required is much higher than software

                  No dispute there. I suppose I meant "simply" instead of "easily".

                  Outside of aeronautics software (specifically, aviation and spaceships/NASA), the topology of the software solution space can change dramatically during development.

                  Stated differently: the cyclomatic complexity of a codebase is absurdly volatile, especially during the exploratory development stage, but even later on... things can very abruptly change.

                  AFAICT, this is not really the case with chip design. That is, the sheer amount of testing you have to do is high, but the very nature of *what you're testing* isn't changing under your feet all the time.

                  This means that the construction of a test suite can largely be front-loaded which I think of as "simple", I suppose...

    • IshKebab 1 day ago
      In fact I'll go further - in my experience people with a software background make much better hardware designers than people with an EE background because they are aware of modern software best practices. Many hardware designers are happy to hack whatever together with duct tape and glue. As a result most of the hardware industry is decades behind the software industry in many ways, e.g. still relying on hacky Perl and TCL scripts to cobble things together.

      The notable exceptions are:

      * Formal verification, which is very widely used in hardware and barely used in software (not software's fault really - there are good reasons for it).

      * What the software guys now call "deterministic system testing", which is just called "testing" in the hardware world because that's how it has always been done.

      • general1465 1 day ago
        > in my experience people with a software background make much better hardware designers than people with an EE background because they are aware of modern software best practices.

        I know them. Especially older folks. Ramming all parts onto one huge sheet instead of separating them by function. Refusing to use buses. Refusing to insert part numbers into schematics so they could just export the BoM directly, and writing the BoM by hand instead.

        Watching these guys is like watching the lowest office worker inserting values from Excel into a calculator so he can then write the result back into the same Excel table.

        • cracki 1 day ago
          Age has an effect, no matter if it's software or electronics. These types learned their trade once, some decades ago, and keep driving like that.

          If you want old dogs to learn new tricks, teach them. No company has the money to spend nor the inclination to even suggest education to their workers. Companies usually consider that a waste of time and money. I don't know why. Probably because "investing" in your work force is considered stupid because they'll fire you the moment a quarterly earnings call looks less than stellar.

          • general1465 23 hours ago
            > If you want old dogs to learn new tricks, teach them

            These guys are the epitome of arrogance. "I have been doing this for N years, you have nothing to teach me!" Then the same guy will be staring for several hours straight at a prototype board which is hard-shorted because he accidentally created a junction in his schematic. ERC (electrical rules checker) would catch it, if the guy bothered to run it...

          • IshKebab 1 day ago
            > If you want old dogs to learn new tricks, teach them.

            That's not really how our industry works - or even how it should work IMO.

            If old dogs want to keep their jobs they should teach themselves new tricks.

      • EdNutting 1 day ago
        Side note: Formal theorem proving is even more rare than formal model checking..!
      • imtringued 1 day ago
        >* Formal verification, which is very widely used in hardware and barely used in software (not software's fault really - there are good reasons for it).

        When developing with C, model checking or at least fuzzing is practically mandatory, otherwise it is negligent.

  • Joel_Mckay 1 day ago
    Hardware is artificially underpaid work, good positions are sparse in the US, and generally most engineers end up in niche coding environments.

    Most people that land a successful long career also refuse to solve some clown firm's ephemeral problems at a loss. The trend of externalizing costs onto prospective employees starts to fail in difficult fields requiring actual domain talent with $3.7m-per-seat equipment. Regulatory capture also fails in advanced areas, as large firms regress into state-sponsored thievery instead.

    Advice to students that is funny and accurate =3

    "Mike Monteiro: F*ck You, Pay Me"

    https://www.youtube.com/watch?v=jVkLVRt6c1U

  • webdevver 1 day ago
    digital circuit design strikes me as a risky gambit for a career, given that almost everyone I've bumped into in that industry was invariably not actually doing any design, but rather was tasked with writing test cases and verifying the functionality of some specific logical block.

    tests are of course very important, but the fact of the matter is, bright, smart and arrogant young engineers-to-be are very eager to show everyone how much better their version of the 'thing' is, and desperately want to write their version of the thing: they don't want to verify someone else's version of the thing.

    if we're being honest, how many people do you really need to do the design of some hardware feature? realistically the design can be done by one person.

    so you might have one lead designer, delegates each block to 10 guys, and everything else is basically 'monkey work' of writing up the state machine logic, testing it, and hooking it all up.

    and now lets count the number of companies that can put up the capital for tape-out: amd, intel, arm, nvidia, meta, aws, google chips, apple, and lets say plus 50 for fintechs, startups, and other 'smaller' orgs.

    so if you want to do design, you might be competing for... lets say 3 lead designers per org on avg, 3 * 50 = 150 silicon design spots for the entire globe. to add, a resource in such scarce supply will no doubt be heavily guarded by its occupants.

    i did this calculation back when i was still in uni. i'll never know if it paid off, or if it was even rooted in logic, but i remember thinking to myself back then: "no way in hell am i gonna let these old guys pigeonhole me into doing monkey work with a promise of future design opportunities." arrogant, yes, but i can't say i regret my decision judging from the anecdotes i get from friends in the hardware world.

    • RealityVoid 1 day ago
      > and now lets count the number of companies that can put up the capital for tape-out: amd, intel, arm, nvidia, meta, aws, google chips, apple, and lets say plus 50 for fintechs, startups, and other 'smaller' orgs.

      And basically anyone who has a job in tech can afford a tapeout [1], someone who just pulled their salary out of the ATM has enough cash in hand to do one [2], and Chinese students can do it basically for free [3]. Of course, this is for _some_ scopes of tapeout. These are older nodes and you have limited area. But you might not need anything fancy for your design.

      The rest of the post, I think, has a bunch of misunderstandings or wrong facts, but I don't work in the field (ish), so I might be as clueless as you, and I need to get back to my day job, so I won't try countering you just yet.

      [1] https://wafer.space/

      [2] https://app.tinytapeout.com/calculator?tiles=1&pcbs=1

      [3] https://ysyx.oscc.cc/docs/en/

      • cfd456 20 hours ago
        3 designers per silicon org is a ludicrous underestimate
        • RealityVoid 18 hours ago
          Yes, one of the ludicrous statements. I would invite him to take a look at OpenTitan for example and see how many designers work there on that thing.
    • webdevver 1 day ago
      some additional thoughts:

      i think that, for digital design to be interesting, the cost of entry must be lowered by probably orders upon orders of magnitude.

      the google skywaterpdk thing, whatever it is (or was?), did produce a great deal of hobbyist designs and proved that there really isn't anything special about rtl - in fact, it's really quite monotonous and boring.

      which is a good attitude to have, really. lots of hobbyist designs got cranked out quickly on what, as i understood, was a very obsolete pdk from two decades ago.

      but its fundamentally still too expensive and too limited. open source software 'blew up' because

      1. the cost of entry was free...

      2. ...for state of the art tools.

      it's not enough to be free, or open source. it also has to be competitive. llvm/gcc won the compiler world because they blew the codegen of proprietary compilers out of the water; of course, being open source it became a positive feedback loop of lots of expert eyeballs -> better compiler -> more experts look at it -> better compiler -> ...

      for digital design to become interesting, you can't trick the kids: they want the same tech the 'big boys' are using. so, what scope is there to make it economical for someone like Intel to carve out some space for a no-strings-attached digital design lottery?

      i get the impression that, unlike for most manufacturing processes, the costs of silicon digital electronics increases every year, and the amortisation schedule becomes bigger, not smaller.

      so if anything, it seems that the more high tech silicon manufacturing becomes, the smaller the pool of players (who have the ever-increasing capital expenditure necessary) becomes, which should indicate that the opportunities for digital design work are actually going to be shrinking as time goes on.