FPGA Future: Insights from Expert Graeme Smecher

Niranjana R


FPGA Insights has engaged in an exclusive interview with Graeme Smecher.


Q1) Can you provide an overview of your organization and the services/products it offers?

I wear two hats these days. Three-Speed Logic is my consultancy, where I do instrumentation work for hire. Over the past decade, I’ve focused on readout for superconducting detectors (TES bolometers and MKIDs).

At heart, that means software-defined radio (SDR) and digital signal processing (DSP) algorithms, surrounded by a mixture of control plane, networking, embedded Linux, Python, and hardware/schematic design, plus commissioning and support.

I’ve also worked on some time-of-flight imaging and radiation-hardening projects and tinkered with yet another RISC-V core (called Minimax) that excels in its own little size/performance niche.

More recently, I’ve founded a start-up (t0.technology) with two brilliant colleagues. We’re bringing expertise gained from the astrophysics world to nearby applications like infrared and THz imaging, quantum computing, and space domain awareness.

These applications rely on converging technology, but their designers speak different languages and haven't fully taken advantage of the common ground they share. We've been working on a hardware platform that should scale bigger and perform better than commercial offerings, and a firmware offering aimed at highly multiplexed detector arrays.

Q2) Can you explain the benefits of using FPGAs over other types of processors?

I hate to give a negative answer, but the economics of FPGAs are often enough to settle any competitive questions. My usual guidance is: “Don’t use an FPGA unless you have to.” The hassle and expense are a competitive disadvantage that’s hard to overcome.

On the other hand: FPGAs are unrivaled when there’s too much pin-wiggling or parallel processing for an MCU, and not enough demand to justify a custom ASIC. It’s a durable sweet spot, and instrumentation is my “happy place” within it. Within these constraints, FPGAs can work miracles and I love testing their limits.

Q3) What are the most significant trends observed in the FPGA industry over the past year? How will these trends shape the industry’s future?

The pandemic supply-chain crisis is mostly over now. The return to normal lead times, plus the arrival of “exotic” heterogeneous devices (like Versal AI Edge, or the revival of Xilinx’s space-qualified devices roadmap), feels a bit like rainfall after a long drought. For FPGA vendors, the next few years will determine which devices fail and which succeed. For developers, new devices are a great opportunity to be ambitious.

I’m also excited about the growth I see within the FPGA community, both in size and diversity. We need to welcome newcomers, especially from underrepresented groups. We need to ensure their first exposure to the FPGA community is free from bigotry, or they may go somewhere else – to our detriment.

Q4) How do you see FPGA development evolving to meet the demands of modern applications and complex workloads?

FPGAs aren’t just FPGAs anymore: the shift to heterogeneous devices means next year’s silicon isn’t just bigger than this year’s, it’s more complex. The dividing line between software, firmware, and hardware is already fuzzy, and only getting fuzzier.

This is actually a positive force for us. It’s no longer reasonable for vendors to develop and ship tools that are entirely siloed from each other. As we move past one- or two-language designs to so-called “polyglot” designs, it’s no longer acceptable to license tooling that extracts more cash for every language or feature.

Finally, getting on my soapbox: I hope to see some tools flat-out decommodified – no longer fed like a parking meter. In the software world, we used to pay good money for bad compilers. Now, excellent compilers are free. In the FPGA world, I hope simulators follow the same trajectory.

I hope these forces improve how our tools compose and interoperate. I hope they fuel stronger open standards and quicker adoption. Here’s one example from the vendor front and one from the community front:

Xilinx/AMD now supports VHDL-2019 in their synthesis flows. This is startlingly fast for an EDA vendor, and a huge change from their glacial support for VHDL-2008. I hope simulation support follows quickly.

I’m optimistic about projects like CIRCT, which aims to replace Verilog as a machine-generated and machine-consumed “middle” representation for logic. LLVM was transformational for the software world, and an expansion into HDL/HLS domains is exciting.

It looks like the “language wars” are finally receding into a polyglot stew, and I’m excited to see where it leads.

Q5) Sectors that stand to benefit the most from FPGA integration, and why?

The obvious growth markets are data center, automotive, and AI – fields I don’t work in directly. I’ll give you my perspective from the instrumentation side of things.

We have always used FPGAs, since they’re essential parts of our synthesis and signal-capture chains. However, each new generation of FPGAs has “eaten” more of our designs: we used to pair standalone FPGAs with standalone ARM SoCs – now, a single SoC FPGA has an integrated ARM core. We used to use separate JESD204B data converters – now, we use an RFSoC with integrated converters.

This “FPGA eats world” effect has been unambiguously good for the FPGA vendors, who capture more dollars from the BOM. It’s also good for designers and users: hardware designs get simpler, and the resulting boards are more versatile, more reusable, and less risky. The power savings matter too, since power directly affects reliability and (for large deployments) operating cost.

This process will continue as AI cores move into the picture: an incredible correlator experiment my colleagues designed currently uses FPGA crates for data acquisition and GPU clusters for processing. The prospect of replacing these competing types of silicon with a single, consistent architecture is compelling for both programmatic and technical reasons. And, even if GPUs are still part of these deployments – more competition typically means better affordability.

Finally, I’ve already mentioned the RFSoC several times. Below 5 GHz, the differences between an oscilloscope, a vector network analyzer (VNA), a spectrum analyzer, and a phased array are becoming increasingly hard to spot from the hardware alone.

RFSoCs are the next logical step in this process. From my perspective, all-digital implementations of formerly analog devices are a wonderful opportunity to challenge incumbents.

Q6) The role of FPGAs in accelerating AI applications and advancements expected in the near future.

To set this up: AI and data-center applications need competition, and GPU and FPGA approaches need to challenge each other on pricing and tooling. Of course, all of this is contingent on the AI space remaining a magnet for investment.

I hope the FPGA push into the data center and AI applications is successful, for purely selfish reasons. A successful entry of FPGA-like devices in the data center would sustain an ecosystem of new devices that can be used in relatively small volumes elsewhere.

I like to use the analogy of “whale fall”: a whale carcass that sinks to the sea floor and nourishes an entire ecosystem of slimy or spiny scavengers. AI Edge devices create a particularly interesting “whale fall” because they are price-sensitive.

If vendors want these devices in every vehicle, at every urban intersection, and in every toll booth, everyone else who uses these devices will benefit from the price push that’s necessary to make it work.

Q7) Ensuring the security and integrity of FPGA designs, especially in sensitive applications like finance and defense.

I’m going to duck this question. It’s difficult to genuinely secure the contents of a device the user physically has in their possession. Reverse engineering is often better controlled through licensing provisions or other non-technical means. We follow security best practices to protect source code and associated IP, but do not protect synthesized bitstreams and compiled object code against reverse engineering in our deployed physical devices.

For us, an open and inspectable platform is far more valuable than a device that has been locked down. For example, I routinely deploy firmware with SSH support (including root access) enabled, and have found it to be an invaluable tool when supporting a system with a long deployment cycle – even if it lets people “inside the box”. Open, transparent collaboration is a superpower and I’ve never understood the desire to be unnecessarily secretive or controlling. It’s actually one of our best competitive advantages against larger incumbents.

Q8) Advice for students and professionals interested in pursuing a career in FPGA development to stay updated with the latest trends and technologies.

Instrumentation is a wonderful, challenging career option that nobody seems to talk about. I was five years into my consultancy before the light bulb went on: “Aha, that’s what I do!”

For newcomers: don’t be afraid to reach out to professionals (who, after all, had to start somewhere). And don’t believe everything they tell you – sometimes, you need to learn by burning your own fingers.

