checker659
I think FPGAs (or CGRAs really) will make a comeback once LLMs can directly generate FPGA bitstreams.
No need. I gave ChatGPT this prompt: "Write a data mover in Xilinx HLS with Vitis flow that takes in a stream of bytes, swaps pairs of bytes, then streams the bytes out"
And it did a good job. The code it made probably works fine and will run on most Xilinx FPGAs.
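For the curious, the kind of kernel that prompt yields looks roughly like the sketch below (untested; assumes the usual hls_stream.h / ap_axi_sdata.h headers from the Vitis HLS libraries, packets with an even byte count, and the function name is mine, not ChatGPT's):

    #include "ap_axi_sdata.h"
    #include "hls_stream.h"

    typedef ap_axiu<8, 0, 0, 0> byte_t;   // one 8-bit AXI-Stream beat, with TLAST

    // Free-running data mover: read bytes two at a time, write them back swapped.
    void byte_pair_swapper(hls::stream<byte_t> &in, hls::stream<byte_t> &out) {
    #pragma HLS INTERFACE axis port=in
    #pragma HLS INTERFACE axis port=out
    #pragma HLS INTERFACE ap_ctrl_none port=return

        while (true) {
    #pragma HLS PIPELINE II=2
            byte_t first  = in.read();
            byte_t second = in.read();
            out.write(second);           // emit the pair in swapped order
            out.write(first);
            if (second.last) break;      // stop at end of packet (assumes even length)
        }
    }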
> The code it made probably works fine
Solve your silicon verification workflow with this one weird trick: "looks good to me"!
What does "directly generate FPGA bitstreams" mean?
Placement and routing are NP-complete problems.
And I certainly can't see how a language model would be of any use for a problem that doesn't involve language.
They are "okay" at generating RTL, but are likely never going to be able to generate actual bitstreams without some classical implementation flow in there.
I think in theory, given terabytes of bitstreams, you might be able to get an LLM to output valid designs. Excepting hardened IP blocks, a bitstream is literally a sequence of SRAM configuration bits that set the routing tables and LUTs. Given the right type of positional encoding, I think you could maybe get simple designs working at a small scale.
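To make "configuration bits for LUTs" concrete: a 4-input LUT is just 16 SRAM bits holding the truth table of the function it implements, so the model would effectively have to emit millions of words like this INIT value, in the right order and consistent with all the routing bits around them. A rough plain-C++ illustration (the helper name is mine):

    #include <cstdint>
    #include <cstdio>

    // A 4-input LUT is configured by 16 SRAM bits: one output bit per
    // input combination, i.e. the truth table stored directly in hardware.
    // Hypothetical helper: pack a 4-input boolean function into that word.
    template <typename F>
    uint16_t lut4_init(F f) {
        uint16_t init = 0;
        for (int addr = 0; addr < 16; ++addr) {
            bool i0 = addr & 1, i1 = addr & 2, i2 = addr & 4, i3 = addr & 8;
            if (f(i0, i1, i2, i3))
                init |= uint16_t(1u << addr);  // output bit at this truth-table address
        }
        return init;
    }

    int main() {
        // Example: (I0 AND I1) XOR I2, ignoring I3.
        unsigned init = lut4_init([](bool a, bool b, bool c, bool) { return (a && b) != c; });
        std::printf("INIT = 0x%04X\n", init);
    }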