
HansHamster
One common student project we had used the FPGA to generate a VGA video signal, for example using the onboard ADC to sample a signal and visualise the waveforms. A more advanced idea was to also implement a line-drawing algorithm on the FPGA to generate wireframe graphics. While this can also be done on a microcontroller, and some even include video outputs and GPUs, I think it is a nice way to see at a low level how to generate the signals with the correct timing. I used this, for example, to add a video output to a Game Boy.
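The "correct timing" part boils down to two counters. As a rough sketch (in Python rather than HDL, purely to illustrate the counter logic), here is the standard 640x480@60 Hz timing, where each line and frame is visible area, front porch, sync pulse, then back porch; the constants are the well-known VESA values, and the function names are made up for the example:

```python
# Standard 640x480@60 Hz VGA timing: visible, front porch, sync, back porch.
H_VISIBLE, H_FRONT, H_SYNC, H_BACK = 640, 16, 96, 48
V_VISIBLE, V_FRONT, V_SYNC, V_BACK = 480, 10, 2, 33
H_TOTAL = H_VISIBLE + H_FRONT + H_SYNC + H_BACK   # 800 pixel clocks per line
V_TOTAL = V_VISIBLE + V_FRONT + V_SYNC + V_BACK   # 525 lines per frame

def vga_signals(x, y):
    """Return (hsync, vsync, data_enable) for pixel-counter position (x, y).
    Sync pulses are active-low, as in standard VGA."""
    hsync = not (H_VISIBLE + H_FRONT <= x < H_VISIBLE + H_FRONT + H_SYNC)
    vsync = not (V_VISIBLE + V_FRONT <= y < V_VISIBLE + V_FRONT + V_SYNC)
    de = x < H_VISIBLE and y < V_VISIBLE  # drive RGB only in the visible area
    return hsync, vsync, de
```

In hardware this is one x counter wrapping at 800 and one y counter wrapping at 525, ticked by a ~25.175 MHz pixel clock (800 × 525 × 60 Hz).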

Another, somewhat more exotic and involved application is a Time-to-Digital Converter, which can take advantage of the low-level routing inside the FPGA to sample a digital signal with significantly higher precision than the clock (resolutions of tens of picoseconds down to below 10 ps, depending on the FPGA).
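The usual trick is a tapped delay line built from the FPGA's fast carry chain. A toy model of the idea (numbers and names are illustrative, not from any particular FPGA family): the edge propagates through a chain of delay elements, a register samples every tap on the next clock edge, and the resulting thermometer code locates the edge between clock edges with roughly one tap delay of resolution.

```python
# Toy model of a tapped-delay-line TDC. Tap delay is an assumed 20 ps.
TAP_DELAY_PS = 20   # assumed per-element propagation delay
N_TAPS = 64         # chain covers 64 * 20 ps = 1.28 ns

def sample_delay_line(edge_time_ps, sample_time_ps):
    """Thermometer code at the sampling clock edge: tap i has gone high
    iff the edge has propagated through i+1 delay elements by then."""
    return [1 if edge_time_ps + (i + 1) * TAP_DELAY_PS <= sample_time_ps else 0
            for i in range(N_TAPS)]

def decode(taps):
    """Elapsed time from edge to sampling clock, quantized to one tap."""
    return sum(taps) * TAP_DELAY_PS
```

Real designs then need calibration, since the per-tap delay varies with placement, voltage and temperature.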

For work, we mostly use FPGAs for data acquisition systems, low-level data processing, high-speed data links, and so on.


dragontamer
Alas, modern embedded screens (ex: NewhavenDisplays) are either SPI (for small screens) or "8080-protocol" (an 8080-bus-like protocol) on the faster/larger screens, and both are fairly easy to implement with bit-banging. So VGA is somewhat out of date for a hobbyist; the market has moved on from it in practice.

> Another, somewhat more exotic and involved application is a Time-to-Digital Converter, which can take advantage of the low-level routing inside the FPGA to sample a digital signal with significantly higher precision than the clock (resolutions of tens of picoseconds down to below 10 ps, depending on the FPGA).

That certainly sounds doable, and actually not too difficult to think about. But as you mentioned, it's exotic. I don't think many people need picosecond-resolution timing, lol.

Still, the timing idea is overall correct as an FPGA superpower. While picosecond resolution is stupidly exotic, I think even single-digit-nanosecond timing is well within a hobbyist's possible day-to-day. (Ex: a 20 MHz clock period is just 50 nanoseconds, and bit-stuffing so that you pass 4 bits of info / 16 time slots per clock tick means needing to accurately measure signals at the 3.125 ns level...) This is neither exotic nor complicated anymore, and is "just" a simple 80 Mbit/s encoding scheme that probably has real applicability as a custom low-power protocol.

And it's so simple that it'd only take a few dozen or so LUTs of an FPGA to accurately encode/decode.

Ex: 0000 is encoded with a 0ns phase delay off the master clock.

0001 is encoded as 3.125ns phase delay off the clock.

0010 is encoded as 6.25ns phase delay off the clock.

... (etc. etc.)

1111 is encoded as 46.875ns phase delay off the master clock.

HansHamster OP
Yes, VGA is really not very useful nowadays, but I think it is still a worthwhile (student) project for FPGA beginners: it is relatively easy to implement, more exciting than blinking an LED, and can be built on for other things.

The downside of SPI (and, to some degree, 8080) screens is the low refresh rate / missing vsync. There are also screens with an RGB interface, which is again similar to VGA but digital. But yes, this does not really require an FPGA, and an ARM controller with an RGB interface is probably much more useful for most applications. (Or even MIPI DSI, but I have not used it myself so far.)

Still, I have a TFP410 lying around that I wanted to strap to my FPGA at some point to get something better than VGA.

> Still, the timing idea is overall correct as an FPGA-superpower.

And while this is especially true on FPGAs with dedicated hardware like a SerDes or gearbox, one can still squeeze out a bit more on most FPGAs with DDR I/O or several phase-shifted clocks.
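The phase-shifted-clock trick is easy to see with numbers. As an illustrative sketch (Python, made-up names, not tied to any FPGA family): four copies of a 100 MHz clock (10 ns period) shifted by 0 / 2.5 / 5 / 7.5 ns yield a sample every 2.5 ns, so an edge can be located to within one quarter of the clock period without running any logic faster:

```python
# Oversampling with N phase-shifted copies of one clock.
PERIOD_NS = 10.0  # assumed 100 MHz clock
PHASES = 4        # clock copies shifted by PERIOD_NS / PHASES each

def sample_points(n_periods):
    """Sampling instants produced by PHASES phase-shifted clocks."""
    step = PERIOD_NS / PHASES  # 2.5 ns effective resolution
    return [k * step for k in range(n_periods * PHASES)]

def locate_edge(edge_ns, samples):
    """First sampling instant at or after the edge: the edge position
    is then known to within one step (here 2.5 ns)."""
    return min(s for s in samples if s >= edge_ns)
```

DDR I/O gives a similar factor of two by sampling on both clock edges, and the two techniques can be combined.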
