The target market seems to be people who want to integrate ML into a robot or RC car. It offers decent inference performance with comparatively low power draw, but it can't do training.
Unless you're building something with space and/or power constraints, you're much better off with a laptop or desktop.
I have to say, this sounds like quite a niche market they're targeting here.
IoT and embedded computing aren't a niche market. Most households have more embedded and mobile devices than desktop/laptop computers.
It's a variant of their development board for their AI accelerator chip for embedded setups (think quality control, licence plate detection, analysing visitors via a webcam, etc.). This variant is probably just to make it more attractive to hobbyists, increasing mindshare and community size. It doesn't need a huge market, and maybe people will come up with cool ideas nobody has thought of yet.
Does anyone make Beowulf clusters of these? I could see that being more cost-effective if your workload exceeds what even higher-end GPUs can do.