With the limited information available, this doesn't sound like anything special. It's been well known for decades that CPUs are pretty bad at parallel workloads. GPUs are better, but still quite bad. The fact that this kind of parallelism maximizer is being compared against CPUs and GPUs pretty much tells you everything.
Against GPUs this may be OK. Against ASICs, its only advantages are probably the ability to use standard compilers and/or to be used for multiple purposes.
From the limited information, this sounds like a programmable, ASIC-like chip. It will probably have some uses, but because dedicated ASICs will still be more efficient, there is too much hype right now.