AFAIK, or as far as I can figure anyway, ATI introduced the noticeable time delay in MMC to better implement the same sort of pause/commercial-skip features you find on many cable boxes and/or DVRs. The theater cards were a different story... you couldn’t find a review of the then-new Theater 550, for example, without a large section devoted to complaints about the performance of the ATI-branded CyberLink software. MMC, on the other hand, sold a bunch of AIWs for them. Unfortunately the result, IMHO, was a rather un-delightful Rube Goldberg contraption of sorts, and MMC was eventually abandoned after absorbing a lot of ATI resources.
At any rate, I’m fairly certain any live-TV delay is a software and driver issue... Hardware affects it by making things easier or harder to code, and the more functions performed in hardware, the less the software (and the CPU) have to do, but whether or not there’s a delay is decided by whoever designs the TV software. And to a very, very large extent, the software designers and developers also determine the load on the PC itself. The non-Avivo AIWs were/are a great example themselves: under 10% CPU usage capturing MPEG-2 in older versions of MMC vs. 20-30% in the final versions [and since you couldn’t turn off capture in the last versions of MMC without mods, that’s not even comparing plain TV-watching to plain TV-watching].
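To illustrate why the delay is a design decision rather than something the hardware forces on you, here’s a minimal sketch of a time-shift pipeline. Everything here is hypothetical - none of these names come from MMC’s actual code - the point is just that the playback cursor lags the capture cursor by however many frames the software designer chooses, and choosing zero gives you live passthrough:

```python
from collections import deque

class TimeShiftPipeline:
    """Toy model of a software TV pipeline (hypothetical, not MMC's design).

    Captured frames go into a buffer; playback reads from the other end.
    The delay the viewer sees is simply the buffer depth the designer
    chose -- zero depth means live passthrough, a deep buffer enables
    pause and commercial skip at the cost of latency.
    """

    def __init__(self, delay_frames: int):
        self.delay_frames = delay_frames
        self.buffer = deque()

    def on_capture(self, frame):
        """Called by the capture driver for every incoming frame."""
        self.buffer.append(frame)

    def next_frame_to_display(self):
        """Called by the renderer; returns None until the buffer has
        filled to the configured delay."""
        if len(self.buffer) > self.delay_frames:
            return self.buffer.popleft()
        return None

# delay_frames=0: every captured frame displays immediately (no delay).
# delay_frames=90 (~3 s at 30 fps): the DVR-style cushion that makes
# pause/skip trivial to implement -- at the cost of a visible lag.
live = TimeShiftPipeline(delay_frames=0)
dvr = TimeShiftPipeline(delay_frames=90)
```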
How the new AIWs perform, then, depends a LOT on ATI’s software people. A new AIW design would be much closer to the theater TV cards/devices, so developing one-size-fits-all software should be much easier, but the question is whether ATI would ever do it. There’s the cost involved - when MMC was developed, the AIWs were a lot more expensive and didn’t face loads of cheap competition. And there’s consumer demand - would it make a positive difference to the majority of potential customers? Features like zero TV delay would also depend on demand, I’d think.
To me, with AMD in the picture, it might make more sense to support and encourage something like the MythTV project rather than licensing CyberLink or similar, but who knows? RE: HTPCs, for years AMD and Intel have been trying to drum up interest in motherboard chipsets with on-board graphics, which are more ideal for HDMI and HTPCs... It likely wouldn’t take much to implement tuner daughter cards [PAL/NTSC], and if by contributing code etc. they could make sure a cross-platform solution like Myth worked, it could make a difference in the market.
That sort of set-up would also be better suited, I think, for this sort of thing:
“With a digital tuner shouldn't a multi-plex of sorts be available? 1 physical input, multiple signals to tuners, so you could have reams of streams”
Every time you split the signal you lose strength (a typical 2-way splitter costs you around 3.5 dB per output), and inside the case you have an opportunity for more interference from the PC’s other electronics. Rather than trying to fit sufficient shielding and maybe even signal amplification on a card where real estate is expensive, you can shift it elsewhere: anything from hardware attached to a card opening at the rear of the case, to an enclosure fitting in a drive bay, to a regular break-out box. It’s not bandwidth that makes separate cards necessary today - it’s the real estate taken up by bulky, off-the-shelf tuners and connectors. And completely external hardware suffers from bandwidth restrictions and from being inherently dumb. Connecting directly to the motherboard’s graphics circuits could solve both problems.
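On the multiplex point, the questioner is right in principle: a digital broadcast (ATSC/DVB) transport stream already carries multiple programs, so one tuner locked to one physical channel can feed several “virtual tuners” purely in software by filtering packets by PID - no RF splitting, no signal loss. Here’s a rough sketch of the idea; the file name is made up for illustration, and a real demuxer does more (PAT/PMT parsing, resync on corrupt packets, etc.):

```python
TS_PACKET_SIZE = 188  # MPEG transport stream packets are always 188 bytes

def demux_by_pid(ts_bytes: bytes) -> dict[int, bytearray]:
    """Split one transport stream capture into per-PID packet streams.

    One physical input (the tuned multiplex) fans out into as many
    elementary streams as the broadcaster packed into it, entirely
    in software.
    """
    streams: dict[int, bytearray] = {}
    for offset in range(0, len(ts_bytes) - TS_PACKET_SIZE + 1, TS_PACKET_SIZE):
        packet = ts_bytes[offset:offset + TS_PACKET_SIZE]
        if packet[0] != 0x47:          # every TS packet starts with sync byte 0x47
            continue                   # (a real demuxer would resynchronize here)
        # The 13-bit PID spans the low 5 bits of byte 1 and all of byte 2.
        pid = ((packet[1] & 0x1F) << 8) | packet[2]
        streams.setdefault(pid, bytearray()).extend(packet)
    return streams

# Hypothetical usage: one capture from one tuner, many streams out.
with open("multiplex_capture.ts", "rb") as f:   # made-up file name
    per_pid = demux_by_pid(f.read())
print(f"{len(per_pid)} distinct PIDs carried on one physical channel")
```

Which is why, for digital at least, the bottleneck really is the tuner/connector real estate and the software, not the number of cards you can cram in.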