amiga5k, plasma doesn't hold up well long-term with static images (burn-in), so I doubt we'll ever see plasma computer screens
but there's one other thing people seem to forget: it's not just interlacing or refresh rate, it's also 'fade' that counts. Older televisions and computer monitors were interlaced, true, but their 'pixels' (for lack of a better word) also 'ghosted', or faded, over a longer time, 'smearing' the image and making the (moving) interlaced picture look even smoother
computer screens 'ghost' a lot less, which makes flicker more visible at lower frequencies but improves things (less ghosting) at higher frequencies
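to put some rough numbers on the 'fade' idea: you can model a pixel's persistence as an exponential decay and sum up the afterglow from earlier fields. this is just a back-of-the-envelope sketch (the decay constant `tau` and the 60 Hz field rate are my own assumptions, not measurements of any real phosphor):

```python
import math

def perceived_intensity(field_values, tau, field_period=1 / 60):
    """Brightness seen at the moment the newest field is drawn,
    summing exponentially decayed afterglow from all earlier fields.
    field_values is oldest-first; tau is the (assumed) decay constant
    in seconds. A larger tau means slower fade, i.e. more 'ghosting'
    that blends consecutive interlaced fields together."""
    total = 0.0
    for age, value in enumerate(reversed(field_values)):
        total += value * math.exp(-age * field_period / tau)
    return total

# A pixel on a scanline that is only refreshed every other field,
# as in interlaced video (1 = lit field, 0 = skipped field),
# sampled at a moment when its line was NOT just redrawn:
fields = [1.0, 0.0, 1.0, 0.0]

# Fast-fading pixel (little ghosting): the glow is almost gone
# between refreshes, so the pixel visibly flickers.
fast = perceived_intensity(fields, tau=0.001)

# Slow-fading pixel (long 'ghost'): the previous bright fields still
# contribute, so the gaps are filled in and the image looks smoother.
slow = perceived_intensity(fields, tau=0.05)
```

with these made-up numbers the slow-fade pixel keeps well over half its brightness between refreshes while the fast-fade one drops to nearly zero, which is exactly the flicker-vs-smoothness trade-off described above.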
mmm... I might have to brush up on my English though, this is hard to explain...