Not an expert, but the dpi you want to scan at depends on the size of the image you're scanning and the size you want it on screen. For example, if you want an image to fill the entire screen (1366x768 pixels), then for best quality you want the scan to have at least 1366x768 pixels. If the original image is 4 inches by 2.2 inches, that means scanning at at least 342 dpi for the width (and about 350 dpi to also cover the 768-pixel height, since 768 / 2.2 ≈ 349). Once the image is on your Windows machine you can change its dpi to 72 or 96 so it fits the screen. Windows doesn't know the physical size of the screen you're using; you simply want the image to occupy 1366x768 pixels, and the handiest way of doing this is to set the dpi to 72 (I think) and resize the image to whatever number of pixels you desire (1366x768).
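To put rough numbers on that, here's a small Python sketch of the arithmetic (the screen and photo sizes are just the figures from the example above; the function name is my own):

```python
import math

def required_scan_dpi(target_px_w, target_px_h, original_in_w, original_in_h):
    """Minimum scan dpi so the scan has at least the target pixel dimensions."""
    dpi_for_width = target_px_w / original_in_w
    dpi_for_height = target_px_h / original_in_h
    # Take the larger of the two so BOTH dimensions reach the target.
    return math.ceil(max(dpi_for_width, dpi_for_height))

# 1366x768 screen, 4" x 2.2" original photo (the example from the text)
print(required_scan_dpi(1366, 768, 4.0, 2.2))  # 350
```

The width alone only needs 1366 / 4 ≈ 342 dpi, but the height pushes the minimum up to 350; scanning a bit higher than you need never hurts, since you can always downscale.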
This is the same for web development: you usually scan at a higher dpi, then change the image's dpi to 72 and resize it to the size you want on screen. Note that changing the dpi has no effect on image quality; it only changes the size of the image when printed. For display, however, the dpi is treated as 72 or 96 no matter what the physical size of the display is: an image taking up half the screen on my old 14" monitor also takes up half the screen on my 21" monitor when the resolutions are the same.
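You can see that the dpi tag is just printing metadata with a little arithmetic: the same fixed pixel grid comes out at different paper sizes depending on the dpi, while the pixels themselves never change (a sketch with an illustrative helper name):

```python
def printed_size_inches(px_w, px_h, dpi):
    """Printed dimensions of a fixed-pixel image at a given dpi setting."""
    return (px_w / dpi, px_h / dpi)

# The same 1366x768 pixels, merely tagged with different dpi values:
for dpi in (72, 150, 300):
    w, h = printed_size_inches(1366, 768, dpi)
    print(f'{dpi} dpi -> {w:.1f}" x {h:.1f}" on paper (pixels unchanged)')
```

Higher dpi means a smaller, denser print; on screen the pixel count is all that matters.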
For equivalent printed image sizes, more detail requires a higher dpi. On a screen (unlike printing on paper) you cannot fit in any more dots than the screen was built with, so it can be easier to change the dpi to 72 and resize the image to the resolution you require on screen, i.e. 1366x768 or whatever.
Note also that for web development you want the smallest image sizes because of the limited bandwidth, but for a dedicated display connected to a local machine you can have larger images and simply resize them at run time in whatever application you're using.
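If that application happens to be a Python script, the run-time resize is one call with Pillow (a sketch, not a recommendation of any particular library; the red placeholder image stands in for a hypothetical high-dpi scan):

```python
from PIL import Image

# Hypothetical stand-in for a large high-dpi scan (pure red placeholder)
scan = Image.new("RGB", (2800, 1540), "red")

# Downscale at run time to the display resolution
display_img = scan.resize((1366, 768), Image.LANCZOS)
print(display_img.size)  # (1366, 768)
```

For a real scan you'd open the file with `Image.open(...)` instead; the point is that keeping the big original around costs nothing locally, and you scale it to the screen when you show it.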
In this post I'm assuming that you'll have your computer's resolution matched to the plasma's. It's much easier when dealing with computer displays to forget about dpi and simply work with pixels or percentages of the screen resolution.