VGA monochrome converter

In addition to the DIP switches, there are four buttons to allow changing the image position. You may still need to adjust your monitor for optimum settings. The reset button can change the phase of the incoming video signal to optimize output.

If you see a glitch in the image, pressing this button may help. DIP 4 only applies in Monochrome mode.

That's more a thing for the Windows control panel, or maybe the software you're using. E.g., I don't know if you can still do it, but Word used to have a sub-setting in its "WordPerfect compatibility mode" options that would turn everything to white text on a blue background regardless of the actual page or text colour, sort of like using the DOS version of WP or Edit, or pre-graphical Word, which I found rather more soothing on the eyes when using an old laptop or CRT (sure, it's still blue, but it wasn't super bright, and far better than black on bright white).

Beyond that, there are probably specific visual accessibility tools that would give you what you want, but I'm not really familiar with any of them.

I would like to connect this computer to a composite video (1 V peak-to-peak) CCTV monitor. Can anyone give me the right answer or sell me a lead? Stephen in UK.

TTL, if I remember rightly, is 5 V output. So, get a CGA colour cable and cut the end off. Ditto whatever plugs into your CCTV monitor (I'm assuming separate sync here; if not, that's another conversation, and essentially you're going to need some kind of modulator, or at least a sync combiner).

Syncs and ground run to each other. On the video side, tie the colour lines together through resistors into the monitor's single video input. Wrap the whole sorry affair in heatshrink and duct tape and off you go. That'll give whatever kind of result IBM originally managed, I would expect.

Very limited and muddled range of grey levels output, with a lot of duplication, as, e.g., quite different colours end up summing to the same level. If you want to be more sophisticated, then each line should have a different peak voltage, which all together add up to 1 V.

I've no idea on the exact figures or the necessary resistances, but I'd suggest green as the highest peak value, then red, followed by blue. Might have to play with it and see what looks most logical, especially when putting black, dark grey, light grey, and white next to each other.
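To see why the weights matter, here's a minimal sketch (Python; the millivolt weights are made-up illustrations, not measured resistor values) counting how many distinct grey levels the 16 RGBI colours collapse to under equal versus green-heavy mixing:

```python
# Model of summing the RGBI lines into a single composite video level.
# The weights are in millivolts and are assumptions for illustration only;
# the real resistor values would have to be found by experiment.

# All 16 RGBI colours as (red, green, blue, intensity) bits.
rgbi = [(r, g, b, i) for i in (0, 1) for r in (0, 1)
        for g in (0, 1) for b in (0, 1)]

def level_mv(colour, weights):
    """Summed voltage (mV) when each active line contributes its weight."""
    return sum(bit * w for bit, w in zip(colour, weights))

# Naive mix: equal resistors on R, G and B -> heavy duplication.
equal = {level_mv(c, (300, 300, 300, 100)) for c in rgbi}

# Green-heavy mix (green > red > blue, as suggested above), still summing
# to 1000 mV with every line driven high.
weighted = {level_mv(c, (270, 450, 100, 180)) for c in rgbi}

print(len(equal), "distinct levels with equal weights")       # prints 8
print(len(weighted), "distinct levels with unequal weights")  # prints 14
```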

Don't know if there's any monitor-type sensing within the standard, but if the CCTV is standard 15 kHz, you certainly want to make sure that the board isn't accidentally told that it has an MDA-compatible screen connected instead. The engineering of THAT is very much left as an exercise for the reader.

Better yet, an RGBI to S-Video converter, or even a janky composite-to-S-Video adaptor, should give you pure monochrome output on one line and the colour component on the other, which you can then just leave disconnected from the screen. Actually, this answers some of the other questions, both my own and those of others who already commented. The typical colour mixing formula is along the lines of 0.30R + 0.59G + 0.11B.

This is essentially a recreation of the colour sensitivity of the eye, where blue just looks darker than red, and red darker than green, most of the time, and certainly at the same physical intensity.

That sort of thing shouldn't have been too difficult for IBM to achieve on the VGA, for example, as you could just switch the DAC output from going straight to the output port (which it would also do for "pure green" monochrome) to routing through a small set of pretty rudimentary discrete components (three better-chosen resistors and three diodes) before being dumped back onto the green output line, if you wanted to convert unspecialised colour software for mono output.

I.e., it only offers 1-bit black and white, no greyscales at all. The question is, how does the VGA "sum" the three channels onto green for mono output? Equal weighting won't do if you want vaguely real-seeming greyscale and, moreover, any kind of definition between the three primary colours and between the three pure secondaries - pretty crucial for any software that uses a legacy RGBI colour mode, for example. And even then, you'd still have only a handful of levels all up, just with many duplicates.

Typically for monochrome TV and "greyscale" filters in image editors etc, there's a certain formula used to balance the three channels with their relative perceptual intensities. Green gets the lion's share, blue very little, and red somewhere in between. So a red flag flying over a background of a blue sky and grassy fields will actually show up, though the sky may actually come over darker than the ground unless it's quite pale cyan rather than deep blue.
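As a rough illustration (Python, using the standard BT.601 luma coefficients, which is one common version of that formula; any given converter may use slightly different weights):

```python
# Perceptual grey value of a colour using the common BT.601 luma weights.
def luma(r, g, b):
    """r, g, b in 0..255 -> equivalent grey in 0..255."""
    return round(0.299 * r + 0.587 * g + 0.114 * b)

# The red flag / blue sky / green grass example from above:
print(luma(255, 0, 0))  # red   -> 76  (mid-dark grey)
print(luma(0, 0, 255))  # blue  -> 29  (nearly black: deep sky goes dark)
print(luma(0, 255, 0))  # green -> 150 (grass reads brightest)
```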

If that's what IBM used to produce the greyscale output, then it's entirely possible that the greyscale claim is valid. Sierra would have just adjusted the RGB colours in the palette such that the summed values gave a reasonably smooth progression beyond the already pretty good 64 normally available, likely with green as the main driver and varying amounts of red and blue to nudge the overall value up a little. (The simplest, though not entirely precise, way would be just to have pure green as the XXXX-XX00 value, then a matching intensity of blue added for XXXX-XX01, swapped for red to give XXXX-XX10, and then both for XXXX-XX11.) Or perhaps they just forced colour-summed output of the regular graphics instead of accepting some BIOS-driven switching to green-only?
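A hypothetical sketch of that palette trick (Python; the 6-bit-per-channel DAC range and the BT.601-style summing are my assumptions, and the sub-step scheme is the one speculated above):

```python
# Let green carry the coarse grey steps and nudge with blue/red for the
# in-between values, on an assumed 6-bit-per-channel DAC (0..63).

def palette_entry(index):
    """index 0..255 -> (r, g, b), each 0..63."""
    g = index >> 2        # coarse level: pure green every 4th entry
    sub = index & 0b11    # 00 = green only, 01 = +blue, 10 = +red, 11 = both
    b = g if sub in (1, 3) else 0
    r = g if sub in (2, 3) else 0
    return (r, g, b)

def summed_grey(rgb):
    """Perceived grey of an entry under a BT.601-style channel sum."""
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b

ramp = [summed_grey(palette_entry(i)) for i in range(256)]
print(ramp[4:8])  # one green step and its three nudged sub-steps

# Four sub-steps per green level, though not perfectly monotonic once the
# matching-intensity nudges outgrow a single green step -- hence "not
# entirely precise".
```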

Similar methods have been used for other instances of higher colour depth simulation with greyscales, after all. It's basically how 16-bit mode works (though that alternately increments green, then all three channels, because of green's greater depth); it can give a similar illusion in 12-bit, and works reasonably well down to 9-bit (I've used it on an original Atari ST to convert greyscale pictures, without the benefit of the STe's 12-bit palette matching up more neatly with the 16 colour registers), with a bit of user preference exercisable over the "tone" of the picture in terms of the order in which the levels change and whether they ping-pong instead of being truly alternate.
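For instance, a near-grey ramp in 16-bit RGB565 (a sketch; the 5-6-5 bit split is the standard one for that mode):

```python
# In RGB565, green has 6 bits (0..63) against 5 (0..31) for red and blue,
# so a smooth "grey" ramp alternately bumps green alone, then all three
# channels: 64 near-grey steps instead of 32 pure greys.

def grey565(step):
    """step 0..63 -> (r5, g6, b5) of the nearest near-grey."""
    rb = step >> 1  # red and blue advance every other step
    return (rb, step, rb)

ramp = [grey565(s) for s in range(64)]
print(ramp[:6])  # (0,0,0), (0,1,0), (1,2,1), (1,3,1), (2,4,2), (2,5,2)
```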

No reason why it couldn't also work even better with 12-bit. And in all of those, if you use a mono monitor (especially composite), or zero out the colour saturation on a TV or colour 15 kHz monitor, it pretty much just looks like a smoother greyscale.

So there's all kinds of ways that the original standards can be cheated and worked around, e.g. PhotoChrome, which could pull the equivalent of a far greater colour depth than the hardware officially supports. It's not something that's really that easy to do as a live calculated thing, especially in 3D or the like, but if you're working within a still fairly limited colourspace and can precalculate all the values you need (the two palettes for extending the greyscale range, plus the encoded bitmaps for all the on-screen elements, or the exact colour sequencing for a photoreal static image on a machine not made for that), it can work quite well, and maybe not even take up any significant extra CPU time if all you're doing is reloading a handful of palette registers in each VBlank, essentially as a straight blit.
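A minimal sketch of the two-palette half-stepping itself (Python; the 16-level depth and the even/odd frame split are illustrative assumptions, the point is just that the eye averages the alternating frames):

```python
# Turn a desired grey in half-steps (0..30) into the pair of hardware
# levels (0..15) to load on even and odd frames; flicker between them
# and the eye averages out an in-between shade.

def frame_pair(target_halfsteps):
    """target 0..30 -> (even-frame level, odd-frame level)."""
    lo = target_halfsteps // 2
    hi = lo + (target_halfsteps & 1)  # bump alternate frames for .5 steps
    return (lo, hi)

print(frame_pair(14))  # (7, 7): level 7.0 holds steady
print(frame_pair(15))  # (7, 8): level 7.5 ping-pongs between 7 and 8

# The per-frame work is then just reloading the colour registers in each
# VBlank with the precalculated even or odd palette:
desired = [0, 5, 14, 15, 30]
even_palette = [frame_pair(t)[0] for t in desired]
odd_palette = [frame_pair(t)[1] for t in desired]
```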

Vintage computing is all about the hacks. It was only meant to be a greyscale mode, but somewhere they messed up and only managed to make it sort-of-work on composite (and not at all on RGB), so they just kept the unsuccessful result undocumented, then tweaked it a little for composite when the tech improved just enough that the equivalent of an extra IC or three could be squeezed onto the board.

Which is a bit of a shame, because it probably would have been the most commonly used mode even with RGB if it had been possible to implement it, and would have made compatibility between colour and mono rather easier.

All these adapters could support monochrome monitors, although they were at their best when connected to color displays.

At first, if you wanted to connect a CGA card to a monochrome monitor, you typically would use a monochrome composite monitor. If you were really lacking for a monitor, a black-and-white TV would do in a pinch, but you probably would have to run the composite output of the CGA card through an RF converter box.

The resulting graphics would be nowhere near as sharp as with a straight composite connection.

Let's start with our test images:

By the late 80s and early 90s, monochrome plasma displays were common on laptops, and this design allowed them to give some definition to what would otherwise have been color graphics.

In 320x200 modes, IBM would selectively dither a 2x2 block of pixels (remember that the display stretches 320x200 up to 640x400) to simulate the 64 intensities that should be available. Claims of more than that were nonsense: the hardware was limited to 64 shades of gray, due to the 6 bits per color, regardless of being displayed on a monochrome or color VGA monitor.
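A sketch of that kind of block dither (Python; the 16 native panel levels and the bump order are assumptions, the 2x2 doubling is from the description above):

```python
# Simulate 64 intensities on a panel with only 16 native grey levels by
# doubling each pixel into a 2x2 block and spreading the remainder.

ORDER = [(0, 0), (1, 1), (1, 0), (0, 1)]  # tiny ordered-dither pattern

def dither_block(intensity64):
    """intensity 0..63 -> 2x2 block of native levels 0..15."""
    base, extra = divmod(intensity64, 4)  # four sub-steps per native level
    block = [[base, base], [base, base]]
    for k in range(extra):                # bump 'extra' cells one level up
        y, x = ORDER[k]
        block[y][x] = min(base + 1, 15)
    return block

print(dither_block(0))   # [[0, 0], [0, 0]]
print(dither_block(33))  # [[9, 8], [8, 8]] -- one cell bumped
print(dither_block(63))  # [[15, 15], [15, 15]] -- saturates at the top
```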


