![]() | This is an archive of past discussions about Video Graphics Array. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page. |
Archive 1 |
From the article: the video memory for color mode is mapped at 0xb8000-0xbffff. I thought VGA graphics memory started at 0xa0000? At least, in linear (320x200x256) mode, where each byte was one pixel? -- pne 05:07, 26 Aug 2004 (UTC)
Answer: In graphics mode yes it's 0xa0000. Colour text mode is "0xb8000". -- Funkymonkey.
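To illustrate the mapping discussed above, the address arithmetic itself is simple. This is only a sketch: actually poking these addresses would need real-mode DOS (or mapping physical memory); the constants are the standard legacy windows.

```python
# Address arithmetic for the legacy VGA memory windows discussed above.
# Illustrative only: real access needs real-mode DOS or /dev/mem.

GRAPHICS_BASE = 0xA0000    # graphics modes, e.g. mode 13h (320x200x256, linear)
COLOR_TEXT_BASE = 0xB8000  # colour text mode: 2 bytes per cell (char, attribute)
MONO_TEXT_BASE = 0xB0000   # mono text mode (mode 7)

def mode13h_pixel_addr(x, y):
    """Physical address of pixel (x, y) in mode 13h: one byte per pixel."""
    return GRAPHICS_BASE + y * 320 + x

def text_cell_addr(row, col, columns=80):
    """Physical address of the character byte for a colour text-mode cell."""
    return COLOR_TEXT_BASE + (row * columns + col) * 2

print(hex(mode13h_pixel_addr(0, 0)))      # 0xa0000
print(hex(text_cell_addr(0, 0)))          # 0xb8000
print(hex(mode13h_pixel_addr(319, 199)))  # last mode 13h pixel
```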
What about the VESA standard for successors to VGA?
Feel free to add your own info :) -- Funkymonkey.
0xb0000-0xb7fff is a perfectly valid address space for the VGA when operating in a Mono text mode (Mode 7). -- Funkymonkey
Vector Graphics Array: can anyone tell me how to connect an Xbox 360 console to a monitor using a VGA AV cable? What is meant by a female/female adapter?

I removed the reference to 800x600 and 640x400 modes, as I'm pretty sure they're not possible using standard VGA hardware. 800x600, maybe, at a low refresh rate? Remember the distinction between a clone VGA and a "Super VGA" was blurred; some clone VGAs, such as Oak's OTI037 256K VGA, were capable of 800x600 as I remember. However, this page is about the true-blue IBM original. The main reason these modes should be near impossible (especially 640x400 in 256 colours) on standard VGA hardware is that the video bandwidth (28 MHz max) is too low; the horizontal scan rate would be unacceptably low. I'd love to be proved wrong, however. If someone can demonstrate the CRTC settings for a 640x400 256-colour or 800x600 mode that would run on an IBM VGA with a multisync monitor, I'd be interested to see it. I also removed the reference to the "DirectX" term double buffering; double buffering had long been used as a term before the introduction of DirectX. -- Funkymonkey

May 8, 2005 - Hi Funkymonkey! There was an old MS-DOS program, "FRACTINT", which claimed to support output resolutions up to 800x600 (16-color) on a true "IBM VGA adapter." Years ago, I fooled around with the program, but on an SVGA adapter, so I can't verify the program's claims. I do remember "Mode X" allowed up to 360x480 (60 Hz) without resorting to outrageous refresh rates; a handful of MS-DOS games used this mode (my favorite was "Bananoid", a shareware clone of Arkanoid). http://spanky.triumf.ca/www/fractint/hardware_modes.html#video_notes_anchor

June 23rd, 2005. I read that link and it seems you're right about the 800x600 mode - très cool, and a good find! We should integrate the information from that link back into the VGA page. I think I'll do that! -- Funkymonkey.
Cool - I'd love to see those programs, if you can make them available. What monitors have you found are able to sync to these low refresh rates? Thanks. Funkymonkey
I'm willing to accept that 800x600 is possible, and it has been shown (Fractint, as you say), but 640x400 in 256 colours seems unlikely due to the very low horizontal frequency that would be involved. In 256-colour modes, the VGA operates at half the horizontal clock speed, as two normal dot clocks elapse for each pixel. Can we leave out the reference to 640x400 in 256 colours until there is some proof of its existence? --Funkymonkey.
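A back-of-envelope check of the bandwidth argument above. The blanking figures are assumptions carried over from the standard 800-clock, 449-line 70 Hz timing; the point is only that the required dot clock lands well beyond the stock 25.175/28.322 MHz crystals.

```python
# Rough bandwidth check for a hypothetical 640x400 256-colour mode.
# Assumptions: standard 800-dot-clock line and 449-line 70 Hz frame,
# scaled up because 8-bpp modes consume two dot clocks per pixel.

DOT_CLOCKS_PER_LINE = 800   # standard 640-wide timing, blanking included
LINES_PER_FRAME = 449       # standard 400-line, 70 Hz timing
REFRESH_HZ = 70
CLOCKS_PER_8BPP_PIXEL = 2

# Keep blanking proportional and double every pixel's clock cost:
total_clocks_per_line = DOT_CLOCKS_PER_LINE * CLOCKS_PER_8BPP_PIXEL  # 1600
required_dot_clock = total_clocks_per_line * LINES_PER_FRAME * REFRESH_HZ

print(required_dot_clock / 1e6, "MHz needed")            # ~50.3 MHz
print("fits stock crystals:", required_dot_clock <= 28.322e6)
```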
Leave it in if you like, then; I just think it's nice if we have some evidence to back up the modes described as possible. --Funkymonkey.

The article says "360x480 (highest resolution compatible with standard VGA monitors)". I guess that's for 256-color modes. What is the highest 16-color mode resolution compatible with a standard VGA monitor? Calvero2 11:32, 8 April 2007 (UTC)
Could it be possible that VGA can display 256x177? I found that when playing Rastan on DOSBox. Or perhaps it's an EGA tweak.
It is my understanding that VGA was an important landmark in that, for the first time, standard IBM PC color computer monitors came to have the same resolution as North American TV broadcasts (the NTSC standard); hence the name "Video Graphics Array", or an array of pixels matching the resolution of standard video. As written in the article named "NTSC", color TV broadcasts have 486 "viewable" horizontal scanning lines per frame, which closely matches VGA's 480. As for the 640 number, it surely must match the equivalent resolution if the scanning were done with vertical lines instead of horizontal lines, but I'm unsure about this. I don't know where to add this comment without disturbing the balance of this article. Perhaps if the original author sees this note he/she may add it appropriately.
Can someone familiar with this topic take a look at the recently created article on HVGA? Thanks. -- Rick Block (talk) 01:25, 8 June 2007 (UTC)
AFAIK, the first VGA appeared in the IBM Personal System/2. It was part of the Micro Channel architecture memory controller, like a modern IGP in the northbridge. Originally, there was no such thing as a "VGA chipset" in the PS/2. The first VGA-compatible ISA chipset was a Paradise VGA (later Western Digital). - Alecv 19:24, 8 June 2007 (UTC)

Answer: Shame I don't have my old PS/2 to hand, but IIRC when looking at the board you could identify discrete chips that made up the VGA. Things weren't that integrated back then. If you have ever seen an 8514/A board (I own one), it is made up of a very large number of discrete chips. It may well be that some of the core logic was integrated into the MCA memory controller, but I'm pretty sure it was a chipset all in all. (I may be talking crap, though, because it depends on whether you count the video RAM and the DAC as separate or not.) 149.254.200.220 18:02, 5 September 2007 (UTC)funkymonkey
From experimentation, I don't believe that the mode 13h palette is the graphics adapter's default palette. By setting the graphics mode directly, I found 8 shades of the EGA colouring scheme and 192 blank colours. I did this under QEMU, though, so it might not be right; the so-called "default palette" is almost definitely stored in the BIOS. I'll try on actual hardware (no original VGA chipset though :\) --thematrixeatsyou 08:49, 2 November 2007 (UTC)
Is a simple answer really so much to ask for? All that technical information is great and wonderful, but I can't tell from the article what the Hell a VGA does, only that they are (or maybe were; even that's not really clear) used in video games. (I was told by a salesman that they can be used to convert a TV screen to a computer screen, but, thinking he was telling me what HE wanted me to hear, I thought I'd check it out; obviously, this article was less than no help, or I wouldn't be leaving a comment.)
Include a SIMPLE explanation for those of us who aren’t computer nerds. —Preceding unsigned comment added by 71.34.68.223 (talk • contribs) 08:06, 8 March 2009 (UTC)
It's possible on the official VGA cards to put the card into a linear 320x240x8 mode instead of the more common planar "mode X". There's a register that can be used to enable a 128k window from A0000 to BFFFF instead of the usual 64k at A0000 to AFFFF. Combined with some of the register tweaks of "mode X", you'd get a linear 320x240x8. I doubt it was used on many games, if any, because some clone cards didn't support this setting. --Myria (talk) 21:47, 17 November 2007 (UTC)
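The arithmetic behind the comment above, as a quick sketch: a linear 320x240 byte-per-pixel frame no longer fits the usual 64 KiB window at A0000, which is why the 128 KiB A0000-BFFFF window (or planar "mode X" addressing) is needed.

```python
# Why a linear 320x240x8 mode needs the 128 KiB window mentioned above.
WINDOW_64K = 64 * 1024    # usual A0000-AFFFF graphics window
WINDOW_128K = 128 * 1024  # expanded A0000-BFFFF window

mode_13h_bytes = 320 * 200   # 64000 bytes: fits the 64 KiB window
mode_x_bytes = 320 * 240     # 76800 bytes: does not

print(mode_13h_bytes <= WINDOW_64K)   # True
print(mode_x_bytes <= WINDOW_64K)     # False
print(mode_x_bytes <= WINDOW_128K)    # True, hence the 128 KiB window
```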
I'd like the readers to be able to see how a 256-color image actually appeared back in the 1980s. (I suppose a 256 color GIF image would be an appropriate substitute.) ---- Theaveng (talk) 15:34, 7 January 2008 (UTC)
It shows the front porch as the start of the line, however the front porch happens at the END of the line, whereas the sync pulse is the official start of the line (when the electron gun moves from one edge to the other edge). ---- Theaveng (talk) 15:46, 18 May 2009 (UTC)
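For reference, here is the horizontal timing budget for the standard 640x480@60 mode using the commonly cited figures, ordered per the comment's convention of treating the sync pulse as the start of the line (published tables often list the active region first instead).

```python
# Horizontal line budget, standard 640x480@60 VGA (commonly cited figures).
DOT_CLOCK_HZ = 25.175e6

line_segments = [           # (segment, dot clocks), sync treated as line start
    ("sync pulse",   96),
    ("back porch",   48),
    ("active video", 640),
    ("front porch",  16),   # at the END of the line, as noted above
]

total_clocks = sum(clocks for _, clocks in line_segments)
line_time_us = total_clocks / DOT_CLOCK_HZ * 1e6

print(total_clocks)         # 800 dot clocks per line
print(line_time_us, "us")   # ~31.78 us per line, i.e. ~31.47 kHz line rate
```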
Needs a picture of a VGA plug and/or port. See the page for DVI (Digital Video Interface) for an example. 76.2.89.37 (talk) 17:20, 12 April 2010 (UTC)
Hi, I want to know: if VGA is not installed in a computer, what is the effect on the PC? —Preceding unsigned comment added by 59.95.109.207 (talk) 16:29, 20 May 2010 (UTC)
A diagram of an VGA signal would be really helpful. This is a good one http://www.vga-avr.narod.ru/vga_timing/vga_timing.gif — Preceding unsigned comment added by 209.94.128.118 (talk) 18:55, 4 December 2011 (UTC)
I think this fact should be emphasized in this article. As I recall, at the time, color computer screens were of lesser resolution than a standard color TV, which was regarded as the "panacea" of picture quality. Hence, when VGA first appeared, it was highly regarded for being "as good as" a color TV set of the time (the North American NTSC standard), which can be loosely described as having a resolution of 640 x 480 pixels. The name "Video Graphics Array" can be read as "having the same resolution as NTSC color video". By the way, the "A" was commonly mistaken to stand for Video Graphics ADAPTER, because in the earlier CGA and EGA standards both "A"s stood for "Adapter", whereas in VGA the "A" stands for ARRAY.
OK, there were two things, which the "edit note" line just wasn't long enough to contain.
One: stating the VGA's 16-colour modes (in 640x350 and 320x200, though it also holds for 640x200) as distinct from those of the EGA. These are listed either separately, or on combined lines, in most VGA mode tables, e.g. "mode 1+" vs "mode 1". The great majority of programs using 16-colour PC graphics at 320x200 (or using 640x350 at all) just go for the EGA palette, as that's the easiest method and provides the greatest compatibility, and go all-out with 256 colours (or 640x480 mode) in the specific VGA modes; but I have seen it done, particularly where programs were converted to the PC from the Amiga or ST, platforms which used either a more limited but fully customised palette (typically 16 colours), or a medium-resolution colour mode with decidedly non-PC-standard colours (e.g. from a Mac II Color). For example, the early-90s "Space Shuttle Simulator" game was VGA/MCGA only, but used 16-colour 320x200 mode with a customised palette matching that of the non-PC computers, offering a much subtler and more variable range of shades than the EGA despite its similarly narrow range of simultaneous colours.
One presumes it made the porting easier, without losing any graphical fidelity or having to put in more work to upgrade the graphics, whilst reducing the on-disk and in-RAM data footprint, and giving faster screen updates, for titles where having a great spread of colour was not as important so long as certain bases were covered.
There are at least a couple of other notable PC games I've played which use 320x200x16 and 640x350x16 with non-EGA colours, but, of course, will they come back to mind right now? Will they hell. I can just see a vague outline of the loading screen in my mind's eye, but no actual words or anything. :( ((Corncob 3D looms large, but I think that was actually 640x200 in 16 colours, either only slightly customised or just plain EGA))
Two: removing the notes about ColoRIX doing 640x400 in 256 colours. I'm sorry, maybe there's a print citation for this somewhere, but I can't find anything online. In fact, what I DO find is evidence that it will do that mode, and 640x480, and 800x600, and even 1024x768, but ONLY with specifically compatible, non-VGA-standard cards: XGAs, SVGAs, and "expanded VGA" adapters (i.e. standard VGAs with their own proprietary add-on SVGA-like modes), which are of course NOT "VGA" for the purposes of an article such as this. This was from a wide-ranging review of different high-rez and high-colour paint packages in a 1988 PC magazine that had been put online. A smaller article from another edition of the same magazine a few months later also mentioned that it now had the capability to go up to 360x480 in 256 colours on a standard VGA adaptor, in addition to the higher-rez 256-colour modes available on more expensive non-standard cards with suitable drivers.
Other resources, including some I've recently cited for other information, show that the VGA specification simply doesn't allow for the combination of 256 colour mode with high horizontal resolution. The hardware can't drag pixel data out of the VRAM, fling it down the internal bus and drop it into the DAC fast enough. The most you're ever going to get is 800 (900?) pixels from front to back porch in 4-bpp mode, or 400 (450?) in 8-bpp mode (or, 400/450 bytes per line), with 640/720 and 320/360 being the standard within normal overscan limits.
With a fair bit of overscan, you might achieve something exotic like 400x600 in 256 colours, giving decent resolution with an uneven aspect ratio (and slow, choppy refresh as you're using very nearly the entire 256kb at once; halving this to 400x300 actually seems to work well in the case of a very few games that offer it, like the original Grand Theft Auto), but not 512, 640, 720 or 800-by-anything (not 600 lines, not 480, not 400, not 350, not 240, not 200...) without dropping the bit depth.
I'm sorry, it just doesn't seem to be possible within the limits of the hardware, and there's no evidence I've ever seen, or can find now, that suggests it's possible to exceed it. You may as well ask a C64 to run in 80-column mode, an Amiga to give more than 4bpp in 70ns or 2bpp in 35ns pixel mode, a Spectrum to offer a full-rez mode with single-pixel colour addressability, or an ST to pump out hi-rez colour or 16 colours in 80 column mode with full pixel level colour addressability. It's not going to happen.
However, someone, somewhere, might have been very very clever. If you've hard evidence that ColoRIX or some other software can actually break this barrier and do a 256KB, 640x400x8bpp mode on a STANDARD, fully spec-compliant, no-proprietary-anything VGA card, then it can go back in. And I'll be both impressed and rather pleased at mankind's ingenuity. Particularly as this would mean somehow tricking the internal (4bpp) pixel bus into running at a minimum of almost 36Mhz when it only has crystals for 28 and 25Mhz onboard.
However, I highly doubt it, as some bugger would have done it already and become a legend for their achievement, even though it would very quickly have been rendered pointless by the rise of VESA-compatible SVGAs; competitive demo-hackers and the like put a lot of stock in what you're able to do within the limits of pure stock hardware.
Hell, no-one ever even made serious use of 360x240/350/400/480 (or 400x350?). The former two would have been very good for games, giving a twin-buffer 256-colour display within the 256kb limit, and something resembling a normal aspect ratio (ie more horizontal pixels than vertical), with the choice being between squarer pixels or a slightly better monitor refresh rate. I think the only places I've seen them on offer are in graphics demos, FracTint, Quake and GTA. 193.63.174.211 (talk) 13:18, 7 November 2012 (UTC)
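The memory side of the argument in the post above can be checked with trivial arithmetic: which of the discussed 256-colour (one byte per pixel) resolutions even fit in the VGA's 256 KiB of video RAM, independent of the bandwidth question.

```python
# Which 256-colour (1 byte/pixel) modes discussed above fit in 256 KiB of VRAM?
VRAM = 256 * 1024  # 262144 bytes

candidates = {
    "320x200": 320 * 200,
    "360x480": 360 * 480,
    "400x300": 400 * 300,
    "400x600": 400 * 600,
    "640x400": 640 * 400,
    "640x480": 640 * 480,
}

for name, size in sorted(candidates.items()):
    print(name, size, "fits" if size <= VRAM else "does NOT fit")
# 400x600 squeaks in (240000 of 262144 bytes). 640x480x8 fails on memory
# alone; 640x400x8 fits in VRAM but fails on bandwidth, as argued above.
```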
Who put that in? It's completely unnecessary. As the adaptor could output a fair variety of horizontal and vertical resolutions, each one in free combination with the other so long as the rules over memory use and colour depth were respected, it should only be necessary to add a note regarding that.
IE, that it could display any combination of X, Y, or Z columns of pixels horizontally with up to 256 colours, U, V or W with up to 16 colours, and A, B, or C lines at 60Hz and D, E, F at 70Hz (or half as many with line doubling) vertically.
Rather than a long, exhaustive list of every single possible graphics mode, which is really not the sort of thing Wikipedia is about.
Besides, I'm highly skeptical about some of them - I don't believe, for example, that a 600-line (or even anything taller than 512-line) mode was possible on a standard VGA. 800 pixels across may have been possible, but not that many high. Otherwise, whence cometh the difference between VGA and SVGA? The 256/512-pixel-wide modes seem unlikely as well, unless they were effectively just 320/640 pixel ones masked down. The pixel and line timings ran off fixed quartz crystal standards after all, not a freely reprogrammable PLL loop like in modern cards, hence the reasonably well-fixed 320/640/720 width and 200/350/400/480 height modes (working within 800 and 900 pixel-clock wide and 441/525 scanline frames, just using more or less of the available height for 350 vs 400 line, and width for 720 vs 800 column modes, rather than changing the line/frame display time). 25ish MHz for 320/640, and 28 for 720-800 width, plus a fixed 31kHz line frequency (which was kinda inviolable because changing it could cause monitor damage).
I'm going to see what I can do about editing that down now. Please think about what you're doing, in future, before dumping such a big ol' list of stuff like that. 193.63.174.211 (talk) 12:01, 20 June 2014 (UTC)
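The "rules, not list" idea argued for above can even be expressed in a few lines. To be clear, the width/height sets and constraints below are assumptions distilled from this discussion, not an authoritative IBM mode table; the point is that a short rule generates the list.

```python
# Sketch of the rule-based mode description proposed above: enumerate
# plausible modes from constraints instead of listing them one by one.
# The width/height sets and limits are assumptions from this discussion.
VRAM = 256 * 1024

WIDTHS_16 = [320, 360, 640, 720]  # 4-bpp planar widths
WIDTHS_256 = [320, 360]           # 8 bpp costs two dot clocks per pixel
HEIGHTS = [200, 240, 350, 400, 480]

modes = []
for w in WIDTHS_16:
    for h in HEIGHTS:
        if w * h // 2 <= VRAM:    # 4 bpp: half a byte per pixel
            modes.append((w, h, 16))
for w in WIDTHS_256:
    for h in HEIGHTS:
        if w * h <= VRAM:         # 8 bpp: one byte per pixel
            modes.append((w, h, 256))

print((640, 480, 16) in modes)    # True
print((360, 480, 256) in modes)   # True: the famous tweaked mode
print((640, 480, 256) in modes)   # False: excluded by the width rule
```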
4throck (talk) 13:26, 20 June 2014 (UTC)
I put the list there. Agreed that what you suggest would be better:
IE, that it could display any combination of X, Y, or Z columns of pixels horizontally with up to 256 colours, U, V or W with up to 16 colours, and A, B, or C lines at 60Hz and D, E, F at 70Hz (or half as many with line doubling) vertically.
It's the same information, but with a better layout.
I also suggest that you change that in the article itself. Complaining about it here will only take your time and not get it done.
Agreed that some combinations are non-standard, but I remember using MAME in 256x240. Then again, perhaps that was Super VGA. Some references are needed on that part of the article, I agree.
4throck (talk) 22:38, 20 June 2014 (UTC) You did a nice job and provided good information :-) I completely agree with the scope of the article. As it is, at least the reader will understand that the original standard was expanded with new modes, either through better hardware or through software hacks. But the drive to have better graphics was there, and thus SVGA appeared shortly after. In the future some of that info may be moved into more technical articles, but it's good as it is. Thanks!
I removed:
"when using a DVI or HDMI connection, especially on larger sized LCD/LED monitors or TVs, quality degradation, if present, is prominently visible."
This made little sense in context. Perhaps whoever wrote this meant that the quality degradation is ***noticeable*** in comparison to digital formats like DVI or HDMI. Those formats themselves, if used properly, should not show any VGA-style degradation.
If that is what was meant, I guess we can add it back in. If not, well, it'll just confuse people...
Bilditup1 (talk) 17:19, 11 October 2014 (UTC)
I am almost certain that the 'V' in VGA actually stands for 'Versatile' and NOT 'Video', as defined by IBM, the inventor of the standard.
I think 'Video' is actually a (very) widespread misnomer, which arose because it seemed 'obvious' that the 'V' stood for 'video'.
A more common misnomer is thinking that the 'A' stands for 'Adapter', which is incorrect and I'm pleased to see that this mistake is not repeated here.
A few sources around the Web cite 'Versatile Graphics Array', but really we need to find the original IBM published standard, or at least a similarly authoritative source. I've searched but can't find it.
Two questions: 1/ Amongst our older contributors, does 'Versatile' rather than 'Video' ring any bells of recognition? 2/ Can anyone help pin down some suitably authoritative source, such as IBM's own documentation?
If 'Versatile' turns out to be correct, we should still mention 'Video' as a commonly-used alternative, simply because it has *almost* become accepted usage, despite being wrong.
Steve — Preceding unsigned comment added by Steve Thackery (talk • contribs) 18:58, 13 April 2014 (UTC)
Colleagues, I think I am wrong. I have found some vaguely contemporary IBM documents and they definitely refer to "Video Graphics Array". Thus it looks very much like - even if it did once stand for "Versatile..." - IBM themselves definitely adopted "Video..." early on (or perhaps always did, and "Versatile..." is the misnomer).
I feel I should apologise for wasting your time over this. As I cannot find any supporting evidence it is clear I must withdraw this suggestion.
Steve Steve Thackery (talk) 21:31, 14 April 2014 (UTC)
I was one of the techies responsible for the UK announcement of the PS/2 (I did Micro Channel), and I can assure everyone that the term is (and always was) VIDEO Graphics Array. — Preceding unsigned comment added by 31.49.247.15 (talk) 17:29, 28 June 2015 (UTC)
I'm going to go and add this, and see if it gets reverted. As far as I know, 640x480 (and possibly 640x400?) with a single bitplane - i.e. classic monochrome - is also a standard VGA resolution. I remember being able to set it... in QBasic, in DOS, in Windows 3.1 and even Win 95 with the standard VGA driver (you know... the one which set the dark colours to be not-quite-primary, but not EGA standard either)... and really, it has to be there as an option, otherwise what do you do if you only have 128k or 64k of RAM onboard? Only use the lower resolutions? In which case why not buy an EGA or MCGA board?
Only 256k allows 640x480 in 16 colours (and 320x480 in 256), even though 64k allows 320x200 in 256.
128k limits you to 640x400 (or 640x350, or 320x480) in 16, 640x200/320x400 in 256, and 640x480 in mono (or potentially 4-colour/4-grey, if anyone had bothered to think about offering it... it might have been available as a hack?)
64k makes 640x480 mono-only, along with 640x400, 640x350 and 320x480 (again, 4-colour is a neglected possibility), with 640x200, 320x400 and 320x240(!) all dropping to 16... 193.63.174.211 (talk) 18:49, 23 January 2012 (UTC)
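The memory tiers described in the post above reduce to one formula: a mode needs width x height x depth / 8 bytes (planar 4-bpp modes store half a byte per pixel, mono one eighth). A small sketch of that rule, checked against a few of the modes mentioned:

```python
# VRAM needed per mode, and the smallest standard fit-out (64/128/256 KiB)
# that accommodates it. 4-bpp planar = w*h/2 bytes; 8 bpp = w*h; mono = w*h/8.
def bytes_needed(w, h, bpp):
    return w * h * bpp // 8

def smallest_fit_kib(w, h, bpp):
    for kib in (64, 128, 256):
        if bytes_needed(w, h, bpp) <= kib * 1024:
            return kib
    return None  # does not fit a standard VGA at all

print(smallest_fit_kib(640, 480, 1))  # 64: mono 640x480 fits even 64 KiB
print(smallest_fit_kib(640, 480, 4))  # 256: 16 colours needs the full 256 KiB
print(smallest_fit_kib(640, 400, 4))  # 128: as stated above
print(smallest_fit_kib(320, 200, 8))  # 64: mode 13h fits 64 KiB
```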
vga cable signaling spec? 217.99.252.102 (talk) 11:28, 27 September 2015 (UTC)
The expression in the article does imply an exact frequency of 60/1.001, but how is this determined from the rounded numbers in the source or other timings provided through EDID, which I believe are rounded too and don't allow fractional math? Indeed, the microsecond durations seen here appear to have been calculated with these rounded numbers. 83.93.8.224 (talk) 20:10, 27 February 2016 (UTC)
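The question above can be checked directly with the commonly cited 640x480 timing figures (25.175 MHz dot clock, 800x525 total clocks). The rounded numbers give a rate near, but not exactly equal to, 60/1.001 Hz, which supports the poster's doubt:

```python
# Do the rounded 640x480 timing figures actually yield 60/1.001 Hz?
dot_clock_hz = 25.175e6   # rounded nominal dot clock
total_clocks = 800 * 525  # clocks per line x lines per frame, blanking included

refresh_from_timings = dot_clock_hz / total_clocks
ntsc_style = 60 / 1.001

print(refresh_from_timings)  # ~59.9405 Hz from the rounded timings
print(ntsc_style)            # ~59.9401 Hz
print(abs(refresh_from_timings - ntsc_style))  # close, but not identical
```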
VGA is analog, but it is not "Amplitude Modulated". Modulation requires a carrier which is altered by the modulating signal. VGA is a baseband signal without modulation. — Preceding unsigned comment added by Whitcwa (talk • contribs) 17:36, 29 January 2016 (UTC)