VideoAmp's internal EDID emulator usage
Technical details on how the EDID emulator works
When you plug the VideoAmp board's VGA input into your host computer, the VideoAmp's RGB lines draw some current from the host PC through their 75 ohm termination, which triggers a plug-and-play detection event on your computer. Your computer then tries to read monitor identification data from a serial EEPROM through pins 12 and 15 of the VGA connector, the so-called DDC bus, which is actually an I2C bus with a maximum speed of 100 kbit/s.
The VideoAmp internal EDID emulator is meant to answer these monitor identification requests like a standard "VGA monitor" would, typically returning one or more detailed timing descriptors (DTDs) describing which resolution is supposed to be used for that particular display.
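To give an idea of what a DTD contains, here is a small Python sketch that decodes the main timing fields of an 18-byte detailed timing descriptor following the standard EDID layout. This is only an illustration of the data format, not code from the VideoAmp firmware:

```python
# Decode the main fields of an 18-byte EDID Detailed Timing Descriptor (DTD).
def decode_dtd(d: bytes) -> dict:
    assert len(d) == 18
    pixel_clock_khz = (d[0] | d[1] << 8) * 10        # stored in units of 10 kHz
    h_active = d[2] | (d[4] & 0xF0) << 4
    h_blank  = d[3] | (d[4] & 0x0F) << 8
    v_active = d[5] | (d[7] & 0xF0) << 4
    v_blank  = d[6] | (d[7] & 0x0F) << 8
    h_total = h_active + h_blank
    v_total = v_active + v_blank
    return {
        "pixel_clock_MHz": pixel_clock_khz / 1000,
        "h_active": h_active, "v_active": v_active,
        "h_freq_kHz": round(pixel_clock_khz / h_total, 3),
        "v_refresh_Hz": round(pixel_clock_khz * 1000 / (h_total * v_total), 3),
    }

# The first DTD of a standard EDID base block starts at byte 54:
# print(decode_dtd(edid[54:72]))
```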
The VideoAmp EDID emulator is not based on a serial I2C EEPROM like most of the cheaper solutions out there (such as the "Soft15kHz" dongle). Instead, the onboard micro-controller (MCU) emulates in software a true E-DDC video EEPROM, with segment pointer support and support for extra-large EDID content (see E-DDC).
Internally, the VideoAmp uses a fast flash memory chip to store the EDID binary blob. When the host computer reads the EDID with a given segment value, the MCU fetches the corresponding segment page from the memory chip and returns the EDID data accordingly. Thanks to this, the EDID blob is not limited to 128 bytes as it is with most EDID emulators, and it can be modified dynamically and quickly without performing a slow I2C access. On the VideoAmp, the flash memory can be updated over the USB serial bus in a fraction of a second.
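Conceptually, the read transaction the MCU emulates follows the E-DDC scheme: the host writes a segment number to the segment pointer (I2C address 0x30), then reads the EDID data (I2C address 0x50). The simplified Python model below only sketches that addressing logic; the real firmware serves the bytes over I2C and fetches them from flash:

```python
# Simplified model of an E-DDC read as emulated by the VideoAmp MCU.
# Each segment addresses 256 bytes, i.e. two 128-byte EDID blocks.
class EddcEmulator:
    def __init__(self, edid_blob: bytes):
        self.edid = edid_blob       # full blob, stored in flash on the real board
        self.segment = 0            # E-DDC segment pointer (I2C address 0x30)

    def write_segment_pointer(self, segment: int) -> None:
        self.segment = segment

    def read_edid(self, offset: int, length: int) -> bytes:
        base = self.segment * 256 + offset
        data = self.edid[base:base + length]
        self.segment = 0            # per E-DDC, the pointer resets for the next transaction
        return data

# Example: reading the 3rd and 4th 128-byte blocks (segment 1)
# emu = EddcEmulator(open("edid.bin", "rb").read())
# emu.write_segment_pointer(1)
# blocks_2_and_3 = emu.read_edid(0, 256)
```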
This particular function is usually only found on high-end EDID emulators that cost hundreds of euros, and it can serve multiple purposes. One use is to perform an EDID override for a monitor whose internal data is bad, or even to fake a monitor's presence with whatever monitor identification is needed.
In our particular retro-gaming use case, this can be exploited to fake what an "old school" 15kHz monitor would return to your host computer when plugged into a VGA port. This is what SailorSat did with her Soft15kHz dongle around 2010. While it can still be used today, one problem is that a user who wants to reprogram the dongle to change the reported detailed timing either has to reflash the onboard EDID EEPROM with another board, or, with luck, has a GPU that allows bi-directional I2C communication and can use specific tools to reprogram it. Nowadays, GPUs with bi-directional I2C access are very rare. Furthermore, this technique cannot be used to set extended EDID segments, because the EEPROM is accessed using the standard EEPROM protocol, where byte addressing cannot exceed 255; this limits the maximum size of the EDID blob to 2x128 bytes.
This is where the VideoAmp differs: reprogramming the EDID data is no longer done over the same I2C bus (so it is faster), and the size is not limited by the I2C EEPROM protocol, since the data is set over a USB serial connection.
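Updating the emulated EDID is therefore just a matter of pushing the new blob over the USB serial port, which the VideoAmp companion software does for you. The pyserial sketch below only illustrates the idea; the port name, baud rate and command framing are purely hypothetical and are not the board's actual protocol:

```python
# Illustration only: pushing a new EDID blob over USB serial.
# The real VideoAmp protocol is handled by the companion software;
# the "EDID" command framing below is made up for this example.
import serial

def upload_edid(port: str, edid_blob: bytes) -> None:
    if len(edid_blob) % 128 != 0:
        raise ValueError("EDID blobs are multiples of 128 bytes")
    with serial.Serial(port, 115200, timeout=1) as s:
        s.write(b"EDID")                              # hypothetical "start upload" command
        s.write(len(edid_blob).to_bytes(2, "little")) # blob length
        s.write(edid_blob)                            # whole blob, no 255-byte addressing limit

# upload_edid("COM5", open("my_edid.bin", "rb").read())
```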
General guidelines to make EDID emulation work
Cables, HDMI and DP converters and VGA port
First, you need either a native VGA port on your computer (one that wires E-DDC on pins 12 and 15), or a special DisplayPort or HDMI converter that allows what is called "EDID passthrough". The "EDID passthrough" function means that the EDID data is not processed or rewritten by the converter; it is forwarded directly to the VideoAmp E-DDC bus.
Sadly, this function is not very common on most converters, even on those where it is advertised as implemented. Specifically, for DisplayPort the E-DDC bus must be bridged to the AUX channel by the adapter's I2C master/slave logic, which means some processing is done by the adapter and can break the "EDID passthrough" function completely.
As of today, here are my test results for two commonly recommended DisplayPort adapters:
- the StarTech DP2VGA2: only 384 bytes of EDID can be returned by the converter (up to 3 + 2x6 = 15 DTDs using two CEA-861 extension blocks)
- the CableDeconn DP to HDMI/VGA/DVI (not 4k): only 128 bytes of EDID can be returned (3 DTDs maximum).
My preference goes to HDMI to VGA adapters that you can modify yourself to get full EDID passthrough, because HDMI actually uses exactly the same E-DDC protocol as VGA; only the pins are mapped differently (DDC on HDMI pins 15 and 16).
To modify an HDMI to VGA converter, wire the VGA DDC pins directly to the HDMI DDC pins and bypass the converter chip by cutting the corresponding traces on the PCB.
Warning: not all HDMI to VGA converter chips tolerate DDC bypassing.

Operating systems
The EDID emulation does not depend on the operating system. Users on Windows 7, 10 and 11, macOS and Linux (even standard Linux distributions) have reported getting working 15kHz signals for their SCART TV.
The only known specificity on Linux concerns interlaced resolutions. Linux seems to interpret vertical timings differently from Windows and macOS, and needs twice (2x) the value to correctly interpret a DTD. If you plan to use interlaced videomodes on Linux (assuming your GPU accepts them), please enable the "2x VField" option when you write your modelines into the flash memory of the VideoAmp.
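As a rough illustration of the doubling rule described above (this is only the 2x arithmetic, not the board's internal handling of the option), a helper like this would produce the vertical values Linux expects:

```python
# Illustration of the "2x VField" rule for Linux interlaced videomodes:
# the vertical timing values written in the DTD are doubled so the Linux
# driver interprets the interlaced mode the same way Windows/macOS do.
def double_vertical_fields(mode: dict) -> dict:
    doubled = dict(mode)
    for key in ("v_active", "v_front_porch", "v_sync", "v_back_porch"):
        doubled[key] = mode[key] * 2
    return doubled

# Hypothetical 480i field timings, for illustration only:
# mode_480i = {"v_active": 240, "v_front_porch": 2, "v_sync": 3, "v_back_porch": 16}
# print(double_vertical_fields(mode_480i))
```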
Drivers
Since EDID reading is performed by the GPU and reported by its driver back to the operating system, you must have the proper drivers installed for your GPU. The default Windows video driver will not read the EDID monitor information and will simply report a "generic monitor" with VESA resolutions such as 640x480, 800x600 or 1024x768.
Several users have reported that modified drivers (like the old patched ATI driver for Windows 7) do not report the detection of the VideoAmp in the Windows monitor properties. In this case, please try the manufacturer's drivers to see if that solves the issue.
Detailed timings and overlapping resolutions for GPU scaling
Sometimes the video signal is not the one you expect, even if you only put one modeline into the EDID. For example, on laptops you can get a wrong video signal because the GPU driver simply keeps the LCD panel's signal and duplicates it. See below what I got from a user running Windows 10:

The problem is often solved by simply changing the refresh rate (here to 59.924Hz):

When loading modelines into the board, you can define resolutions that share identical vertical values (the same number of active lines, e.g. 240, and the same vertical front and back porch) but different horizontal values (e.g. 1280 or 320 pixels wide). These overlapping resolutions will all be available to you, but if you select, for example, a 320x240 resolution, the GPU will keep the actual output signal at 1280x240 (due to the minimum pixel clock limit), giving you black vertical bars on each side of the screen.
To make your GPU automatically fill the screen and apply hardware scaling, open your GPU driver utility and find the "stretch to full screen" option, usually somewhere in the monitor/display settings. This option exists in the Intel, AMD and NVIDIA utilities.

Doing this lets you use a 320x240 desktop resolution while the actual video signal stays at 1280x240 (15kHz). This is transparent to all your games and applications.
Some known limitations
Interlaced videomodes/resolutions
Most modern GPUs have dropped support for interlaced resolutions. There is nothing we can do here, except complain to the vendors about losing this function. Some users have reported that interlaced resolutions still work on Linux with modified AMD drivers.
Force a re-read of the EDID
To force the EDID to be read again, you must trigger a plug-and-play event in the operating system, at the GPU driver level.
A manual, hardware solution is to unplug and replug the VGA cable.
A software solution on Windows is to use CRU's "restart" utility, or the VideoAmp companion software ("Video" tab), which performs a GPU disable/enable operation through the device manager.
On v4 boards, we added a hardware switch on the VGA input connector, which simulates an unplug/replug and forces re-detection of the board.
Maximum size of EDID
As per a discussion with ToastyX, author of Custom Resolution Utility, the AMD and NVIDIA drivers cannot read more than 7 CEA-861 extension blocks.
This means only up to 50 detailed timing descriptors can be stored on the board and read back by those drivers. If you put more than 50 DTDs, the additional DTDs will not be read by the driver, even though they are correctly present in the EDID blob.
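If you want to check how many DTDs your blob actually exposes, a quick sanity check can be scripted. The sketch below walks the base block and the CEA-861 extensions and counts descriptors with a non-zero pixel clock; it assumes a well-formed blob:

```python
# Count detailed timing descriptors in an EDID blob (base block + CEA-861 extensions).
def count_dtds(edid: bytes) -> int:
    count = 0
    # Base block: four 18-byte descriptor slots starting at byte 54.
    for off in range(54, 126, 18):
        if edid[off] or edid[off + 1]:      # non-zero pixel clock => it is a DTD
            count += 1
    # Extension blocks: 128 bytes each, tag 0x02 = CEA-861.
    for blk in range(128, len(edid), 128):
        ext = edid[blk:blk + 128]
        if len(ext) < 128 or ext[0] != 0x02:
            continue
        off = ext[2]                        # offset of the first DTD inside the extension
        if off == 0:
            continue                        # no DTDs in this extension
        while off + 18 <= 127 and (ext[off] or ext[off + 1]):
            count += 1
            off += 18
    return count

# print(count_dtds(open("edid.bin", "rb").read()))
```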
Minimum pixel clock, super resolutions (and why we need to stretch the emulator image)
The pixel clock is the number of pixels a GPU needs to generate per second for a given video mode. This pixel clock frequency appears as the first value of a modeline description (see Wikipedia here).
Modern GPUs cannot go below a certain pixel clock threshold. For example, Intel integrated GPUs cannot output video modes with a pixel clock below 10MHz. Some standards like HDMI also require pixel clocks above 25MHz (although my tests showed you can actually go lower, e.g. 15MHz).
That limitation is usually worked around by using super(wide) resolutions that keep the same number of vertical lines and the same refresh rate as their low-resolution equivalent (like 1280x240p@60Hz or 1920x240p@60Hz for a 240p@60Hz 15kHz signal).
In that case you can either stretch the image in X in your emulator (MAME, Supermodel and Flycast have this option), or use the GPU scaling trick presented in the previous section.
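To see why the super-resolution trick works, here is the arithmetic. The blanking totals below are illustrative values for a roughly 15.7kHz scan rate, not the exact modelines shipped with the board:

```python
# Why a 320x240p@60 mode falls under the GPU's minimum pixel clock,
# while its 1280-wide super-resolution does not.
def pixel_clock_mhz(h_total: int, v_total: int, refresh_hz: float) -> float:
    return h_total * v_total * refresh_hz / 1e6

lowres = pixel_clock_mhz(h_total=400,  v_total=262, refresh_hz=60)   # ~6.3 MHz
superw = pixel_clock_mhz(h_total=1600, v_total=262, refresh_hz=60)   # ~25.2 MHz

print(f"320x240p@60  -> {lowres:.2f} MHz pixel clock (below a 10 MHz floor)")
print(f"1280x240p@60 -> {superw:.2f} MHz pixel clock (accepted by the GPU)")
# Both modes share v_total and refresh, so the horizontal scan rate
# (pixel_clock / h_total) stays ~15.7 kHz and the CRT sees the same signal.
```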
For example, GroovyMAME works perfectly by stretching the picture in X to use super(wide) resolutions, with dynamic videomode switching on Windows 11 using the closest videomode reported in the EDID. This is handled internally by the switchres library. Note that switchres can only switch among the resolutions loaded into your VideoAmp board, accepted by your GPU and reported to your OS (a maximum of 50 different modelines with NVIDIA, for example).
Depending on your games, around 10 videomodes are usually enough to cover most of your needs (like 200p@60Hz, 224p@60Hz, 240p@60Hz, 240p@58Hz, 256p@55Hz, etc.). Never enable vertical stretch in Y: keeping the vertical resolution native gives you clean pixel rendering, otherwise you will get a blurry image.
To allow dynamic resolution switching, you have to permit system videomodes by disabling the "lock system videomodes" configuration parameter.
An example mame.ini file for GroovyMAME is provided as part of gameassets in the VideoAmp software archive.